00:00:00.000 Started by upstream project "autotest-per-patch" build number 124231
00:00:00.000 originally caused by:
00:00:00.001 Started by user sys_sgci
00:00:00.019 Checking out git https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool into /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4 to read jbp/jenkins/jjb-config/jobs/autotest-downstream/autotest-phy.groovy
00:00:00.020 The recommended git tool is: git
00:00:00.021 using credential 00000000-0000-0000-0000-000000000002
00:00:00.023 > git rev-parse --resolve-git-dir /var/jenkins_home/workspace/crypto-phy-autotest_script/33b20b30f0a51e6b52980845e0f6aa336787973ad45e341fbbf98d1b65b265d4/jbp/.git # timeout=10
00:00:00.038 Fetching changes from the remote Git repository
00:00:00.040 > git config remote.origin.url https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool # timeout=10
00:00:00.057 Using shallow fetch with depth 1
00:00:00.057 Fetching upstream changes from https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool
00:00:00.057 > git --version # timeout=10
00:00:00.078 > git --version # 'git version 2.39.2'
00:00:00.078 using GIT_ASKPASS to set credentials SPDKCI HTTPS Credentials
00:00:00.121 Setting http proxy: proxy-dmz.intel.com:911
00:00:00.121 > git fetch --tags --force --progress --depth=1 -- https://review.spdk.io/gerrit/a/build_pool/jenkins_build_pool refs/heads/master # timeout=5
00:00:02.456 > git rev-parse origin/FETCH_HEAD^{commit} # timeout=10
00:00:02.470 > git rev-parse FETCH_HEAD^{commit} # timeout=10
00:00:02.483 Checking out Revision 9bbc799d7020f50509d938dbe97dc05da0c1b5c3 (FETCH_HEAD)
00:00:02.483 > git config core.sparsecheckout # timeout=10
00:00:02.495 > git read-tree -mu HEAD # timeout=10
00:00:02.512 > git checkout -f 9bbc799d7020f50509d938dbe97dc05da0c1b5c3 # timeout=5
00:00:02.530 Commit message: "pool: fixes for VisualBuild class"
00:00:02.530 > git rev-list --no-walk 9bbc799d7020f50509d938dbe97dc05da0c1b5c3 # timeout=10
00:00:02.814 [Pipeline] Start of Pipeline
00:00:02.827 [Pipeline] library
00:00:02.828 Loading library shm_lib@master
00:00:02.829 Library shm_lib@master is cached. Copying from home.
00:00:02.846 [Pipeline] node
00:00:02.856 Running on WFP3 in /var/jenkins/workspace/crypto-phy-autotest
00:00:02.857 [Pipeline] {
00:00:02.868 [Pipeline] catchError
00:00:02.869 [Pipeline] {
00:00:02.885 [Pipeline] wrap
00:00:02.895 [Pipeline] {
00:00:02.901 [Pipeline] stage
00:00:02.902 [Pipeline] { (Prologue)
00:00:03.080 [Pipeline] sh
00:00:03.363 + logger -p user.info -t JENKINS-CI
00:00:03.383 [Pipeline] echo
00:00:03.385 Node: WFP3
00:00:03.393 [Pipeline] sh
00:00:03.693 [Pipeline] setCustomBuildProperty
00:00:03.704 [Pipeline] echo
00:00:03.705 Cleanup processes
00:00:03.709 [Pipeline] sh
00:00:03.985 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:03.985 2456935 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:03.997 [Pipeline] sh
00:00:04.281 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:04.281 ++ grep -v 'sudo pgrep'
00:00:04.281 ++ awk '{print $1}'
00:00:04.281 + sudo kill -9
00:00:04.281 + true
00:00:04.296 [Pipeline] cleanWs
00:00:04.306 [WS-CLEANUP] Deleting project workspace...
00:00:04.306 [WS-CLEANUP] Deferred wipeout is used...
00:00:04.312 [WS-CLEANUP] done
00:00:04.317 [Pipeline] setCustomBuildProperty
00:00:04.330 [Pipeline] sh
00:00:04.611 + sudo git config --global --replace-all safe.directory '*'
00:00:04.683 [Pipeline] nodesByLabel
00:00:04.685 Found a total of 2 nodes with the 'sorcerer' label
00:00:04.694 [Pipeline] httpRequest
00:00:04.698 HttpMethod: GET
00:00:04.699 URL: http://10.211.164.101/packages/jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz
00:00:04.705 Sending request to url: http://10.211.164.101/packages/jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz
00:00:04.708 Response Code: HTTP/1.1 200 OK
00:00:04.709 Success: Status code 200 is in the accepted range: 200,404
00:00:04.709 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz
00:00:05.126 [Pipeline] sh
00:00:05.407 + tar --no-same-owner -xf jbp_9bbc799d7020f50509d938dbe97dc05da0c1b5c3.tar.gz
00:00:05.422 [Pipeline] httpRequest
00:00:05.426 HttpMethod: GET
00:00:05.427 URL: http://10.211.164.101/packages/spdk_8d1bffc3d58884e0a589bb56f9ca84b9a6b73f21.tar.gz
00:00:05.427 Sending request to url: http://10.211.164.101/packages/spdk_8d1bffc3d58884e0a589bb56f9ca84b9a6b73f21.tar.gz
00:00:05.434 Response Code: HTTP/1.1 200 OK
00:00:05.435 Success: Status code 200 is in the accepted range: 200,404
00:00:05.435 Saving response body to /var/jenkins/workspace/crypto-phy-autotest/spdk_8d1bffc3d58884e0a589bb56f9ca84b9a6b73f21.tar.gz
00:00:23.064 [Pipeline] sh
00:00:23.347 + tar --no-same-owner -xf spdk_8d1bffc3d58884e0a589bb56f9ca84b9a6b73f21.tar.gz
00:00:27.549 [Pipeline] sh
00:00:27.830 + git -C spdk log --oneline -n5
00:00:27.830 8d1bffc3d nvmf/tcp: Add support for the interrupt mode in NVMe-of TCP
00:00:27.830 e493b64f5 nvmf/tcp: move await_req handling to nvmf_tcp_req_put()
00:00:27.830 38611b13c nvmf: move register nvmf_poll_group_poll interrupt to nvmf
00:00:27.830 e08210f7f nvmf/tcp: replace pending_buf_queue with iobuf callbacks
00:00:27.830 06f725e17 nvmf: extend API to request buffer with iobuf callback
00:00:27.840 [Pipeline] }
00:00:27.856 [Pipeline] // stage
00:00:27.864 [Pipeline] stage
00:00:27.866 [Pipeline] { (Prepare)
00:00:27.883 [Pipeline] writeFile
00:00:27.900 [Pipeline] sh
00:00:28.180 + logger -p user.info -t JENKINS-CI
00:00:28.193 [Pipeline] sh
00:00:28.474 + logger -p user.info -t JENKINS-CI
00:00:28.487 [Pipeline] sh
00:00:28.769 + cat autorun-spdk.conf
00:00:28.769 SPDK_RUN_FUNCTIONAL_TEST=1
00:00:28.769 SPDK_TEST_BLOCKDEV=1
00:00:28.769 SPDK_TEST_ISAL=1
00:00:28.769 SPDK_TEST_CRYPTO=1
00:00:28.769 SPDK_TEST_REDUCE=1
00:00:28.769 SPDK_TEST_VBDEV_COMPRESS=1
00:00:28.769 SPDK_RUN_UBSAN=1
00:00:28.777 RUN_NIGHTLY=0
00:00:28.782 [Pipeline] readFile
00:00:28.809 [Pipeline] withEnv
00:00:28.812 [Pipeline] {
00:00:28.826 [Pipeline] sh
00:00:29.141 + set -ex
00:00:29.141 + [[ -f /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf ]]
00:00:29.141 + source /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:00:29.141 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:00:29.141 ++ SPDK_TEST_BLOCKDEV=1
00:00:29.141 ++ SPDK_TEST_ISAL=1
00:00:29.141 ++ SPDK_TEST_CRYPTO=1
00:00:29.141 ++ SPDK_TEST_REDUCE=1
00:00:29.141 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:00:29.141 ++ SPDK_RUN_UBSAN=1
00:00:29.141 ++ RUN_NIGHTLY=0
00:00:29.141 + case $SPDK_TEST_NVMF_NICS in
00:00:29.141 + DRIVERS=
00:00:29.141 + [[ -n '' ]]
00:00:29.141 + exit 0
00:00:29.151 [Pipeline] }
00:00:29.170 [Pipeline] // withEnv
00:00:29.177 [Pipeline] }
00:00:29.194 [Pipeline] // stage
00:00:29.205 [Pipeline] catchError
00:00:29.207 [Pipeline] {
00:00:29.226 [Pipeline] timeout
00:00:29.226 Timeout set to expire in 40 min
00:00:29.228 [Pipeline] {
00:00:29.245 [Pipeline] stage
00:00:29.247 [Pipeline] { (Tests)
00:00:29.266 [Pipeline] sh
00:00:29.550 + jbp/jenkins/jjb-config/jobs/scripts/autoruner.sh /var/jenkins/workspace/crypto-phy-autotest
00:00:29.550 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest
00:00:29.550 + DIR_ROOT=/var/jenkins/workspace/crypto-phy-autotest
00:00:29.550 + [[ -n /var/jenkins/workspace/crypto-phy-autotest ]]
00:00:29.550 + DIR_SPDK=/var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:29.550 + DIR_OUTPUT=/var/jenkins/workspace/crypto-phy-autotest/output
00:00:29.550 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/spdk ]]
00:00:29.550 + [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:00:29.550 + mkdir -p /var/jenkins/workspace/crypto-phy-autotest/output
00:00:29.550 + [[ -d /var/jenkins/workspace/crypto-phy-autotest/output ]]
00:00:29.550 + [[ crypto-phy-autotest == pkgdep-* ]]
00:00:29.550 + cd /var/jenkins/workspace/crypto-phy-autotest
00:00:29.550 + source /etc/os-release
00:00:29.550 ++ NAME='Fedora Linux'
00:00:29.550 ++ VERSION='38 (Cloud Edition)'
00:00:29.550 ++ ID=fedora
00:00:29.550 ++ VERSION_ID=38
00:00:29.550 ++ VERSION_CODENAME=
00:00:29.550 ++ PLATFORM_ID=platform:f38
00:00:29.550 ++ PRETTY_NAME='Fedora Linux 38 (Cloud Edition)'
00:00:29.550 ++ ANSI_COLOR='0;38;2;60;110;180'
00:00:29.550 ++ LOGO=fedora-logo-icon
00:00:29.550 ++ CPE_NAME=cpe:/o:fedoraproject:fedora:38
00:00:29.550 ++ HOME_URL=https://fedoraproject.org/
00:00:29.550 ++ DOCUMENTATION_URL=https://docs.fedoraproject.org/en-US/fedora/f38/system-administrators-guide/
00:00:29.550 ++ SUPPORT_URL=https://ask.fedoraproject.org/
00:00:29.550 ++ BUG_REPORT_URL=https://bugzilla.redhat.com/
00:00:29.550 ++ REDHAT_BUGZILLA_PRODUCT=Fedora
00:00:29.550 ++ REDHAT_BUGZILLA_PRODUCT_VERSION=38
00:00:29.550 ++ REDHAT_SUPPORT_PRODUCT=Fedora
00:00:29.550 ++ REDHAT_SUPPORT_PRODUCT_VERSION=38
00:00:29.550 ++ SUPPORT_END=2024-05-14
00:00:29.550 ++ VARIANT='Cloud Edition'
00:00:29.550 ++ VARIANT_ID=cloud
00:00:29.550 + uname -a
00:00:29.550 Linux spdk-wfp-03 6.7.0-68.fc38.x86_64 #1 SMP PREEMPT_DYNAMIC Mon Jan 15 02:47:10 UTC 2024 x86_64 GNU/Linux
00:00:29.550 + sudo /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:00:32.084 Hugepages
00:00:32.084 node hugesize free / total
00:00:32.084 node0 1048576kB 0 / 0
00:00:32.084 node0 2048kB 0 / 0
00:00:32.084 node1 1048576kB 0 / 0
00:00:32.084 node1 2048kB 0 / 0
00:00:32.084
00:00:32.084 Type BDF Vendor Device NUMA Driver Device Block devices
00:00:32.084 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:00:32.084 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:00:32.084 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:00:32.084 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:00:32.084 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:00:32.084 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:00:32.084 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:00:32.084 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:00:32.343 NVMe 0000:5e:00.0 8086 0a54 0 nvme nvme1 nvme1n1
00:00:32.343 NVMe 0000:5f:00.0 1b96 2600 0 nvme nvme0 nvme0n1 nvme0n2
00:00:32.343 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:00:32.343 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:00:32.343 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:00:32.343 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:00:32.343 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:00:32.343 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:00:32.343 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:00:32.343 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:00:32.343 + rm -f /tmp/spdk-ld-path
00:00:32.343 + source autorun-spdk.conf
00:00:32.343 ++ SPDK_RUN_FUNCTIONAL_TEST=1
00:00:32.343 ++ SPDK_TEST_BLOCKDEV=1
00:00:32.343 ++ SPDK_TEST_ISAL=1
00:00:32.343 ++ SPDK_TEST_CRYPTO=1
00:00:32.343 ++ SPDK_TEST_REDUCE=1
00:00:32.343 ++ SPDK_TEST_VBDEV_COMPRESS=1
00:00:32.343 ++ SPDK_RUN_UBSAN=1
00:00:32.343 ++ RUN_NIGHTLY=0
00:00:32.343 + (( SPDK_TEST_NVME_CMB == 1 || SPDK_TEST_NVME_PMR == 1 ))
00:00:32.343 + [[ -n '' ]]
00:00:32.343 + sudo git config --global --add safe.directory /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:32.343 + for M in /var/spdk/build-*-manifest.txt
00:00:32.343 + [[ -f /var/spdk/build-pkg-manifest.txt ]]
00:00:32.343 + cp /var/spdk/build-pkg-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:00:32.343 + for M in /var/spdk/build-*-manifest.txt
00:00:32.343 + [[ -f /var/spdk/build-repo-manifest.txt ]]
00:00:32.343 + cp /var/spdk/build-repo-manifest.txt /var/jenkins/workspace/crypto-phy-autotest/output/
00:00:32.343 ++ uname
00:00:32.343 + [[ Linux == \L\i\n\u\x ]]
00:00:32.343 + sudo dmesg -T
00:00:32.343 + sudo dmesg --clear
00:00:32.343 + dmesg_pid=2458086
00:00:32.343 + [[ Fedora Linux == FreeBSD ]]
00:00:32.343 + export UNBIND_ENTIRE_IOMMU_GROUP=yes
00:00:32.343 + UNBIND_ENTIRE_IOMMU_GROUP=yes
00:00:32.343 + [[ -e /var/spdk/dependencies/vhost/spdk_test_image.qcow2 ]]
00:00:32.343 + [[ -x /usr/src/fio-static/fio ]]
00:00:32.343 + export FIO_BIN=/usr/src/fio-static/fio
00:00:32.343 + FIO_BIN=/usr/src/fio-static/fio
00:00:32.343 + sudo dmesg -Tw
00:00:32.343 + [[ '' == \/\v\a\r\/\j\e\n\k\i\n\s\/\w\o\r\k\s\p\a\c\e\/\c\r\y\p\t\o\-\p\h\y\-\a\u\t\o\t\e\s\t\/\q\e\m\u\_\v\f\i\o\/* ]]
00:00:32.343 + [[ ! -v VFIO_QEMU_BIN ]]
00:00:32.343 + [[ -e /usr/local/qemu/vfio-user-latest ]]
00:00:32.343 + export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:00:32.343 + VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64
00:00:32.343 + [[ -e /usr/local/qemu/vanilla-latest ]]
00:00:32.343 + export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:00:32.343 + QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64
00:00:32.343 + spdk/autorun.sh /var/jenkins/workspace/crypto-phy-autotest/autorun-spdk.conf
00:00:32.343 Test configuration:
00:00:32.343 SPDK_RUN_FUNCTIONAL_TEST=1
00:00:32.343 SPDK_TEST_BLOCKDEV=1
00:00:32.343 SPDK_TEST_ISAL=1
00:00:32.343 SPDK_TEST_CRYPTO=1
00:00:32.343 SPDK_TEST_REDUCE=1
00:00:32.343 SPDK_TEST_VBDEV_COMPRESS=1
00:00:32.343 SPDK_RUN_UBSAN=1
00:00:32.602 RUN_NIGHTLY=0
15:39:37 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh
00:00:32.602 15:39:37 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]]
00:00:32.602 15:39:37 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]]
00:00:32.602 15:39:37 -- scripts/common.sh@517 -- $ source /etc/opt/spdk-pkgdep/paths/export.sh
00:00:32.602 15:39:37 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:32.602 15:39:37 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:32.602 15:39:37 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:32.602 15:39:37 -- paths/export.sh@5 -- $ export PATH
00:00:32.602 15:39:37 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin
00:00:32.602 15:39:37 -- common/autobuild_common.sh@436 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output
00:00:32.602 15:39:37 -- common/autobuild_common.sh@437 -- $ date +%s
00:00:32.602 15:39:37 -- common/autobuild_common.sh@437 -- $ mktemp -dt spdk_1718026777.XXXXXX
00:00:32.602 15:39:37 -- common/autobuild_common.sh@437 -- $ SPDK_WORKSPACE=/tmp/spdk_1718026777.T98eCJ
00:00:32.602 15:39:37 -- common/autobuild_common.sh@439 -- $ [[ -n '' ]]
00:00:32.602 15:39:37 -- common/autobuild_common.sh@443 -- $ '[' -n '' ']'
00:00:32.602 15:39:37 -- common/autobuild_common.sh@446 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/'
00:00:32.602 15:39:37 -- common/autobuild_common.sh@450 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp'
00:00:32.602 15:39:37 -- common/autobuild_common.sh@452 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs'
00:00:32.602 15:39:37 -- common/autobuild_common.sh@453 -- $ get_config_params
00:00:32.602 15:39:37 -- common/autotest_common.sh@396 -- $ xtrace_disable
00:00:32.602 15:39:37 -- common/autotest_common.sh@10 -- $ set +x
00:00:32.602 15:39:37 -- common/autobuild_common.sh@453 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk'
00:00:32.602 15:39:37 -- common/autobuild_common.sh@455 -- $ start_monitor_resources
00:00:32.602 15:39:37 -- pm/common@17 -- $ local monitor
00:00:32.602 15:39:37 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:32.602 15:39:37 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:32.602 15:39:37 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:32.602 15:39:37 -- pm/common@21 -- $ date +%s
00:00:32.602 15:39:37 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}"
00:00:32.602 15:39:37 -- pm/common@21 -- $ date +%s
00:00:32.602 15:39:37 -- pm/common@25 -- $ sleep 1
00:00:32.602 15:39:37 -- pm/common@21 -- $ date +%s
00:00:32.602 15:39:37 -- pm/common@21 -- $ date +%s
00:00:32.602 15:39:37 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1718026777
00:00:32.602 15:39:37 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1718026777
00:00:32.602 15:39:37 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1718026777
00:00:32.602 15:39:37 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autobuild.sh.1718026777
00:00:32.602 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1718026777_collect-vmstat.pm.log
00:00:32.602 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1718026777_collect-cpu-temp.pm.log
00:00:32.602 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1718026777_collect-cpu-load.pm.log
00:00:32.602 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autobuild.sh.1718026777_collect-bmc-pm.bmc.pm.log
00:00:33.539 15:39:38 -- common/autobuild_common.sh@456 -- $ trap stop_monitor_resources EXIT
00:00:33.539 15:39:38 -- spdk/autobuild.sh@11 -- $ SPDK_TEST_AUTOBUILD=
00:00:33.539 15:39:38 -- spdk/autobuild.sh@12 -- $ umask 022
00:00:33.539 15:39:38 -- spdk/autobuild.sh@13 -- $ cd /var/jenkins/workspace/crypto-phy-autotest/spdk
00:00:33.539 15:39:38 -- spdk/autobuild.sh@16 -- $ date -u
00:00:33.539 Mon Jun 10 01:39:38 PM UTC 2024
00:00:33.539 15:39:38 -- spdk/autobuild.sh@17 -- $ git describe --tags
00:00:33.539 v24.09-pre-69-g8d1bffc3d
00:00:33.539 15:39:38 -- spdk/autobuild.sh@19 -- $ '[' 0 -eq 1 ']'
00:00:33.539 15:39:38 -- spdk/autobuild.sh@23 -- $ '[' 1 -eq 1 ']'
00:00:33.539 15:39:38 -- spdk/autobuild.sh@24 -- $ run_test ubsan echo 'using ubsan'
00:00:33.539 15:39:38 -- common/autotest_common.sh@1100 -- $ '[' 3 -le 1 ']'
00:00:33.539 15:39:38 -- common/autotest_common.sh@1106 -- $ xtrace_disable
00:00:33.539 15:39:38 -- common/autotest_common.sh@10 -- $ set +x
00:00:33.539 ************************************
00:00:33.539 START TEST ubsan
00:00:33.539 ************************************
00:00:33.539 15:39:39 ubsan -- common/autotest_common.sh@1124 -- $ echo 'using ubsan'
00:00:33.539 using ubsan
00:00:33.539
00:00:33.539 real 0m0.000s
00:00:33.539 user 0m0.000s
00:00:33.539 sys 0m0.000s
00:00:33.539 15:39:39 ubsan -- common/autotest_common.sh@1125 -- $ xtrace_disable
00:00:33.539 15:39:39 ubsan -- common/autotest_common.sh@10 -- $ set +x
00:00:33.539 ************************************
00:00:33.539 END TEST ubsan
00:00:33.539 ************************************
00:00:33.539 15:39:39 -- spdk/autobuild.sh@27 -- $ '[' -n '' ']'
00:00:33.539 15:39:39 -- spdk/autobuild.sh@31 -- $ case "$SPDK_TEST_AUTOBUILD" in
00:00:33.539 15:39:39 -- spdk/autobuild.sh@47 -- $ [[ 0 -eq 1 ]]
00:00:33.539 15:39:39 -- spdk/autobuild.sh@51 -- $ [[ 0 -eq 1 ]]
00:00:33.539 15:39:39 -- spdk/autobuild.sh@55 -- $ [[ -n '' ]]
00:00:33.539 15:39:39 -- spdk/autobuild.sh@57 -- $ [[ 0 -eq 1 ]]
00:00:33.539 15:39:39 -- spdk/autobuild.sh@59 -- $ [[ 0 -eq 1 ]]
00:00:33.539 15:39:39 -- spdk/autobuild.sh@62 -- $ [[ 0 -eq 1 ]]
00:00:33.539 15:39:39 -- spdk/autobuild.sh@67 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/configure --enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk --with-shared
00:00:33.797 Using default SPDK env in /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk
00:00:33.797 Using default DPDK in /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build
00:00:34.056 Using 'verbs' RDMA provider
00:00:47.644 Configuring ISA-L (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal.log)...done.
00:01:02.539 Configuring ISA-L-crypto (logfile: /var/jenkins/workspace/crypto-phy-autotest/spdk/.spdk-isal-crypto.log)...done.
00:01:02.539 Creating mk/config.mk...done.
00:01:02.539 Creating mk/cc.flags.mk...done.
00:01:02.539 Type 'make' to build.
00:01:02.539 15:40:05 -- spdk/autobuild.sh@69 -- $ run_test make make -j96
00:01:02.539 15:40:05 -- common/autotest_common.sh@1100 -- $ '[' 3 -le 1 ']'
00:01:02.539 15:40:05 -- common/autotest_common.sh@1106 -- $ xtrace_disable
00:01:02.539 15:40:05 -- common/autotest_common.sh@10 -- $ set +x
00:01:02.539 ************************************
00:01:02.539 START TEST make
00:01:02.539 ************************************
00:01:02.539 15:40:05 make -- common/autotest_common.sh@1124 -- $ make -j96
00:01:02.539 make[1]: Nothing to be done for 'all'.
00:01:41.345 The Meson build system
00:01:41.345 Version: 1.3.1
00:01:41.345 Source dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk
00:01:41.345 Build dir: /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp
00:01:41.345 Build type: native build
00:01:41.345 Program cat found: YES (/usr/bin/cat)
00:01:41.345 Project name: DPDK
00:01:41.345 Project version: 24.03.0
00:01:41.345 C compiler for the host machine: cc (gcc 13.2.1 "cc (GCC) 13.2.1 20231011 (Red Hat 13.2.1-4)")
00:01:41.345 C linker for the host machine: cc ld.bfd 2.39-16
00:01:41.345 Host machine cpu family: x86_64
00:01:41.345 Host machine cpu: x86_64
00:01:41.345 Message: ## Building in Developer Mode ##
00:01:41.345 Program pkg-config found: YES (/usr/bin/pkg-config)
00:01:41.345 Program check-symbols.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/check-symbols.sh)
00:01:41.345 Program options-ibverbs-static.sh found: YES (/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/buildtools/options-ibverbs-static.sh)
00:01:41.345 Program python3 found: YES (/usr/bin/python3)
00:01:41.345 Program cat found: YES (/usr/bin/cat)
00:01:41.345 Compiler for C supports arguments -march=native: YES
00:01:41.345 Checking for size of "void *" : 8
00:01:41.345 Checking for size of "void *" : 8 (cached)
00:01:41.345 Compiler for C supports link arguments -Wl,--undefined-version: NO
00:01:41.345 Library m found: YES
00:01:41.345 Library numa found: YES
00:01:41.345 Has header "numaif.h" : YES
00:01:41.345 Library fdt found: NO
00:01:41.345 Library execinfo found: NO
00:01:41.345 Has header "execinfo.h" : YES
00:01:41.345 Found pkg-config: YES (/usr/bin/pkg-config) 1.8.0
00:01:41.345 Run-time dependency libarchive found: NO (tried pkgconfig)
00:01:41.345 Run-time dependency libbsd found: NO (tried pkgconfig)
00:01:41.345 Run-time dependency jansson found: NO (tried pkgconfig)
00:01:41.345 Run-time dependency openssl found: YES 3.0.9
00:01:41.345 Run-time dependency libpcap found: YES 1.10.4
00:01:41.345 Has header "pcap.h" with dependency libpcap: YES
00:01:41.345 Compiler for C supports arguments -Wcast-qual: YES
00:01:41.345 Compiler for C supports arguments -Wdeprecated: YES
00:01:41.345 Compiler for C supports arguments -Wformat: YES
00:01:41.345 Compiler for C supports arguments -Wformat-nonliteral: NO
00:01:41.345 Compiler for C supports arguments -Wformat-security: NO
00:01:41.345 Compiler for C supports arguments -Wmissing-declarations: YES
00:01:41.345 Compiler for C supports arguments -Wmissing-prototypes: YES
00:01:41.345 Compiler for C supports arguments -Wnested-externs: YES
00:01:41.345 Compiler for C supports arguments -Wold-style-definition: YES
00:01:41.345 Compiler for C supports arguments -Wpointer-arith: YES
00:01:41.345 Compiler for C supports arguments -Wsign-compare: YES
00:01:41.345 Compiler for C supports arguments -Wstrict-prototypes: YES
00:01:41.345 Compiler for C supports arguments -Wundef: YES
00:01:41.345 Compiler for C supports arguments -Wwrite-strings: YES
00:01:41.345 Compiler for C supports arguments -Wno-address-of-packed-member: YES
00:01:41.345 Compiler for C supports arguments -Wno-packed-not-aligned: YES
00:01:41.345 Compiler for C supports arguments -Wno-missing-field-initializers: YES
00:01:41.345 Compiler for C supports arguments -Wno-zero-length-bounds: YES
00:01:41.345 Program objdump found: YES (/usr/bin/objdump)
00:01:41.345 Compiler for C supports arguments -mavx512f: YES
00:01:41.345 Checking if "AVX512 checking" compiles: YES
00:01:41.345 Fetching value of define "__SSE4_2__" : 1
00:01:41.345 Fetching value of define "__AES__" : 1
00:01:41.345 Fetching value of define "__AVX__" : 1
00:01:41.345 Fetching value of define "__AVX2__" : 1
00:01:41.345 Fetching value of define "__AVX512BW__" : 1
00:01:41.345 Fetching value of define "__AVX512CD__" : 1
00:01:41.345 Fetching value of define "__AVX512DQ__" : 1
00:01:41.345 Fetching value of define "__AVX512F__" : 1
00:01:41.345 Fetching value of define "__AVX512VL__" : 1
00:01:41.345 Fetching value of define "__PCLMUL__" : 1
00:01:41.345 Fetching value of define "__RDRND__" : 1
00:01:41.345 Fetching value of define "__RDSEED__" : 1
00:01:41.345 Fetching value of define "__VPCLMULQDQ__" : (undefined)
00:01:41.345 Fetching value of define "__znver1__" : (undefined)
00:01:41.345 Fetching value of define "__znver2__" : (undefined)
00:01:41.345 Fetching value of define "__znver3__" : (undefined)
00:01:41.345 Fetching value of define "__znver4__" : (undefined)
00:01:41.345 Compiler for C supports arguments -Wno-format-truncation: YES
00:01:41.345 Message: lib/log: Defining dependency "log"
00:01:41.345 Message: lib/kvargs: Defining dependency "kvargs"
00:01:41.345 Message: lib/telemetry: Defining dependency "telemetry"
00:01:41.345 Checking for function "getentropy" : NO
00:01:41.345 Message: lib/eal: Defining dependency "eal"
00:01:41.345 Message: lib/ring: Defining dependency "ring"
00:01:41.345 Message: lib/rcu: Defining dependency "rcu"
00:01:41.345 Message: lib/mempool: Defining dependency "mempool"
00:01:41.345 Message: lib/mbuf: Defining dependency "mbuf"
00:01:41.345 Fetching value of define "__PCLMUL__" : 1 (cached)
00:01:41.345 Fetching value of define "__AVX512F__" : 1 (cached)
00:01:41.345 Fetching value of define "__AVX512BW__" : 1 (cached)
00:01:41.345 Fetching value of define "__AVX512DQ__" : 1 (cached)
00:01:41.345 Fetching value of define "__AVX512VL__" : 1 (cached)
00:01:41.345 Fetching value of define "__VPCLMULQDQ__" : (undefined) (cached)
00:01:41.345 Compiler for C supports arguments -mpclmul: YES
00:01:41.345 Compiler for C supports arguments -maes: YES
00:01:41.345 Compiler for C supports arguments -mavx512f: YES (cached)
00:01:41.345 Compiler for C supports arguments -mavx512bw: YES
00:01:41.345 Compiler for C supports arguments -mavx512dq: YES
00:01:41.345 Compiler for C supports arguments -mavx512vl: YES
00:01:41.345 Compiler for C supports arguments -mvpclmulqdq: YES
00:01:41.345 Compiler for C supports arguments -mavx2: YES
00:01:41.345 Compiler for C supports arguments -mavx: YES
00:01:41.345 Message: lib/net: Defining dependency "net"
00:01:41.345 Message: lib/meter: Defining dependency "meter"
00:01:41.345 Message: lib/ethdev: Defining dependency "ethdev"
00:01:41.346 Message: lib/pci: Defining dependency "pci"
00:01:41.346 Message: lib/cmdline: Defining dependency "cmdline"
00:01:41.346 Message: lib/hash: Defining dependency "hash"
00:01:41.346 Message: lib/timer: Defining dependency "timer"
00:01:41.346 Message: lib/compressdev: Defining dependency "compressdev"
00:01:41.346 Message: lib/cryptodev: Defining dependency "cryptodev"
00:01:41.346 Message: lib/dmadev: Defining dependency "dmadev"
00:01:41.346 Compiler for C supports arguments -Wno-cast-qual: YES
00:01:41.346 Message: lib/power: Defining dependency "power"
00:01:41.346 Message: lib/reorder: Defining dependency "reorder"
00:01:41.346 Message: lib/security: Defining dependency "security"
00:01:41.346 Has header "linux/userfaultfd.h" : YES
00:01:41.346 Has header "linux/vduse.h" : YES
00:01:41.346 Message: lib/vhost: Defining dependency "vhost"
00:01:41.346 Compiler for C supports arguments -Wno-format-truncation: YES (cached)
00:01:41.346 Message: drivers/bus/auxiliary: Defining dependency "bus_auxiliary"
00:01:41.346 Message: drivers/bus/pci: Defining dependency "bus_pci"
00:01:41.346 Message: drivers/bus/vdev: Defining dependency "bus_vdev"
00:01:41.346 Compiler for C supports arguments -std=c11: YES
00:01:41.346 Compiler for C supports arguments -Wno-strict-prototypes: YES
00:01:41.346 Compiler for C supports arguments -D_BSD_SOURCE: YES
00:01:41.346 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES
00:01:41.346 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES
00:01:41.346 Run-time dependency libmlx5 found: YES 1.24.46.0
00:01:41.346 Run-time dependency libibverbs found: YES 1.14.46.0
00:01:41.346 Library mtcr_ul found: NO
00:01:41.346 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_ESP" with dependencies libmlx5, libibverbs: YES
00:01:41.346 Header "infiniband/verbs.h" has symbol "IBV_RX_HASH_IPSEC_SPI" with dependencies libmlx5, libibverbs: YES
00:01:41.346 Header "infiniband/verbs.h" has symbol "IBV_ACCESS_RELAXED_ORDERING " with dependencies libmlx5, libibverbs: YES
00:01:41.346 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQE_RES_FORMAT_CSUM_STRIDX" with dependencies libmlx5, libibverbs: YES
00:01:41.346 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_MASK_TUNNEL_OFFLOADS" with dependencies libmlx5, libibverbs: YES
00:01:41.346 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_MPW_ALLOWED" with dependencies libmlx5, libibverbs: YES
00:01:41.346 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CONTEXT_FLAGS_CQE_128B_COMP" with dependencies libmlx5, libibverbs: YES
00:01:41.346 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_CQ_INIT_ATTR_FLAGS_CQE_PAD" with dependencies libmlx5, libibverbs: YES
00:01:41.346 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_flow_action_packet_reformat" with dependencies libmlx5, libibverbs: YES
00:01:41.346 Header "infiniband/verbs.h" has symbol "IBV_FLOW_SPEC_MPLS" with dependencies libmlx5, libibverbs: YES
00:01:41.346 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAGS_PCI_WRITE_END_PADDING" with dependencies libmlx5, libibverbs: YES
00:01:41.346 Header "infiniband/verbs.h" has symbol "IBV_WQ_FLAG_RX_END_PADDING" with dependencies libmlx5, libibverbs: NO
00:01:41.346 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_devx_port" with dependencies libmlx5, libibverbs: NO
00:01:41.346 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_query_port" with dependencies libmlx5, libibverbs: YES
00:01:41.346 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_ib_port" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_create" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_COUNTERS_DEVX" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_FLOW_ACTION_DEFAULT_MISS" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_obj_query_async" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_qp_query" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_pp_alloc" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_devx_tir" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_devx_get_event" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_meter" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "MLX5_MMAP_GET_NC_PAGES_CMD" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_NIC_RX" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_DOMAIN_TYPE_FDB" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_push_vlan" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_alloc_var" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ENHANCED_MPSW" with dependencies libmlx5, libibverbs: NO
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_SEND_EN" with dependencies libmlx5, libibverbs: NO
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_WAIT" with dependencies libmlx5, libibverbs: NO
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "MLX5_OPCODE_ACCESS_ASO" with dependencies libmlx5, libibverbs: NO
00:01:42.285 Header "linux/if_link.h" has symbol "IFLA_NUM_VF" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "linux/if_link.h" has symbol "IFLA_EXT_MASK" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "linux/if_link.h" has symbol "IFLA_PHYS_SWITCH_ID" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "linux/if_link.h" has symbol "IFLA_PHYS_PORT_NAME" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "rdma/rdma_netlink.h" has symbol "RDMA_NL_NLDEV" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_GET" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_CMD_PORT_GET" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_INDEX" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_DEV_NAME" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_INDEX" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_PORT_STATE" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "rdma/rdma_netlink.h" has symbol "RDMA_NLDEV_ATTR_NDEV_INDEX" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_domain" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_flow_sampler" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_set_reclaim_device_memory" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_array" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "linux/devlink.h" has symbol "DEVLINK_GENL_NAME" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_aso" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/verbs.h" has symbol "INFINIBAND_VERBS_H" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "MLX5_WQE_UMR_CTRL_FLAG_INLINE" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dump_dr_rule" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "MLX5DV_DR_ACTION_FLAGS_ASO_CT_DIRECTION_INITIATOR" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_domain_allow_duplicate_rules" with dependencies libmlx5, libibverbs: YES
00:01:42.285 Header "infiniband/verbs.h" has symbol "ibv_reg_mr_iova" with dependencies
libmlx5, libibverbs: YES 00:01:42.285 Header "infiniband/verbs.h" has symbol "ibv_import_device" with dependencies libmlx5, libibverbs: YES 00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_dr_action_create_dest_root_table" with dependencies libmlx5, libibverbs: YES 00:01:42.285 Header "infiniband/mlx5dv.h" has symbol "mlx5dv_create_steering_anchor" with dependencies libmlx5, libibverbs: YES 00:01:42.285 Header "infiniband/verbs.h" has symbol "ibv_is_fork_initialized" with dependencies libmlx5, libibverbs: YES 00:01:42.285 Checking whether type "struct mlx5dv_sw_parsing_caps" has member "sw_parsing_offloads" with dependencies libmlx5, libibverbs: YES 00:01:42.285 Checking whether type "struct ibv_counter_set_init_attr" has member "counter_set_id" with dependencies libmlx5, libibverbs: NO 00:01:42.285 Checking whether type "struct ibv_counters_init_attr" has member "comp_mask" with dependencies libmlx5, libibverbs: YES 00:01:42.285 Checking whether type "struct mlx5dv_devx_uar" has member "mmap_off" with dependencies libmlx5, libibverbs: YES 00:01:42.285 Checking whether type "struct mlx5dv_flow_matcher_attr" has member "ft_type" with dependencies libmlx5, libibverbs: YES 00:01:42.285 Configuring mlx5_autoconf.h using configuration 00:01:42.285 Message: drivers/common/mlx5: Defining dependency "common_mlx5" 00:01:42.285 Run-time dependency libcrypto found: YES 3.0.9 00:01:42.285 Library IPSec_MB found: YES 00:01:42.285 Fetching value of define "IMB_VERSION_STR" : "1.5.0" 00:01:42.285 Message: drivers/common/qat: Defining dependency "common_qat" 00:01:42.285 Message: drivers/mempool/ring: Defining dependency "mempool_ring" 00:01:42.285 Message: Disabling raw/* drivers: missing internal dependency "rawdev" 00:01:42.285 Library IPSec_MB found: YES 00:01:42.285 Fetching value of define "IMB_VERSION_STR" : "1.5.0" (cached) 00:01:42.285 Message: drivers/crypto/ipsec_mb: Defining dependency "crypto_ipsec_mb" 00:01:42.285 Compiler for C supports arguments 
-std=c11: YES (cached) 00:01:42.285 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:01:42.285 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:01:42.285 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:01:42.285 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:01:42.286 Message: drivers/crypto/mlx5: Defining dependency "crypto_mlx5" 00:01:42.286 Run-time dependency libisal found: NO (tried pkgconfig) 00:01:42.286 Library libisal found: NO 00:01:42.286 Message: drivers/compress/isal: Defining dependency "compress_isal" 00:01:42.286 Compiler for C supports arguments -std=c11: YES (cached) 00:01:42.286 Compiler for C supports arguments -Wno-strict-prototypes: YES (cached) 00:01:42.286 Compiler for C supports arguments -D_BSD_SOURCE: YES (cached) 00:01:42.286 Compiler for C supports arguments -D_DEFAULT_SOURCE: YES (cached) 00:01:42.286 Compiler for C supports arguments -D_XOPEN_SOURCE=600: YES (cached) 00:01:42.286 Message: drivers/compress/mlx5: Defining dependency "compress_mlx5" 00:01:42.286 Message: Disabling regex/* drivers: missing internal dependency "regexdev" 00:01:42.286 Message: Disabling ml/* drivers: missing internal dependency "mldev" 00:01:42.286 Message: Disabling event/* drivers: missing internal dependency "eventdev" 00:01:42.286 Message: Disabling baseband/* drivers: missing internal dependency "bbdev" 00:01:42.286 Message: Disabling gpu/* drivers: missing internal dependency "gpudev" 00:01:42.286 Program doxygen found: YES (/usr/bin/doxygen) 00:01:42.286 Configuring doxy-api-html.conf using configuration 00:01:42.286 Configuring doxy-api-man.conf using configuration 00:01:42.286 Program mandb found: YES (/usr/bin/mandb) 00:01:42.286 Program sphinx-build found: NO 00:01:42.286 Configuring rte_build_config.h using configuration 00:01:42.286 Message: 00:01:42.286 ================= 00:01:42.286 Applications Enabled 00:01:42.286 ================= 00:01:42.286 
00:01:42.286 apps: 00:01:42.286 00:01:42.286 00:01:42.286 Message: 00:01:42.286 ================= 00:01:42.286 Libraries Enabled 00:01:42.286 ================= 00:01:42.286 00:01:42.286 libs: 00:01:42.286 log, kvargs, telemetry, eal, ring, rcu, mempool, mbuf, 00:01:42.286 net, meter, ethdev, pci, cmdline, hash, timer, compressdev, 00:01:42.286 cryptodev, dmadev, power, reorder, security, vhost, 00:01:42.286 00:01:42.286 Message: 00:01:42.286 =============== 00:01:42.286 Drivers Enabled 00:01:42.286 =============== 00:01:42.286 00:01:42.286 common: 00:01:42.286 mlx5, qat, 00:01:42.286 bus: 00:01:42.286 auxiliary, pci, vdev, 00:01:42.286 mempool: 00:01:42.286 ring, 00:01:42.286 dma: 00:01:42.286 00:01:42.286 net: 00:01:42.286 00:01:42.286 crypto: 00:01:42.286 ipsec_mb, mlx5, 00:01:42.286 compress: 00:01:42.286 isal, mlx5, 00:01:42.286 vdpa: 00:01:42.286 00:01:42.286 00:01:42.286 Message: 00:01:42.286 ================= 00:01:42.286 Content Skipped 00:01:42.286 ================= 00:01:42.286 00:01:42.286 apps: 00:01:42.286 dumpcap: explicitly disabled via build config 00:01:42.286 graph: explicitly disabled via build config 00:01:42.286 pdump: explicitly disabled via build config 00:01:42.286 proc-info: explicitly disabled via build config 00:01:42.286 test-acl: explicitly disabled via build config 00:01:42.286 test-bbdev: explicitly disabled via build config 00:01:42.286 test-cmdline: explicitly disabled via build config 00:01:42.286 test-compress-perf: explicitly disabled via build config 00:01:42.286 test-crypto-perf: explicitly disabled via build config 00:01:42.286 test-dma-perf: explicitly disabled via build config 00:01:42.286 test-eventdev: explicitly disabled via build config 00:01:42.286 test-fib: explicitly disabled via build config 00:01:42.286 test-flow-perf: explicitly disabled via build config 00:01:42.286 test-gpudev: explicitly disabled via build config 00:01:42.286 test-mldev: explicitly disabled via build config 00:01:42.286 test-pipeline: explicitly 
disabled via build config 00:01:42.286 test-pmd: explicitly disabled via build config 00:01:42.286 test-regex: explicitly disabled via build config 00:01:42.286 test-sad: explicitly disabled via build config 00:01:42.286 test-security-perf: explicitly disabled via build config 00:01:42.286 00:01:42.286 libs: 00:01:42.286 argparse: explicitly disabled via build config 00:01:42.286 metrics: explicitly disabled via build config 00:01:42.286 acl: explicitly disabled via build config 00:01:42.286 bbdev: explicitly disabled via build config 00:01:42.286 bitratestats: explicitly disabled via build config 00:01:42.286 bpf: explicitly disabled via build config 00:01:42.286 cfgfile: explicitly disabled via build config 00:01:42.286 distributor: explicitly disabled via build config 00:01:42.286 efd: explicitly disabled via build config 00:01:42.286 eventdev: explicitly disabled via build config 00:01:42.286 dispatcher: explicitly disabled via build config 00:01:42.286 gpudev: explicitly disabled via build config 00:01:42.286 gro: explicitly disabled via build config 00:01:42.286 gso: explicitly disabled via build config 00:01:42.286 ip_frag: explicitly disabled via build config 00:01:42.286 jobstats: explicitly disabled via build config 00:01:42.286 latencystats: explicitly disabled via build config 00:01:42.286 lpm: explicitly disabled via build config 00:01:42.286 member: explicitly disabled via build config 00:01:42.286 pcapng: explicitly disabled via build config 00:01:42.286 rawdev: explicitly disabled via build config 00:01:42.286 regexdev: explicitly disabled via build config 00:01:42.286 mldev: explicitly disabled via build config 00:01:42.286 rib: explicitly disabled via build config 00:01:42.286 sched: explicitly disabled via build config 00:01:42.286 stack: explicitly disabled via build config 00:01:42.286 ipsec: explicitly disabled via build config 00:01:42.286 pdcp: explicitly disabled via build config 00:01:42.286 fib: explicitly disabled via build config 
00:01:42.286 port: explicitly disabled via build config 00:01:42.286 pdump: explicitly disabled via build config 00:01:42.286 table: explicitly disabled via build config 00:01:42.286 pipeline: explicitly disabled via build config 00:01:42.286 graph: explicitly disabled via build config 00:01:42.286 node: explicitly disabled via build config 00:01:42.286 00:01:42.286 drivers: 00:01:42.286 common/cpt: not in enabled drivers build config 00:01:42.286 common/dpaax: not in enabled drivers build config 00:01:42.286 common/iavf: not in enabled drivers build config 00:01:42.286 common/idpf: not in enabled drivers build config 00:01:42.286 common/ionic: not in enabled drivers build config 00:01:42.286 common/mvep: not in enabled drivers build config 00:01:42.286 common/octeontx: not in enabled drivers build config 00:01:42.286 bus/cdx: not in enabled drivers build config 00:01:42.286 bus/dpaa: not in enabled drivers build config 00:01:42.286 bus/fslmc: not in enabled drivers build config 00:01:42.286 bus/ifpga: not in enabled drivers build config 00:01:42.286 bus/platform: not in enabled drivers build config 00:01:42.286 bus/uacce: not in enabled drivers build config 00:01:42.286 bus/vmbus: not in enabled drivers build config 00:01:42.286 common/cnxk: not in enabled drivers build config 00:01:42.286 common/nfp: not in enabled drivers build config 00:01:42.286 common/nitrox: not in enabled drivers build config 00:01:42.286 common/sfc_efx: not in enabled drivers build config 00:01:42.286 mempool/bucket: not in enabled drivers build config 00:01:42.286 mempool/cnxk: not in enabled drivers build config 00:01:42.286 mempool/dpaa: not in enabled drivers build config 00:01:42.286 mempool/dpaa2: not in enabled drivers build config 00:01:42.286 mempool/octeontx: not in enabled drivers build config 00:01:42.286 mempool/stack: not in enabled drivers build config 00:01:42.286 dma/cnxk: not in enabled drivers build config 00:01:42.286 dma/dpaa: not in enabled drivers build config 
00:01:42.286 dma/dpaa2: not in enabled drivers build config 00:01:42.286 dma/hisilicon: not in enabled drivers build config 00:01:42.286 dma/idxd: not in enabled drivers build config 00:01:42.286 dma/ioat: not in enabled drivers build config 00:01:42.286 dma/skeleton: not in enabled drivers build config 00:01:42.286 net/af_packet: not in enabled drivers build config 00:01:42.286 net/af_xdp: not in enabled drivers build config 00:01:42.286 net/ark: not in enabled drivers build config 00:01:42.286 net/atlantic: not in enabled drivers build config 00:01:42.286 net/avp: not in enabled drivers build config 00:01:42.286 net/axgbe: not in enabled drivers build config 00:01:42.286 net/bnx2x: not in enabled drivers build config 00:01:42.286 net/bnxt: not in enabled drivers build config 00:01:42.286 net/bonding: not in enabled drivers build config 00:01:42.286 net/cnxk: not in enabled drivers build config 00:01:42.286 net/cpfl: not in enabled drivers build config 00:01:42.286 net/cxgbe: not in enabled drivers build config 00:01:42.286 net/dpaa: not in enabled drivers build config 00:01:42.286 net/dpaa2: not in enabled drivers build config 00:01:42.287 net/e1000: not in enabled drivers build config 00:01:42.287 net/ena: not in enabled drivers build config 00:01:42.287 net/enetc: not in enabled drivers build config 00:01:42.287 net/enetfec: not in enabled drivers build config 00:01:42.287 net/enic: not in enabled drivers build config 00:01:42.287 net/failsafe: not in enabled drivers build config 00:01:42.287 net/fm10k: not in enabled drivers build config 00:01:42.287 net/gve: not in enabled drivers build config 00:01:42.287 net/hinic: not in enabled drivers build config 00:01:42.287 net/hns3: not in enabled drivers build config 00:01:42.287 net/i40e: not in enabled drivers build config 00:01:42.287 net/iavf: not in enabled drivers build config 00:01:42.287 net/ice: not in enabled drivers build config 00:01:42.287 net/idpf: not in enabled drivers build config 00:01:42.287 
net/igc: not in enabled drivers build config 00:01:42.287 net/ionic: not in enabled drivers build config 00:01:42.287 net/ipn3ke: not in enabled drivers build config 00:01:42.287 net/ixgbe: not in enabled drivers build config 00:01:42.287 net/mana: not in enabled drivers build config 00:01:42.287 net/memif: not in enabled drivers build config 00:01:42.287 net/mlx4: not in enabled drivers build config 00:01:42.287 net/mlx5: not in enabled drivers build config 00:01:42.287 net/mvneta: not in enabled drivers build config 00:01:42.287 net/mvpp2: not in enabled drivers build config 00:01:42.287 net/netvsc: not in enabled drivers build config 00:01:42.287 net/nfb: not in enabled drivers build config 00:01:42.287 net/nfp: not in enabled drivers build config 00:01:42.287 net/ngbe: not in enabled drivers build config 00:01:42.287 net/null: not in enabled drivers build config 00:01:42.287 net/octeontx: not in enabled drivers build config 00:01:42.287 net/octeon_ep: not in enabled drivers build config 00:01:42.287 net/pcap: not in enabled drivers build config 00:01:42.287 net/pfe: not in enabled drivers build config 00:01:42.287 net/qede: not in enabled drivers build config 00:01:42.287 net/ring: not in enabled drivers build config 00:01:42.287 net/sfc: not in enabled drivers build config 00:01:42.287 net/softnic: not in enabled drivers build config 00:01:42.287 net/tap: not in enabled drivers build config 00:01:42.287 net/thunderx: not in enabled drivers build config 00:01:42.287 net/txgbe: not in enabled drivers build config 00:01:42.287 net/vdev_netvsc: not in enabled drivers build config 00:01:42.287 net/vhost: not in enabled drivers build config 00:01:42.287 net/virtio: not in enabled drivers build config 00:01:42.287 net/vmxnet3: not in enabled drivers build config 00:01:42.287 raw/*: missing internal dependency, "rawdev" 00:01:42.287 crypto/armv8: not in enabled drivers build config 00:01:42.287 crypto/bcmfs: not in enabled drivers build config 00:01:42.287 
crypto/caam_jr: not in enabled drivers build config 00:01:42.287 crypto/ccp: not in enabled drivers build config 00:01:42.287 crypto/cnxk: not in enabled drivers build config 00:01:42.287 crypto/dpaa_sec: not in enabled drivers build config 00:01:42.287 crypto/dpaa2_sec: not in enabled drivers build config 00:01:42.287 crypto/mvsam: not in enabled drivers build config 00:01:42.287 crypto/nitrox: not in enabled drivers build config 00:01:42.287 crypto/null: not in enabled drivers build config 00:01:42.287 crypto/octeontx: not in enabled drivers build config 00:01:42.287 crypto/openssl: not in enabled drivers build config 00:01:42.287 crypto/scheduler: not in enabled drivers build config 00:01:42.287 crypto/uadk: not in enabled drivers build config 00:01:42.287 crypto/virtio: not in enabled drivers build config 00:01:42.287 compress/nitrox: not in enabled drivers build config 00:01:42.287 compress/octeontx: not in enabled drivers build config 00:01:42.287 compress/zlib: not in enabled drivers build config 00:01:42.287 regex/*: missing internal dependency, "regexdev" 00:01:42.287 ml/*: missing internal dependency, "mldev" 00:01:42.287 vdpa/ifc: not in enabled drivers build config 00:01:42.287 vdpa/mlx5: not in enabled drivers build config 00:01:42.287 vdpa/nfp: not in enabled drivers build config 00:01:42.287 vdpa/sfc: not in enabled drivers build config 00:01:42.287 event/*: missing internal dependency, "eventdev" 00:01:42.287 baseband/*: missing internal dependency, "bbdev" 00:01:42.287 gpu/*: missing internal dependency, "gpudev" 00:01:42.287 00:01:42.287 00:01:42.856 Build targets in project: 115 00:01:42.856 00:01:42.856 DPDK 24.03.0 00:01:42.856 00:01:42.856 User defined options 00:01:42.856 buildtype : debug 00:01:42.856 default_library : shared 00:01:42.856 libdir : lib 00:01:42.856 prefix : /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:01:42.856 c_args : -Wno-stringop-overflow -fcommon -Wno-stringop-overread -Wno-array-bounds 
-I/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -DNO_COMPAT_IMB_API_053 -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l -I/var/jenkins/workspace/crypto-phy-autotest/spdk/isalbuild -fPIC -Werror 00:01:42.856 c_link_args : -L/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib -L/var/jenkins/workspace/crypto-phy-autotest/spdk/isa-l/.libs -lisal 00:01:42.856 cpu_instruction_set: native 00:01:42.856 disable_apps : test-sad,graph,test-regex,dumpcap,test-eventdev,test-compress-perf,pdump,test-security-perf,test-pmd,test-flow-perf,test-pipeline,test-crypto-perf,test-gpudev,test-cmdline,test-dma-perf,proc-info,test-bbdev,test-acl,test,test-mldev,test-fib 00:01:42.856 disable_libs : sched,port,dispatcher,graph,rawdev,pdcp,bitratestats,ipsec,pcapng,pdump,gso,cfgfile,gpudev,ip_frag,node,distributor,mldev,lpm,acl,bpf,latencystats,eventdev,regexdev,gro,stack,fib,argparse,pipeline,bbdev,table,metrics,member,jobstats,efd,rib 00:01:42.856 enable_docs : false 00:01:42.856 enable_drivers : bus,bus/pci,bus/vdev,mempool/ring,crypto/qat,compress/qat,common/qat,common/mlx5,bus/auxiliary,crypto,crypto/aesni_mb,crypto/mlx5,crypto/ipsec_mb,compress,compress/isal,compress/mlx5 00:01:42.856 enable_kmods : false 00:01:42.856 tests : false 00:01:42.856 00:01:42.856 Found ninja-1.11.1.git.kitware.jobserver-1 at /usr/local/bin/ninja 00:01:43.430 ninja: Entering directory `/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp' 00:01:43.430 [1/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hypervisor.c.o 00:01:43.430 [2/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_errno.c.o 00:01:43.430 [3/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_version.c.o 00:01:43.430 [4/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_cpuflags.c.o 00:01:43.431 [5/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_spinlock.c.o 00:01:43.431 [6/378] Compiling C object 
lib/librte_eal.a.p/eal_common_rte_reciprocal.c.o 00:01:43.431 [7/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_hypervisor.c.o 00:01:43.431 [8/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cpuflags.c.o 00:01:43.431 [9/378] Compiling C object lib/librte_log.a.p/log_log_linux.c.o 00:01:43.431 [10/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_hexdump.c.o 00:01:43.431 [11/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_data.c.o 00:01:43.692 [12/378] Compiling C object lib/librte_kvargs.a.p/kvargs_rte_kvargs.c.o 00:01:43.692 [13/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_debug.c.o 00:01:43.692 [14/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_uuid.c.o 00:01:43.692 [15/378] Linking static target lib/librte_kvargs.a 00:01:43.692 [16/378] Compiling C object lib/librte_log.a.p/log_log.c.o 00:01:43.692 [17/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_string_fns.c.o 00:01:43.692 [18/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_cpuflags.c.o 00:01:43.692 [19/378] Linking static target lib/librte_log.a 00:01:43.692 [20/378] Compiling C object lib/librte_pci.a.p/pci_rte_pci.c.o 00:01:43.692 [21/378] Linking static target lib/librte_pci.a 00:01:43.692 [22/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline.c.o 00:01:43.692 [23/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_cirbuf.c.o 00:01:43.957 [24/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse.c.o 00:01:43.957 [25/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_num.c.o 00:01:43.957 [26/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_portlist.c.o 00:01:43.957 [27/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_string.c.o 00:01:43.957 [28/378] Generating lib/kvargs.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.219 [29/378] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_dynmem.c.o 00:01:44.219 [30/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio_mp_sync.c.o 00:01:44.219 [31/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memalloc.c.o 00:01:44.219 [32/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_timer.c.o 00:01:44.219 [33/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_socket.c.o 00:01:44.219 [34/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_pool_ops.c.o 00:01:44.219 [35/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_utils.c.o 00:01:44.219 [36/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_rdline.c.o 00:01:44.219 [37/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_class.c.o 00:01:44.219 [38/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_debug.c.o 00:01:44.219 [39/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_mcfg.c.o 00:01:44.219 [40/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_config.c.o 00:01:44.219 [41/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_os_unix.c.o 00:01:44.219 [42/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_timer.c.o 00:01:44.219 [43/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_keepalive.c.o 00:01:44.219 [44/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_timer.c.o 00:01:44.219 [45/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_dyn.c.o 00:01:44.219 [46/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_random.c.o 00:01:44.219 [47/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_file.c.o 00:01:44.219 [48/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_lcore.c.o 00:01:44.219 [49/378] Generating lib/pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.219 [50/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry_legacy.c.o 00:01:44.219 [51/378] Compiling C object 
lib/librte_eal.a.p/eal_common_eal_common_tailqs.c.o 00:01:44.219 [52/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_bus.c.o 00:01:44.219 [53/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_firmware.c.o 00:01:44.219 [54/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_launch.c.o 00:01:44.219 [55/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_filesystem.c.o 00:01:44.219 [56/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_vt100.c.o 00:01:44.219 [57/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace_ctf.c.o 00:01:44.219 [58/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_thread.c.o 00:01:44.219 [59/378] Compiling C object lib/librte_hash.a.p/hash_rte_hash_crc.c.o 00:01:44.219 [60/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_cycles.c.o 00:01:44.219 [61/378] Compiling C object lib/librte_meter.a.p/meter_rte_meter.c.o 00:01:44.219 [62/378] Compiling C object lib/librte_net.a.p/net_rte_net_crc.c.o 00:01:44.219 [63/378] Compiling C object lib/librte_net.a.p/net_net_crc_sse.c.o 00:01:44.219 [64/378] Compiling C object lib/librte_eal.a.p/eal_unix_rte_thread.c.o 00:01:44.219 [65/378] Compiling C object lib/librte_eal.a.p/eal_unix_eal_unix_memory.c.o 00:01:44.219 [66/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_elem.c.o 00:01:44.219 [67/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_thread.c.o 00:01:44.219 [68/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_devargs.c.o 00:01:44.219 [69/378] Compiling C object lib/net/libnet_crc_avx512_lib.a.p/net_crc_avx512.c.o 00:01:44.219 [70/378] Linking static target lib/librte_meter.a 00:01:44.219 [71/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_dev.c.o 00:01:44.219 [72/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_lcore.c.o 00:01:44.219 [73/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_interrupts.c.o 00:01:44.219 [74/378] Compiling 
C object lib/librte_eal.a.p/eal_common_eal_common_trace_points.c.o 00:01:44.219 [75/378] Compiling C object lib/librte_eal.a.p/eal_common_hotplug_mp.c.o 00:01:44.219 [76/378] Compiling C object lib/librte_eal.a.p/eal_x86_rte_power_intrinsics.c.o 00:01:44.219 [77/378] Linking static target lib/net/libnet_crc_avx512_lib.a 00:01:44.219 [78/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_mp.c.o 00:01:44.219 [79/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf_ptype.c.o 00:01:44.219 [80/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memzone.c.o 00:01:44.219 [81/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_thread.c.o 00:01:44.219 [82/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_alarm.c.o 00:01:44.219 [83/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops_default.c.o 00:01:44.219 [84/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_trace.c.o 00:01:44.219 [85/378] Compiling C object lib/librte_ring.a.p/ring_rte_ring.c.o 00:01:44.219 [86/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_dev.c.o 00:01:44.219 [87/378] Compiling C object lib/librte_mempool.a.p/mempool_mempool_trace_points.c.o 00:01:44.219 [88/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_class_eth.c.o 00:01:44.219 [89/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool_ops.c.o 00:01:44.219 [90/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8079.c.o 00:01:44.219 [91/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_hugepage_info.c.o 00:01:44.219 [92/378] Linking static target lib/librte_ring.a 00:01:44.219 [93/378] Compiling C object lib/librte_telemetry.a.p/telemetry_telemetry.c.o 00:01:44.219 [94/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_fbarray.c.o 00:01:44.482 [95/378] Compiling C object lib/librte_net.a.p/net_rte_ether.c.o 00:01:44.482 [96/378] Linking static target lib/librte_telemetry.a 00:01:44.482 [97/378] Compiling C object 
lib/librte_eal.a.p/eal_common_rte_malloc.c.o 00:01:44.482 [98/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8636.c.o 00:01:44.482 [99/378] Compiling C object lib/librte_eal.a.p/eal_common_malloc_heap.c.o 00:01:44.482 [100/378] Compiling C object lib/librte_eal.a.p/eal_common_rte_service.c.o 00:01:44.482 [101/378] Compiling C object lib/librte_net.a.p/net_rte_net.c.o 00:01:44.482 [102/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_proc.c.o 00:01:44.482 [103/378] Compiling C object lib/librte_power.a.p/power_power_common.c.o 00:01:44.482 [104/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_memory.c.o 00:01:44.482 [105/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_etheraddr.c.o 00:01:44.482 [106/378] Compiling C object drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev_params.c.o 00:01:44.482 [107/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal.c.o 00:01:44.482 [108/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_params.c.o 00:01:44.482 [109/378] Compiling C object lib/librte_power.a.p/power_power_kvm_vm.c.o 00:01:44.482 [110/378] Compiling C object lib/librte_net.a.p/net_rte_arp.c.o 00:01:44.482 [111/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_telemetry.c.o 00:01:44.482 [112/378] Compiling C object lib/librte_power.a.p/power_guest_channel.c.o 00:01:44.482 [113/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memalloc.c.o 00:01:44.482 [114/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_private.c.o 00:01:44.482 [115/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_vfio.c.o 00:01:44.482 [116/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_8472.c.o 00:01:44.482 [117/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_params.c.o 00:01:44.482 [118/378] Compiling C object lib/librte_vhost.a.p/vhost_fd_man.c.o 00:01:44.482 [119/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_profile.c.o 
00:01:44.482 [120/378] Linking static target lib/librte_net.a 00:01:44.482 [121/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_memory.c.o 00:01:44.482 [122/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_logs.c.o 00:01:44.482 [123/378] Compiling C object lib/librte_eal.a.p/eal_linux_eal_interrupts.c.o 00:01:44.482 [124/378] Compiling C object lib/librte_mempool.a.p/mempool_rte_mempool.c.o 00:01:44.482 [125/378] Compiling C object lib/librte_eal.a.p/eal_common_eal_common_options.c.o 00:01:44.482 [126/378] Compiling C object lib/librte_rcu.a.p/rcu_rte_rcu_qsbr.c.o 00:01:44.482 [127/378] Compiling C object lib/librte_ethdev.a.p/ethdev_sff_common.c.o 00:01:44.482 [128/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_linux_ethtool.c.o 00:01:44.482 [129/378] Compiling C object lib/librte_cmdline.a.p/cmdline_cmdline_parse_ipaddr.c.o 00:01:44.482 [130/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_cman.c.o 00:01:44.482 [131/378] Linking static target lib/librte_mempool.a 00:01:44.482 [132/378] Linking static target lib/librte_rcu.a 00:01:44.482 [133/378] Linking static target lib/librte_cmdline.a 00:01:44.740 [134/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_driver.c.o 00:01:44.740 [135/378] Generating lib/log.sym_chk with a custom command (wrapped by meson to capture output) 00:01:44.740 [136/378] Linking static target lib/librte_eal.a 00:01:44.740 [137/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev_telemetry.c.o 00:01:44.740 [138/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_mtr.c.o 00:01:44.740 [139/378] Linking target lib/librte_log.so.24.1 00:01:44.740 [140/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_tm.c.o 00:01:44.740 [141/378] Compiling C object lib/librte_mbuf.a.p/mbuf_rte_mbuf.c.o 00:01:44.740 [142/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash_gfni.c.o 00:01:45.000 [143/378] Linking static target lib/librte_mbuf.a 00:01:45.000 
[144/378] Compiling C object lib/librte_timer.a.p/timer_rte_timer.c.o 00:01:45.000 [145/378] Linking static target lib/librte_timer.a 00:01:45.000 [146/378] Generating lib/meter.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.000 [147/378] Compiling C object lib/librte_hash.a.p/hash_rte_thash.c.o 00:01:45.000 [148/378] Compiling C object lib/librte_hash.a.p/hash_rte_fbk_hash.c.o 00:01:45.000 [149/378] Compiling C object lib/librte_power.a.p/power_power_acpi_cpufreq.c.o 00:01:45.000 [150/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev_pmd.c.o 00:01:45.000 [151/378] Generating lib/ring.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.000 [152/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_glue.c.o 00:01:45.000 [153/378] Compiling C object lib/librte_power.a.p/power_rte_power_uncore.c.o 00:01:45.000 [154/378] Generating symbol file lib/librte_log.so.24.1.p/librte_log.so.24.1.symbols 00:01:45.000 [155/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_pmd.c.o 00:01:45.000 [156/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_auxiliary_common.c.o 00:01:45.000 [157/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev_trace_points.c.o 00:01:45.000 [158/378] Compiling C object lib/librte_power.a.p/power_power_cppc_cpufreq.c.o 00:01:45.000 [159/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_compressdev.c.o 00:01:45.000 [160/378] Compiling C object lib/librte_power.a.p/power_power_intel_uncore.c.o 00:01:45.000 [161/378] Compiling C object lib/librte_power.a.p/power_power_amd_pstate_cpufreq.c.o 00:01:45.000 [162/378] Compiling C object lib/librte_reorder.a.p/reorder_rte_reorder.c.o 00:01:45.000 [163/378] Linking target lib/librte_kvargs.so.24.1 00:01:45.000 [164/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common_uio.c.o 00:01:45.000 [165/378] Compiling C object 
drivers/libtmp_rte_bus_vdev.a.p/bus_vdev_vdev.c.o 00:01:45.000 [166/378] Generating lib/net.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.000 [167/378] Linking static target lib/librte_reorder.a 00:01:45.000 [168/378] Linking static target drivers/libtmp_rte_bus_vdev.a 00:01:45.000 [169/378] Compiling C object lib/librte_ethdev.a.p/ethdev_ethdev_trace_points.c.o 00:01:45.000 [170/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_pci_common.c.o 00:01:45.259 [171/378] Compiling C object lib/librte_power.a.p/power_rte_power.c.o 00:01:45.259 [172/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_cryptodev_trace_points.c.o 00:01:45.259 [173/378] Compiling C object lib/librte_dmadev.a.p/dmadev_rte_dmadev.c.o 00:01:45.259 [174/378] Compiling C object drivers/libtmp_rte_bus_auxiliary.a.p/bus_auxiliary_linux_auxiliary.c.o 00:01:45.259 [175/378] Generating lib/rcu.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.259 [176/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci.c.o 00:01:45.259 [177/378] Compiling C object lib/librte_compressdev.a.p/compressdev_rte_comp.c.o 00:01:45.259 [178/378] Linking static target lib/librte_dmadev.a 00:01:45.259 [179/378] Linking static target drivers/libtmp_rte_bus_auxiliary.a 00:01:45.259 [180/378] Linking static target lib/librte_compressdev.a 00:01:45.259 [181/378] Compiling C object lib/librte_power.a.p/power_rte_power_pmd_mgmt.c.o 00:01:45.259 [182/378] Compiling C object lib/librte_vhost.a.p/vhost_vdpa.c.o 00:01:45.259 [183/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_uio.c.o 00:01:45.259 [184/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_pf2vf.c.o 00:01:45.259 [185/378] Compiling C object lib/librte_power.a.p/power_power_pstate_cpufreq.c.o 00:01:45.259 [186/378] Generating lib/telemetry.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.259 [187/378] 
Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_common.c.o 00:01:45.259 [188/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_flow.c.o 00:01:45.259 [189/378] Linking static target lib/librte_power.a 00:01:45.259 [190/378] Compiling C object lib/librte_vhost.a.p/vhost_iotlb.c.o 00:01:45.259 [191/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen1.c.o 00:01:45.259 [192/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen2.c.o 00:01:45.259 [193/378] Compiling C object lib/librte_hash.a.p/hash_rte_cuckoo_hash.c.o 00:01:45.259 [194/378] Linking target lib/librte_telemetry.so.24.1 00:01:45.259 [195/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net_ctrl.c.o 00:01:45.259 [196/378] Linking static target lib/librte_hash.a 00:01:45.259 [197/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen5.c.o 00:01:45.259 [198/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_nl.c.o 00:01:45.259 [199/378] Compiling C object lib/librte_security.a.p/security_rte_security.c.o 00:01:45.259 [200/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_malloc.c.o 00:01:45.259 [201/378] Generating symbol file lib/librte_kvargs.so.24.1.p/librte_kvargs.so.24.1.symbols 00:01:45.259 [202/378] Linking static target lib/librte_security.a 00:01:45.259 [203/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen2.c.o 00:01:45.259 [204/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen1.c.o 00:01:45.259 [205/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen3.c.o 00:01:45.518 [206/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen3.c.o 00:01:45.518 [207/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_verbs.c.o 
00:01:45.518 [208/378] Compiling C object drivers/libtmp_rte_bus_pci.a.p/bus_pci_linux_pci_vfio.c.o 00:01:45.518 [209/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_dev_qat_comp_pmd_gen4.c.o 00:01:45.518 [210/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen_lce.c.o 00:01:45.518 [211/378] Linking static target drivers/libtmp_rte_bus_pci.a 00:01:45.518 [212/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_pci.c.o 00:01:45.518 [213/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp_pmd.c.o 00:01:45.518 [214/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen5.c.o 00:01:45.518 [215/378] Generating lib/timer.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.518 [216/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common.c.o 00:01:45.518 [217/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mp.c.o 00:01:45.518 [218/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_dev_qat_dev_gen4.c.o 00:01:45.518 [219/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_devx.c.o 00:01:45.518 [220/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym.c.o 00:01:45.518 [221/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen5.c.o 00:01:45.518 [222/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_utils.c.o 00:01:45.518 [223/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_device.c.o 00:01:45.518 [224/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_asym_pmd_gen1.c.o 00:01:45.518 [225/378] Generating drivers/rte_bus_vdev.pmd.c with a custom command 00:01:45.518 [226/378] Compiling C object lib/librte_vhost.a.p/vhost_vduse.c.o 00:01:45.518 [227/378] Compiling 
C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_auxiliary.c.o 00:01:45.518 [228/378] Compiling C object lib/librte_vhost.a.p/vhost_socket.c.o 00:01:45.518 [229/378] Generating drivers/rte_bus_auxiliary.pmd.c with a custom command 00:01:45.518 [230/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_private.c.o 00:01:45.518 [231/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_crypto.c.o 00:01:45.518 [232/378] Compiling C object drivers/librte_bus_vdev.a.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:45.518 [233/378] Compiling C object drivers/librte_bus_vdev.so.24.1.p/meson-generated_.._rte_bus_vdev.pmd.c.o 00:01:45.518 [234/378] Linking static target drivers/librte_bus_vdev.a 00:01:45.518 [235/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen2.c.o 00:01:45.518 [236/378] Compiling C object drivers/librte_bus_auxiliary.so.24.1.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:01:45.518 [237/378] Compiling C object drivers/librte_bus_auxiliary.a.p/meson-generated_.._rte_bus_auxiliary.pmd.c.o 00:01:45.519 [238/378] Linking static target drivers/librte_bus_auxiliary.a 00:01:45.519 [239/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_linux_mlx5_common_os.c.o 00:01:45.519 [240/378] Generating symbol file lib/librte_telemetry.so.24.1.p/librte_telemetry.so.24.1.symbols 00:01:45.519 [241/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/common_qat_qat_qp.c.o 00:01:45.519 [242/378] Generating lib/reorder.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.519 [243/378] Compiling C object drivers/libtmp_rte_mempool_ring.a.p/mempool_ring_rte_mempool_ring.c.o 00:01:45.519 [244/378] Linking static target drivers/libtmp_rte_mempool_ring.a 00:01:45.519 [245/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen_lce.c.o 00:01:45.519 [246/378] Compiling C object 
drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_common_mr.c.o 00:01:45.519 [247/378] Generating lib/mempool.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.779 [248/378] Generating lib/mbuf.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.779 [249/378] Generating drivers/rte_bus_pci.pmd.c with a custom command 00:01:45.779 [250/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_ipsec_mb_ops.c.o 00:01:45.779 [251/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost.c.o 00:01:45.779 [252/378] Compiling C object drivers/librte_bus_pci.a.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:45.779 [253/378] Compiling C object drivers/librte_bus_pci.so.24.1.p/meson-generated_.._rte_bus_pci.pmd.c.o 00:01:45.779 [254/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_kasumi.c.o 00:01:45.779 [255/378] Compiling C object drivers/libtmp_rte_compress_mlx5.a.p/compress_mlx5_mlx5_compress.c.o 00:01:45.779 [256/378] Linking static target drivers/librte_bus_pci.a 00:01:45.779 [257/378] Linking static target drivers/libtmp_rte_compress_mlx5.a 00:01:45.779 [258/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen4.c.o 00:01:45.779 [259/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_user.c.o 00:01:45.779 [260/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/compress_qat_qat_comp.c.o 00:01:45.779 [261/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd_ops.c.o 00:01:45.779 [262/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_dek.c.o 00:01:45.779 [263/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto.c.o 00:01:45.779 [264/378] Generating lib/dmadev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.779 [265/378] Compiling C object 
drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_xts.c.o 00:01:45.779 [266/378] Generating drivers/rte_mempool_ring.pmd.c with a custom command 00:01:45.779 [267/378] Compiling C object lib/librte_cryptodev.a.p/cryptodev_rte_cryptodev.c.o 00:01:45.779 [268/378] Generating drivers/rte_bus_auxiliary.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.779 [269/378] Generating lib/compressdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.779 [270/378] Linking static target lib/librte_cryptodev.a 00:01:45.780 [271/378] Compiling C object drivers/librte_mempool_ring.so.24.1.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:45.780 [272/378] Generating drivers/rte_bus_vdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.780 [273/378] Compiling C object drivers/librte_mempool_ring.a.p/meson-generated_.._rte_mempool_ring.pmd.c.o 00:01:45.780 [274/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_chacha_poly.c.o 00:01:45.780 [275/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_gcm.c.o 00:01:45.780 [276/378] Compiling C object drivers/libtmp_rte_crypto_mlx5.a.p/crypto_mlx5_mlx5_crypto_gcm.c.o 00:01:45.780 [277/378] Linking static target drivers/librte_mempool_ring.a 00:01:45.780 [278/378] Generating lib/cmdline.sym_chk with a custom command (wrapped by meson to capture output) 00:01:45.780 [279/378] Linking static target drivers/libtmp_rte_crypto_mlx5.a 00:01:45.780 [280/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_zuc.c.o 00:01:45.780 [281/378] Compiling C object drivers/libtmp_rte_compress_isal.a.p/compress_isal_isal_compress_pmd.c.o 00:01:46.038 [282/378] Linking static target drivers/libtmp_rte_compress_isal.a 00:01:46.038 [283/378] Generating lib/security.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.038 [284/378] Compiling C object 
drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_sym_session.c.o 00:01:46.038 [285/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_sym_pmd_gen1.c.o 00:01:46.038 [286/378] Generating drivers/rte_compress_mlx5.pmd.c with a custom command 00:01:46.038 [287/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_aesni_mb.c.o 00:01:46.038 [288/378] Compiling C object drivers/librte_compress_mlx5.so.24.1.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:01:46.038 [289/378] Compiling C object drivers/librte_compress_mlx5.a.p/meson-generated_.._rte_compress_mlx5.pmd.c.o 00:01:46.038 [290/378] Linking static target drivers/librte_compress_mlx5.a 00:01:46.038 [291/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_dev_qat_crypto_pmd_gen3.c.o 00:01:46.038 [292/378] Generating lib/power.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.038 [293/378] Generating drivers/rte_crypto_mlx5.pmd.c with a custom command 00:01:46.038 [294/378] Generating lib/hash.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.038 [295/378] Compiling C object drivers/libtmp_rte_crypto_ipsec_mb.a.p/crypto_ipsec_mb_pmd_snow3g.c.o 00:01:46.038 [296/378] Generating drivers/rte_compress_isal.pmd.c with a custom command 00:01:46.038 [297/378] Compiling C object drivers/libtmp_rte_common_mlx5.a.p/common_mlx5_mlx5_devx_cmds.c.o 00:01:46.038 [298/378] Compiling C object drivers/librte_crypto_mlx5.a.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:01:46.038 [299/378] Linking static target drivers/libtmp_rte_crypto_ipsec_mb.a 00:01:46.038 [300/378] Compiling C object drivers/librte_crypto_mlx5.so.24.1.p/meson-generated_.._rte_crypto_mlx5.pmd.c.o 00:01:46.295 [301/378] Linking static target drivers/librte_crypto_mlx5.a 00:01:46.295 [302/378] Compiling C object drivers/librte_compress_isal.so.24.1.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:01:46.295 [303/378] Linking static target 
drivers/libtmp_rte_common_mlx5.a 00:01:46.295 [304/378] Compiling C object drivers/librte_compress_isal.a.p/meson-generated_.._rte_compress_isal.pmd.c.o 00:01:46.295 [305/378] Linking static target drivers/librte_compress_isal.a 00:01:46.295 [306/378] Compiling C object lib/librte_ethdev.a.p/ethdev_rte_ethdev.c.o 00:01:46.296 [307/378] Linking static target lib/librte_ethdev.a 00:01:46.553 [308/378] Generating drivers/rte_crypto_ipsec_mb.pmd.c with a custom command 00:01:46.553 [309/378] Generating drivers/rte_common_mlx5.pmd.c with a custom command 00:01:46.553 [310/378] Generating drivers/rte_bus_pci.sym_chk with a custom command (wrapped by meson to capture output) 00:01:46.553 [311/378] Compiling C object drivers/librte_crypto_ipsec_mb.a.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:01:46.553 [312/378] Compiling C object drivers/librte_crypto_ipsec_mb.so.24.1.p/meson-generated_.._rte_crypto_ipsec_mb.pmd.c.o 00:01:46.553 [313/378] Linking static target drivers/librte_crypto_ipsec_mb.a 00:01:46.553 [314/378] Compiling C object drivers/librte_common_mlx5.a.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:01:46.553 [315/378] Compiling C object drivers/librte_common_mlx5.so.24.1.p/meson-generated_.._rte_common_mlx5.pmd.c.o 00:01:46.553 [316/378] Linking static target drivers/librte_common_mlx5.a 00:01:46.553 [317/378] Compiling C object lib/librte_vhost.a.p/vhost_vhost_crypto.c.o 00:01:46.815 [318/378] Compiling C object drivers/libtmp_rte_common_qat.a.p/crypto_qat_qat_asym.c.o 00:01:46.815 [319/378] Linking static target drivers/libtmp_rte_common_qat.a 00:01:47.073 [320/378] Generating drivers/rte_common_qat.pmd.c with a custom command 00:01:47.331 [321/378] Compiling C object drivers/librte_common_qat.a.p/meson-generated_.._rte_common_qat.pmd.c.o 00:01:47.331 [322/378] Compiling C object drivers/librte_common_qat.so.24.1.p/meson-generated_.._rte_common_qat.pmd.c.o 00:01:47.331 [323/378] Linking static target drivers/librte_common_qat.a 00:01:47.589 [324/378] 
Generating lib/cryptodev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:48.155 [325/378] Compiling C object lib/librte_vhost.a.p/vhost_virtio_net.c.o 00:01:48.155 [326/378] Linking static target lib/librte_vhost.a 00:01:50.054 [327/378] Generating lib/vhost.sym_chk with a custom command (wrapped by meson to capture output) 00:01:51.427 [328/378] Generating drivers/rte_common_mlx5.sym_chk with a custom command (wrapped by meson to capture output) 00:01:53.957 [329/378] Generating lib/ethdev.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.933 [330/378] Generating lib/eal.sym_chk with a custom command (wrapped by meson to capture output) 00:01:54.933 [331/378] Linking target lib/librte_eal.so.24.1 00:01:54.933 [332/378] Generating symbol file lib/librte_eal.so.24.1.p/librte_eal.so.24.1.symbols 00:01:54.933 [333/378] Linking target lib/librte_ring.so.24.1 00:01:54.933 [334/378] Linking target lib/librte_meter.so.24.1 00:01:54.933 [335/378] Linking target lib/librte_timer.so.24.1 00:01:54.933 [336/378] Linking target drivers/librte_bus_vdev.so.24.1 00:01:54.933 [337/378] Linking target lib/librte_pci.so.24.1 00:01:54.933 [338/378] Linking target lib/librte_dmadev.so.24.1 00:01:54.933 [339/378] Linking target drivers/librte_bus_auxiliary.so.24.1 00:01:55.190 [340/378] Generating symbol file lib/librte_dmadev.so.24.1.p/librte_dmadev.so.24.1.symbols 00:01:55.190 [341/378] Generating symbol file drivers/librte_bus_vdev.so.24.1.p/librte_bus_vdev.so.24.1.symbols 00:01:55.190 [342/378] Generating symbol file drivers/librte_bus_auxiliary.so.24.1.p/librte_bus_auxiliary.so.24.1.symbols 00:01:55.190 [343/378] Generating symbol file lib/librte_timer.so.24.1.p/librte_timer.so.24.1.symbols 00:01:55.190 [344/378] Generating symbol file lib/librte_meter.so.24.1.p/librte_meter.so.24.1.symbols 00:01:55.190 [345/378] Generating symbol file lib/librte_ring.so.24.1.p/librte_ring.so.24.1.symbols 00:01:55.190 [346/378] Generating symbol 
file lib/librte_pci.so.24.1.p/librte_pci.so.24.1.symbols 00:01:55.190 [347/378] Linking target lib/librte_rcu.so.24.1 00:01:55.190 [348/378] Linking target lib/librte_mempool.so.24.1 00:01:55.190 [349/378] Linking target drivers/librte_bus_pci.so.24.1 00:01:55.448 [350/378] Generating symbol file lib/librte_rcu.so.24.1.p/librte_rcu.so.24.1.symbols 00:01:55.448 [351/378] Generating symbol file drivers/librte_bus_pci.so.24.1.p/librte_bus_pci.so.24.1.symbols 00:01:55.448 [352/378] Generating symbol file lib/librte_mempool.so.24.1.p/librte_mempool.so.24.1.symbols 00:01:55.448 [353/378] Linking target lib/librte_mbuf.so.24.1 00:01:55.448 [354/378] Linking target drivers/librte_mempool_ring.so.24.1 00:01:55.706 [355/378] Generating symbol file lib/librte_mbuf.so.24.1.p/librte_mbuf.so.24.1.symbols 00:01:55.706 [356/378] Linking target lib/librte_reorder.so.24.1 00:01:55.706 [357/378] Linking target lib/librte_compressdev.so.24.1 00:01:55.706 [358/378] Linking target lib/librte_net.so.24.1 00:01:55.706 [359/378] Linking target lib/librte_cryptodev.so.24.1 00:01:55.706 [360/378] Generating symbol file lib/librte_cryptodev.so.24.1.p/librte_cryptodev.so.24.1.symbols 00:01:55.706 [361/378] Generating symbol file lib/librte_net.so.24.1.p/librte_net.so.24.1.symbols 00:01:55.706 [362/378] Generating symbol file lib/librte_compressdev.so.24.1.p/librte_compressdev.so.24.1.symbols 00:01:55.964 [363/378] Linking target lib/librte_security.so.24.1 00:01:55.964 [364/378] Linking target lib/librte_cmdline.so.24.1 00:01:55.964 [365/378] Linking target lib/librte_hash.so.24.1 00:01:55.964 [366/378] Linking target drivers/librte_compress_isal.so.24.1 00:01:55.964 [367/378] Linking target lib/librte_ethdev.so.24.1 00:01:55.964 [368/378] Generating symbol file lib/librte_hash.so.24.1.p/librte_hash.so.24.1.symbols 00:01:55.964 [369/378] Generating symbol file lib/librte_security.so.24.1.p/librte_security.so.24.1.symbols 00:01:55.964 [370/378] Generating symbol file 
lib/librte_ethdev.so.24.1.p/librte_ethdev.so.24.1.symbols 00:01:56.222 [371/378] Linking target lib/librte_power.so.24.1 00:01:56.222 [372/378] Linking target lib/librte_vhost.so.24.1 00:01:56.222 [373/378] Linking target drivers/librte_common_mlx5.so.24.1 00:01:56.222 [374/378] Generating symbol file drivers/librte_common_mlx5.so.24.1.p/librte_common_mlx5.so.24.1.symbols 00:01:56.222 [375/378] Linking target drivers/librte_crypto_mlx5.so.24.1 00:01:56.222 [376/378] Linking target drivers/librte_compress_mlx5.so.24.1 00:01:56.481 [377/378] Linking target drivers/librte_crypto_ipsec_mb.so.24.1 00:01:56.481 [378/378] Linking target drivers/librte_common_qat.so.24.1 00:01:56.481 INFO: autodetecting backend as ninja 00:01:56.481 INFO: calculating backend command to run: /usr/local/bin/ninja -C /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build-tmp -j 96 00:01:57.858 CC lib/ut/ut.o 00:01:57.858 CC lib/log/log.o 00:01:57.858 CC lib/log/log_flags.o 00:01:57.858 CC lib/log/log_deprecated.o 00:01:57.858 CC lib/ut_mock/mock.o 00:01:57.858 LIB libspdk_ut_mock.a 00:01:57.858 LIB libspdk_ut.a 00:01:57.858 LIB libspdk_log.a 00:01:57.858 SO libspdk_ut_mock.so.6.0 00:01:57.858 SO libspdk_ut.so.2.0 00:01:57.859 SO libspdk_log.so.7.0 00:01:57.859 SYMLINK libspdk_ut_mock.so 00:01:57.859 SYMLINK libspdk_ut.so 00:01:57.859 SYMLINK libspdk_log.so 00:01:58.118 CXX lib/trace_parser/trace.o 00:01:58.118 CC lib/util/base64.o 00:01:58.118 CC lib/util/bit_array.o 00:01:58.118 CC lib/util/cpuset.o 00:01:58.118 CC lib/util/crc16.o 00:01:58.118 CC lib/util/crc32.o 00:01:58.118 CC lib/util/crc32c.o 00:01:58.118 CC lib/util/crc32_ieee.o 00:01:58.118 CC lib/util/crc64.o 00:01:58.118 CC lib/ioat/ioat.o 00:01:58.118 CC lib/util/dif.o 00:01:58.118 CC lib/util/fd.o 00:01:58.118 CC lib/util/file.o 00:01:58.118 CC lib/util/hexlify.o 00:01:58.118 CC lib/util/iov.o 00:01:58.118 CC lib/util/math.o 00:01:58.118 CC lib/util/pipe.o 00:01:58.118 CC lib/dma/dma.o 00:01:58.118 CC lib/util/strerror_tls.o 
00:01:58.118 CC lib/util/string.o 00:01:58.118 CC lib/util/uuid.o 00:01:58.118 CC lib/util/fd_group.o 00:01:58.118 CC lib/util/xor.o 00:01:58.118 CC lib/util/zipf.o 00:01:58.376 CC lib/vfio_user/host/vfio_user_pci.o 00:01:58.376 CC lib/vfio_user/host/vfio_user.o 00:01:58.376 LIB libspdk_dma.a 00:01:58.376 SO libspdk_dma.so.4.0 00:01:58.635 LIB libspdk_ioat.a 00:01:58.635 SO libspdk_ioat.so.7.0 00:01:58.635 SYMLINK libspdk_dma.so 00:01:58.635 SYMLINK libspdk_ioat.so 00:01:58.635 LIB libspdk_vfio_user.a 00:01:58.635 SO libspdk_vfio_user.so.5.0 00:01:58.635 SYMLINK libspdk_vfio_user.so 00:01:58.893 LIB libspdk_util.a 00:01:58.893 SO libspdk_util.so.9.1 00:01:59.151 SYMLINK libspdk_util.so 00:01:59.151 LIB libspdk_trace_parser.a 00:01:59.151 SO libspdk_trace_parser.so.5.0 00:01:59.409 SYMLINK libspdk_trace_parser.so 00:01:59.409 CC lib/idxd/idxd.o 00:01:59.409 CC lib/idxd/idxd_user.o 00:01:59.409 CC lib/json/json_parse.o 00:01:59.409 CC lib/idxd/idxd_kernel.o 00:01:59.409 CC lib/json/json_util.o 00:01:59.409 CC lib/json/json_write.o 00:01:59.409 CC lib/vmd/vmd.o 00:01:59.409 CC lib/vmd/led.o 00:01:59.409 CC lib/env_dpdk/env.o 00:01:59.409 CC lib/env_dpdk/memory.o 00:01:59.409 CC lib/env_dpdk/pci.o 00:01:59.409 CC lib/env_dpdk/init.o 00:01:59.409 CC lib/env_dpdk/threads.o 00:01:59.409 CC lib/env_dpdk/pci_ioat.o 00:01:59.409 CC lib/env_dpdk/pci_virtio.o 00:01:59.409 CC lib/env_dpdk/pci_vmd.o 00:01:59.409 CC lib/reduce/reduce.o 00:01:59.409 CC lib/conf/conf.o 00:01:59.409 CC lib/env_dpdk/pci_idxd.o 00:01:59.409 CC lib/rdma/common.o 00:01:59.409 CC lib/env_dpdk/pci_event.o 00:01:59.409 CC lib/rdma/rdma_verbs.o 00:01:59.409 CC lib/env_dpdk/sigbus_handler.o 00:01:59.409 CC lib/env_dpdk/pci_dpdk.o 00:01:59.409 CC lib/env_dpdk/pci_dpdk_2207.o 00:01:59.409 CC lib/env_dpdk/pci_dpdk_2211.o 00:01:59.668 LIB libspdk_conf.a 00:01:59.668 SO libspdk_conf.so.6.0 00:01:59.668 LIB libspdk_json.a 00:01:59.668 LIB libspdk_rdma.a 00:01:59.668 SYMLINK libspdk_conf.so 00:01:59.668 SO 
libspdk_json.so.6.0 00:01:59.668 SO libspdk_rdma.so.6.0 00:01:59.926 SYMLINK libspdk_json.so 00:01:59.926 SYMLINK libspdk_rdma.so 00:01:59.926 LIB libspdk_idxd.a 00:01:59.926 SO libspdk_idxd.so.12.0 00:01:59.926 LIB libspdk_vmd.a 00:01:59.926 LIB libspdk_reduce.a 00:02:00.183 SO libspdk_vmd.so.6.0 00:02:00.183 SYMLINK libspdk_idxd.so 00:02:00.183 SO libspdk_reduce.so.6.0 00:02:00.183 CC lib/jsonrpc/jsonrpc_server.o 00:02:00.183 CC lib/jsonrpc/jsonrpc_client.o 00:02:00.183 CC lib/jsonrpc/jsonrpc_server_tcp.o 00:02:00.183 CC lib/jsonrpc/jsonrpc_client_tcp.o 00:02:00.183 SYMLINK libspdk_vmd.so 00:02:00.183 SYMLINK libspdk_reduce.so 00:02:00.441 LIB libspdk_env_dpdk.a 00:02:00.441 LIB libspdk_jsonrpc.a 00:02:00.441 SO libspdk_jsonrpc.so.6.0 00:02:00.441 SO libspdk_env_dpdk.so.14.1 00:02:00.441 SYMLINK libspdk_jsonrpc.so 00:02:00.698 SYMLINK libspdk_env_dpdk.so 00:02:00.698 CC lib/rpc/rpc.o 00:02:00.955 LIB libspdk_rpc.a 00:02:01.213 SO libspdk_rpc.so.6.0 00:02:01.213 SYMLINK libspdk_rpc.so 00:02:01.471 CC lib/keyring/keyring.o 00:02:01.471 CC lib/keyring/keyring_rpc.o 00:02:01.471 CC lib/trace/trace.o 00:02:01.471 CC lib/trace/trace_flags.o 00:02:01.471 CC lib/trace/trace_rpc.o 00:02:01.471 CC lib/notify/notify.o 00:02:01.471 CC lib/notify/notify_rpc.o 00:02:01.729 LIB libspdk_notify.a 00:02:01.729 SO libspdk_notify.so.6.0 00:02:01.729 LIB libspdk_keyring.a 00:02:01.729 LIB libspdk_trace.a 00:02:01.729 SO libspdk_keyring.so.1.0 00:02:01.729 SYMLINK libspdk_notify.so 00:02:01.729 SO libspdk_trace.so.10.0 00:02:01.729 SYMLINK libspdk_keyring.so 00:02:01.987 SYMLINK libspdk_trace.so 00:02:02.246 CC lib/thread/thread.o 00:02:02.246 CC lib/thread/iobuf.o 00:02:02.246 CC lib/sock/sock.o 00:02:02.246 CC lib/sock/sock_rpc.o 00:02:02.504 LIB libspdk_sock.a 00:02:02.762 SO libspdk_sock.so.10.0 00:02:02.762 SYMLINK libspdk_sock.so 00:02:03.020 CC lib/nvme/nvme_ctrlr_cmd.o 00:02:03.020 CC lib/nvme/nvme_ctrlr.o 00:02:03.020 CC lib/nvme/nvme_fabric.o 00:02:03.020 CC 
lib/nvme/nvme_ns.o 00:02:03.020 CC lib/nvme/nvme_ns_cmd.o 00:02:03.020 CC lib/nvme/nvme_pcie_common.o 00:02:03.020 CC lib/nvme/nvme_pcie.o 00:02:03.020 CC lib/nvme/nvme.o 00:02:03.020 CC lib/nvme/nvme_qpair.o 00:02:03.020 CC lib/nvme/nvme_quirks.o 00:02:03.020 CC lib/nvme/nvme_transport.o 00:02:03.020 CC lib/nvme/nvme_discovery.o 00:02:03.021 CC lib/nvme/nvme_ctrlr_ocssd_cmd.o 00:02:03.021 CC lib/nvme/nvme_ns_ocssd_cmd.o 00:02:03.021 CC lib/nvme/nvme_tcp.o 00:02:03.021 CC lib/nvme/nvme_opal.o 00:02:03.021 CC lib/nvme/nvme_io_msg.o 00:02:03.021 CC lib/nvme/nvme_poll_group.o 00:02:03.021 CC lib/nvme/nvme_zns.o 00:02:03.021 CC lib/nvme/nvme_stubs.o 00:02:03.021 CC lib/nvme/nvme_auth.o 00:02:03.021 CC lib/nvme/nvme_cuse.o 00:02:03.021 CC lib/nvme/nvme_rdma.o 00:02:03.955 LIB libspdk_thread.a 00:02:03.955 SO libspdk_thread.so.10.1 00:02:03.955 SYMLINK libspdk_thread.so 00:02:04.212 CC lib/accel/accel.o 00:02:04.212 CC lib/accel/accel_rpc.o 00:02:04.212 CC lib/accel/accel_sw.o 00:02:04.212 CC lib/blob/blobstore.o 00:02:04.212 CC lib/blob/request.o 00:02:04.212 CC lib/blob/zeroes.o 00:02:04.212 CC lib/virtio/virtio.o 00:02:04.212 CC lib/init/json_config.o 00:02:04.212 CC lib/blob/blob_bs_dev.o 00:02:04.212 CC lib/virtio/virtio_vhost_user.o 00:02:04.212 CC lib/init/subsystem.o 00:02:04.212 CC lib/virtio/virtio_vfio_user.o 00:02:04.212 CC lib/init/subsystem_rpc.o 00:02:04.212 CC lib/virtio/virtio_pci.o 00:02:04.212 CC lib/init/rpc.o 00:02:04.470 LIB libspdk_init.a 00:02:04.470 SO libspdk_init.so.5.0 00:02:04.470 LIB libspdk_virtio.a 00:02:04.470 SO libspdk_virtio.so.7.0 00:02:04.470 SYMLINK libspdk_init.so 00:02:04.728 SYMLINK libspdk_virtio.so 00:02:04.985 CC lib/event/app.o 00:02:04.985 CC lib/event/reactor.o 00:02:04.985 CC lib/event/log_rpc.o 00:02:04.985 CC lib/event/app_rpc.o 00:02:04.985 CC lib/event/scheduler_static.o 00:02:05.243 LIB libspdk_accel.a 00:02:05.243 SO libspdk_accel.so.15.0 00:02:05.243 LIB libspdk_event.a 00:02:05.243 LIB libspdk_nvme.a 00:02:05.243 
SO libspdk_event.so.13.1 00:02:05.243 SYMLINK libspdk_accel.so 00:02:05.501 SO libspdk_nvme.so.13.0 00:02:05.501 SYMLINK libspdk_event.so 00:02:05.759 CC lib/bdev/bdev.o 00:02:05.759 CC lib/bdev/bdev_rpc.o 00:02:05.759 CC lib/bdev/bdev_zone.o 00:02:05.759 CC lib/bdev/scsi_nvme.o 00:02:05.759 CC lib/bdev/part.o 00:02:05.759 SYMLINK libspdk_nvme.so 00:02:07.664 LIB libspdk_blob.a 00:02:07.664 SO libspdk_blob.so.11.0 00:02:07.664 LIB libspdk_bdev.a 00:02:07.664 SYMLINK libspdk_blob.so 00:02:07.664 SO libspdk_bdev.so.15.0 00:02:07.664 SYMLINK libspdk_bdev.so 00:02:07.664 CC lib/lvol/lvol.o 00:02:07.664 CC lib/blobfs/blobfs.o 00:02:07.664 CC lib/blobfs/tree.o 00:02:07.923 CC lib/scsi/lun.o 00:02:07.923 CC lib/scsi/dev.o 00:02:07.923 CC lib/scsi/port.o 00:02:07.923 CC lib/scsi/scsi.o 00:02:07.923 CC lib/scsi/scsi_bdev.o 00:02:07.923 CC lib/scsi/scsi_pr.o 00:02:07.923 CC lib/scsi/scsi_rpc.o 00:02:07.923 CC lib/scsi/task.o 00:02:07.923 CC lib/nbd/nbd.o 00:02:07.923 CC lib/nbd/nbd_rpc.o 00:02:07.923 CC lib/nvmf/ctrlr.o 00:02:07.923 CC lib/nvmf/ctrlr_discovery.o 00:02:07.923 CC lib/nvmf/ctrlr_bdev.o 00:02:07.923 CC lib/nvmf/subsystem.o 00:02:07.923 CC lib/nvmf/nvmf.o 00:02:07.923 CC lib/ftl/ftl_core.o 00:02:07.923 CC lib/nvmf/nvmf_rpc.o 00:02:07.923 CC lib/ftl/ftl_init.o 00:02:07.923 CC lib/nvmf/transport.o 00:02:07.923 CC lib/ftl/ftl_layout.o 00:02:07.923 CC lib/nvmf/tcp.o 00:02:07.923 CC lib/ftl/ftl_debug.o 00:02:07.923 CC lib/nvmf/stubs.o 00:02:07.923 CC lib/ftl/ftl_io.o 00:02:07.923 CC lib/nvmf/mdns_server.o 00:02:07.923 CC lib/ftl/ftl_sb.o 00:02:07.923 CC lib/nvmf/rdma.o 00:02:07.923 CC lib/ftl/ftl_l2p.o 00:02:07.923 CC lib/nvmf/auth.o 00:02:07.923 CC lib/ublk/ublk.o 00:02:07.923 CC lib/ftl/ftl_l2p_flat.o 00:02:07.923 CC lib/ublk/ublk_rpc.o 00:02:07.923 CC lib/ftl/ftl_band.o 00:02:07.923 CC lib/ftl/ftl_nv_cache.o 00:02:07.923 CC lib/ftl/ftl_writer.o 00:02:07.923 CC lib/ftl/ftl_band_ops.o 00:02:07.923 CC lib/ftl/ftl_rq.o 00:02:07.923 CC lib/ftl/ftl_reloc.o 00:02:07.923 
CC lib/ftl/ftl_l2p_cache.o 00:02:07.923 CC lib/ftl/ftl_p2l.o 00:02:07.923 CC lib/ftl/mngt/ftl_mngt.o 00:02:07.923 CC lib/ftl/mngt/ftl_mngt_bdev.o 00:02:07.923 CC lib/ftl/mngt/ftl_mngt_shutdown.o 00:02:07.923 CC lib/ftl/mngt/ftl_mngt_startup.o 00:02:07.923 CC lib/ftl/mngt/ftl_mngt_md.o 00:02:07.923 CC lib/ftl/mngt/ftl_mngt_misc.o 00:02:07.923 CC lib/ftl/mngt/ftl_mngt_ioch.o 00:02:07.923 CC lib/ftl/mngt/ftl_mngt_l2p.o 00:02:07.923 CC lib/ftl/mngt/ftl_mngt_band.o 00:02:07.923 CC lib/ftl/mngt/ftl_mngt_p2l.o 00:02:07.923 CC lib/ftl/mngt/ftl_mngt_self_test.o 00:02:07.923 CC lib/ftl/mngt/ftl_mngt_recovery.o 00:02:07.923 CC lib/ftl/mngt/ftl_mngt_upgrade.o 00:02:07.923 CC lib/ftl/utils/ftl_conf.o 00:02:07.923 CC lib/ftl/utils/ftl_md.o 00:02:07.923 CC lib/ftl/utils/ftl_mempool.o 00:02:07.923 CC lib/ftl/utils/ftl_bitmap.o 00:02:07.923 CC lib/ftl/utils/ftl_property.o 00:02:07.923 CC lib/ftl/utils/ftl_layout_tracker_bdev.o 00:02:07.923 CC lib/ftl/upgrade/ftl_layout_upgrade.o 00:02:07.923 CC lib/ftl/upgrade/ftl_sb_upgrade.o 00:02:07.923 CC lib/ftl/upgrade/ftl_p2l_upgrade.o 00:02:07.923 CC lib/ftl/upgrade/ftl_band_upgrade.o 00:02:07.923 CC lib/ftl/upgrade/ftl_chunk_upgrade.o 00:02:07.923 CC lib/ftl/upgrade/ftl_trim_upgrade.o 00:02:07.923 CC lib/ftl/upgrade/ftl_sb_v3.o 00:02:07.923 CC lib/ftl/nvc/ftl_nvc_dev.o 00:02:07.923 CC lib/ftl/upgrade/ftl_sb_v5.o 00:02:07.923 CC lib/ftl/nvc/ftl_nvc_bdev_vss.o 00:02:07.923 CC lib/ftl/base/ftl_base_dev.o 00:02:07.923 CC lib/ftl/ftl_trace.o 00:02:07.923 CC lib/ftl/base/ftl_base_bdev.o 00:02:08.489 LIB libspdk_blobfs.a 00:02:08.489 SO libspdk_blobfs.so.10.0 00:02:08.747 SYMLINK libspdk_blobfs.so 00:02:08.747 LIB libspdk_nbd.a 00:02:08.747 SO libspdk_nbd.so.7.0 00:02:08.747 LIB libspdk_scsi.a 00:02:08.747 SO libspdk_scsi.so.9.0 00:02:08.747 SYMLINK libspdk_nbd.so 00:02:08.747 LIB libspdk_ublk.a 00:02:08.747 SYMLINK libspdk_scsi.so 00:02:09.006 SO libspdk_ublk.so.3.0 00:02:09.006 LIB libspdk_ftl.a 00:02:09.006 SYMLINK libspdk_ublk.so 00:02:09.006 
LIB libspdk_lvol.a 00:02:09.006 SO libspdk_lvol.so.10.0 00:02:09.006 SO libspdk_ftl.so.9.0 00:02:09.298 CC lib/iscsi/conn.o 00:02:09.298 CC lib/iscsi/init_grp.o 00:02:09.298 CC lib/iscsi/md5.o 00:02:09.298 CC lib/iscsi/iscsi.o 00:02:09.298 CC lib/iscsi/param.o 00:02:09.298 CC lib/iscsi/portal_grp.o 00:02:09.298 CC lib/iscsi/tgt_node.o 00:02:09.298 CC lib/iscsi/iscsi_subsystem.o 00:02:09.298 CC lib/iscsi/iscsi_rpc.o 00:02:09.298 CC lib/iscsi/task.o 00:02:09.298 CC lib/vhost/vhost.o 00:02:09.298 CC lib/vhost/vhost_rpc.o 00:02:09.298 CC lib/vhost/vhost_scsi.o 00:02:09.298 CC lib/vhost/vhost_blk.o 00:02:09.298 CC lib/vhost/rte_vhost_user.o 00:02:09.298 SYMLINK libspdk_lvol.so 00:02:09.607 SYMLINK libspdk_ftl.so 00:02:10.175 LIB libspdk_nvmf.a 00:02:10.175 SO libspdk_nvmf.so.19.0 00:02:10.433 LIB libspdk_vhost.a 00:02:10.433 SO libspdk_vhost.so.8.0 00:02:10.433 SYMLINK libspdk_nvmf.so 00:02:10.433 SYMLINK libspdk_vhost.so 00:02:10.691 LIB libspdk_iscsi.a 00:02:10.691 SO libspdk_iscsi.so.8.0 00:02:10.948 SYMLINK libspdk_iscsi.so 00:02:11.207 CC module/env_dpdk/env_dpdk_rpc.o 00:02:11.465 CC module/accel/error/accel_error.o 00:02:11.465 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev.o 00:02:11.465 CC module/accel/error/accel_error_rpc.o 00:02:11.465 CC module/accel/ioat/accel_ioat_rpc.o 00:02:11.465 CC module/accel/ioat/accel_ioat.o 00:02:11.465 CC module/accel/dpdk_compressdev/accel_dpdk_compressdev_rpc.o 00:02:11.465 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev.o 00:02:11.465 CC module/accel/iaa/accel_iaa.o 00:02:11.465 CC module/accel/dpdk_cryptodev/accel_dpdk_cryptodev_rpc.o 00:02:11.465 CC module/accel/dsa/accel_dsa.o 00:02:11.465 CC module/accel/dsa/accel_dsa_rpc.o 00:02:11.465 CC module/scheduler/dynamic/scheduler_dynamic.o 00:02:11.465 CC module/accel/iaa/accel_iaa_rpc.o 00:02:11.465 CC module/sock/posix/posix.o 00:02:11.465 CC module/scheduler/dpdk_governor/dpdk_governor.o 00:02:11.465 CC module/keyring/linux/keyring.o 00:02:11.465 CC 
module/keyring/linux/keyring_rpc.o 00:02:11.465 CC module/scheduler/gscheduler/gscheduler.o 00:02:11.465 CC module/blob/bdev/blob_bdev.o 00:02:11.465 CC module/keyring/file/keyring.o 00:02:11.465 CC module/keyring/file/keyring_rpc.o 00:02:11.465 LIB libspdk_env_dpdk_rpc.a 00:02:11.465 SO libspdk_env_dpdk_rpc.so.6.0 00:02:11.465 SYMLINK libspdk_env_dpdk_rpc.so 00:02:11.722 LIB libspdk_keyring_linux.a 00:02:11.722 LIB libspdk_scheduler_gscheduler.a 00:02:11.722 LIB libspdk_keyring_file.a 00:02:11.722 LIB libspdk_scheduler_dpdk_governor.a 00:02:11.722 LIB libspdk_accel_error.a 00:02:11.722 SO libspdk_keyring_linux.so.1.0 00:02:11.722 SO libspdk_scheduler_gscheduler.so.4.0 00:02:11.722 LIB libspdk_accel_ioat.a 00:02:11.722 LIB libspdk_accel_iaa.a 00:02:11.722 SO libspdk_keyring_file.so.1.0 00:02:11.722 SO libspdk_scheduler_dpdk_governor.so.4.0 00:02:11.722 LIB libspdk_scheduler_dynamic.a 00:02:11.722 SO libspdk_accel_error.so.2.0 00:02:11.722 SO libspdk_accel_iaa.so.3.0 00:02:11.722 SO libspdk_accel_ioat.so.6.0 00:02:11.722 SO libspdk_scheduler_dynamic.so.4.0 00:02:11.722 LIB libspdk_accel_dsa.a 00:02:11.722 SYMLINK libspdk_scheduler_gscheduler.so 00:02:11.722 SYMLINK libspdk_keyring_file.so 00:02:11.722 SYMLINK libspdk_keyring_linux.so 00:02:11.722 SYMLINK libspdk_scheduler_dpdk_governor.so 00:02:11.722 LIB libspdk_blob_bdev.a 00:02:11.722 SYMLINK libspdk_accel_error.so 00:02:11.722 SO libspdk_accel_dsa.so.5.0 00:02:11.722 SYMLINK libspdk_scheduler_dynamic.so 00:02:11.722 SYMLINK libspdk_accel_iaa.so 00:02:11.722 SYMLINK libspdk_accel_ioat.so 00:02:11.722 SO libspdk_blob_bdev.so.11.0 00:02:11.722 SYMLINK libspdk_accel_dsa.so 00:02:11.981 SYMLINK libspdk_blob_bdev.so 00:02:12.239 LIB libspdk_sock_posix.a 00:02:12.239 SO libspdk_sock_posix.so.6.0 00:02:12.239 LIB libspdk_accel_dpdk_compressdev.a 00:02:12.239 CC module/bdev/nvme/bdev_nvme_rpc.o 00:02:12.239 CC module/bdev/nvme/bdev_nvme.o 00:02:12.239 CC module/bdev/nvme/nvme_rpc.o 00:02:12.239 CC 
module/bdev/delay/vbdev_delay.o 00:02:12.239 CC module/bdev/nvme/vbdev_opal.o 00:02:12.239 CC module/bdev/nvme/bdev_mdns_client.o 00:02:12.239 CC module/bdev/nvme/vbdev_opal_rpc.o 00:02:12.239 CC module/bdev/nvme/bdev_nvme_cuse_rpc.o 00:02:12.239 CC module/bdev/delay/vbdev_delay_rpc.o 00:02:12.239 CC module/bdev/compress/vbdev_compress.o 00:02:12.239 CC module/bdev/compress/vbdev_compress_rpc.o 00:02:12.239 CC module/bdev/zone_block/vbdev_zone_block.o 00:02:12.239 CC module/bdev/error/vbdev_error_rpc.o 00:02:12.239 CC module/bdev/gpt/gpt.o 00:02:12.239 CC module/bdev/error/vbdev_error.o 00:02:12.239 CC module/bdev/split/vbdev_split.o 00:02:12.239 CC module/bdev/gpt/vbdev_gpt.o 00:02:12.239 CC module/bdev/zone_block/vbdev_zone_block_rpc.o 00:02:12.239 CC module/bdev/split/vbdev_split_rpc.o 00:02:12.239 CC module/blobfs/bdev/blobfs_bdev.o 00:02:12.239 CC module/blobfs/bdev/blobfs_bdev_rpc.o 00:02:12.239 CC module/bdev/passthru/vbdev_passthru.o 00:02:12.239 CC module/bdev/raid/bdev_raid.o 00:02:12.239 CC module/bdev/lvol/vbdev_lvol.o 00:02:12.239 CC module/bdev/passthru/vbdev_passthru_rpc.o 00:02:12.239 CC module/bdev/virtio/bdev_virtio_scsi.o 00:02:12.239 CC module/bdev/raid/bdev_raid_sb.o 00:02:12.239 CC module/bdev/raid/bdev_raid_rpc.o 00:02:12.239 CC module/bdev/virtio/bdev_virtio_blk.o 00:02:12.239 CC module/bdev/lvol/vbdev_lvol_rpc.o 00:02:12.239 SO libspdk_accel_dpdk_compressdev.so.3.0 00:02:12.239 CC module/bdev/raid/raid0.o 00:02:12.239 CC module/bdev/ftl/bdev_ftl.o 00:02:12.239 CC module/bdev/malloc/bdev_malloc.o 00:02:12.239 CC module/bdev/virtio/bdev_virtio_rpc.o 00:02:12.239 CC module/bdev/malloc/bdev_malloc_rpc.o 00:02:12.239 CC module/bdev/null/bdev_null_rpc.o 00:02:12.239 CC module/bdev/null/bdev_null.o 00:02:12.239 CC module/bdev/raid/concat.o 00:02:12.239 CC module/bdev/aio/bdev_aio.o 00:02:12.239 CC module/bdev/ftl/bdev_ftl_rpc.o 00:02:12.239 CC module/bdev/raid/raid1.o 00:02:12.239 CC module/bdev/aio/bdev_aio_rpc.o 00:02:12.239 CC 
module/bdev/iscsi/bdev_iscsi.o 00:02:12.239 CC module/bdev/iscsi/bdev_iscsi_rpc.o 00:02:12.239 CC module/bdev/crypto/vbdev_crypto.o 00:02:12.239 CC module/bdev/crypto/vbdev_crypto_rpc.o 00:02:12.497 SYMLINK libspdk_sock_posix.so 00:02:12.497 SYMLINK libspdk_accel_dpdk_compressdev.so 00:02:12.497 LIB libspdk_blobfs_bdev.a 00:02:12.756 SO libspdk_blobfs_bdev.so.6.0 00:02:12.756 LIB libspdk_bdev_split.a 00:02:12.756 LIB libspdk_bdev_null.a 00:02:12.756 SYMLINK libspdk_blobfs_bdev.so 00:02:12.756 LIB libspdk_bdev_gpt.a 00:02:12.756 LIB libspdk_bdev_passthru.a 00:02:12.756 LIB libspdk_bdev_ftl.a 00:02:12.756 SO libspdk_bdev_split.so.6.0 00:02:12.756 SO libspdk_bdev_null.so.6.0 00:02:12.756 LIB libspdk_bdev_error.a 00:02:12.756 SO libspdk_bdev_gpt.so.6.0 00:02:12.756 SO libspdk_bdev_ftl.so.6.0 00:02:12.756 SO libspdk_bdev_passthru.so.6.0 00:02:12.756 SO libspdk_bdev_error.so.6.0 00:02:12.756 LIB libspdk_bdev_delay.a 00:02:12.756 LIB libspdk_bdev_aio.a 00:02:12.756 SYMLINK libspdk_bdev_split.so 00:02:12.756 LIB libspdk_bdev_zone_block.a 00:02:12.756 SYMLINK libspdk_bdev_null.so 00:02:12.756 LIB libspdk_bdev_crypto.a 00:02:12.756 SO libspdk_bdev_delay.so.6.0 00:02:12.756 SYMLINK libspdk_bdev_gpt.so 00:02:12.756 SO libspdk_bdev_aio.so.6.0 00:02:12.756 SYMLINK libspdk_bdev_ftl.so 00:02:12.756 SYMLINK libspdk_bdev_passthru.so 00:02:12.756 LIB libspdk_bdev_iscsi.a 00:02:12.756 LIB libspdk_bdev_compress.a 00:02:12.756 SO libspdk_bdev_zone_block.so.6.0 00:02:12.756 SO libspdk_bdev_crypto.so.6.0 00:02:12.756 SYMLINK libspdk_bdev_error.so 00:02:12.756 LIB libspdk_bdev_malloc.a 00:02:12.756 SO libspdk_bdev_compress.so.6.0 00:02:12.756 SO libspdk_bdev_iscsi.so.6.0 00:02:12.756 SO libspdk_bdev_malloc.so.6.0 00:02:12.756 SYMLINK libspdk_bdev_delay.so 00:02:12.756 LIB libspdk_accel_dpdk_cryptodev.a 00:02:13.015 SYMLINK libspdk_bdev_aio.so 00:02:13.015 SO libspdk_accel_dpdk_cryptodev.so.3.0 00:02:13.015 SYMLINK libspdk_bdev_zone_block.so 00:02:13.015 SYMLINK libspdk_bdev_crypto.so 
00:02:13.015 SYMLINK libspdk_bdev_compress.so 00:02:13.015 SYMLINK libspdk_bdev_iscsi.so 00:02:13.015 SYMLINK libspdk_bdev_malloc.so 00:02:13.015 LIB libspdk_bdev_virtio.a 00:02:13.015 SYMLINK libspdk_accel_dpdk_cryptodev.so 00:02:13.015 LIB libspdk_bdev_lvol.a 00:02:13.015 SO libspdk_bdev_virtio.so.6.0 00:02:13.015 SO libspdk_bdev_lvol.so.6.0 00:02:13.015 SYMLINK libspdk_bdev_virtio.so 00:02:13.273 SYMLINK libspdk_bdev_lvol.so 00:02:13.532 LIB libspdk_bdev_raid.a 00:02:13.532 SO libspdk_bdev_raid.so.6.0 00:02:13.532 SYMLINK libspdk_bdev_raid.so 00:02:14.908 LIB libspdk_bdev_nvme.a 00:02:14.908 SO libspdk_bdev_nvme.so.7.0 00:02:14.908 SYMLINK libspdk_bdev_nvme.so 00:02:15.476 CC module/event/subsystems/keyring/keyring.o 00:02:15.476 CC module/event/subsystems/vmd/vmd.o 00:02:15.476 CC module/event/subsystems/vmd/vmd_rpc.o 00:02:15.476 CC module/event/subsystems/iobuf/iobuf.o 00:02:15.476 CC module/event/subsystems/iobuf/iobuf_rpc.o 00:02:15.476 CC module/event/subsystems/vhost_blk/vhost_blk.o 00:02:15.476 CC module/event/subsystems/scheduler/scheduler.o 00:02:15.476 CC module/event/subsystems/sock/sock.o 00:02:15.735 LIB libspdk_event_keyring.a 00:02:15.735 LIB libspdk_event_scheduler.a 00:02:15.735 LIB libspdk_event_vmd.a 00:02:15.735 LIB libspdk_event_vhost_blk.a 00:02:15.735 LIB libspdk_event_sock.a 00:02:15.735 SO libspdk_event_keyring.so.1.0 00:02:15.735 LIB libspdk_event_iobuf.a 00:02:15.735 SO libspdk_event_scheduler.so.4.0 00:02:15.735 SO libspdk_event_sock.so.5.0 00:02:15.735 SO libspdk_event_vmd.so.6.0 00:02:15.735 SO libspdk_event_vhost_blk.so.3.0 00:02:15.735 SO libspdk_event_iobuf.so.3.0 00:02:15.735 SYMLINK libspdk_event_keyring.so 00:02:15.735 SYMLINK libspdk_event_scheduler.so 00:02:15.735 SYMLINK libspdk_event_sock.so 00:02:15.735 SYMLINK libspdk_event_vhost_blk.so 00:02:15.735 SYMLINK libspdk_event_vmd.so 00:02:15.735 SYMLINK libspdk_event_iobuf.so 00:02:15.993 CC module/event/subsystems/accel/accel.o 00:02:16.251 LIB libspdk_event_accel.a 
00:02:16.251 SO libspdk_event_accel.so.6.0 00:02:16.251 SYMLINK libspdk_event_accel.so 00:02:16.819 CC module/event/subsystems/bdev/bdev.o 00:02:16.819 LIB libspdk_event_bdev.a 00:02:16.819 SO libspdk_event_bdev.so.6.0 00:02:16.819 SYMLINK libspdk_event_bdev.so 00:02:17.387 CC module/event/subsystems/ublk/ublk.o 00:02:17.387 CC module/event/subsystems/nbd/nbd.o 00:02:17.387 CC module/event/subsystems/scsi/scsi.o 00:02:17.387 CC module/event/subsystems/nvmf/nvmf_rpc.o 00:02:17.387 CC module/event/subsystems/nvmf/nvmf_tgt.o 00:02:17.387 LIB libspdk_event_nbd.a 00:02:17.387 SO libspdk_event_nbd.so.6.0 00:02:17.387 LIB libspdk_event_ublk.a 00:02:17.387 LIB libspdk_event_scsi.a 00:02:17.387 SO libspdk_event_ublk.so.3.0 00:02:17.387 SYMLINK libspdk_event_nbd.so 00:02:17.387 SO libspdk_event_scsi.so.6.0 00:02:17.387 LIB libspdk_event_nvmf.a 00:02:17.387 SYMLINK libspdk_event_ublk.so 00:02:17.646 SO libspdk_event_nvmf.so.6.0 00:02:17.646 SYMLINK libspdk_event_scsi.so 00:02:17.646 SYMLINK libspdk_event_nvmf.so 00:02:17.904 CC module/event/subsystems/vhost_scsi/vhost_scsi.o 00:02:17.904 CC module/event/subsystems/iscsi/iscsi.o 00:02:17.904 LIB libspdk_event_vhost_scsi.a 00:02:17.904 SO libspdk_event_vhost_scsi.so.3.0 00:02:18.163 LIB libspdk_event_iscsi.a 00:02:18.163 SO libspdk_event_iscsi.so.6.0 00:02:18.163 SYMLINK libspdk_event_vhost_scsi.so 00:02:18.163 SYMLINK libspdk_event_iscsi.so 00:02:18.422 SO libspdk.so.6.0 00:02:18.422 SYMLINK libspdk.so 00:02:18.680 CC app/trace_record/trace_record.o 00:02:18.680 CXX app/trace/trace.o 00:02:18.680 CC app/spdk_nvme_identify/identify.o 00:02:18.680 CC app/spdk_lspci/spdk_lspci.o 00:02:18.680 CC test/rpc_client/rpc_client_test.o 00:02:18.680 CC app/spdk_nvme_perf/perf.o 00:02:18.680 CC app/spdk_nvme_discover/discovery_aer.o 00:02:18.680 CC app/spdk_top/spdk_top.o 00:02:18.680 TEST_HEADER include/spdk/accel.h 00:02:18.680 TEST_HEADER include/spdk/accel_module.h 00:02:18.680 TEST_HEADER include/spdk/assert.h 00:02:18.680 TEST_HEADER 
include/spdk/base64.h 00:02:18.680 TEST_HEADER include/spdk/barrier.h 00:02:18.680 TEST_HEADER include/spdk/bdev.h 00:02:18.680 TEST_HEADER include/spdk/bdev_module.h 00:02:18.680 TEST_HEADER include/spdk/bdev_zone.h 00:02:18.680 TEST_HEADER include/spdk/bit_array.h 00:02:18.680 TEST_HEADER include/spdk/bit_pool.h 00:02:18.680 TEST_HEADER include/spdk/blob_bdev.h 00:02:18.680 TEST_HEADER include/spdk/blobfs_bdev.h 00:02:18.680 TEST_HEADER include/spdk/conf.h 00:02:18.680 TEST_HEADER include/spdk/blobfs.h 00:02:18.680 TEST_HEADER include/spdk/blob.h 00:02:18.680 TEST_HEADER include/spdk/cpuset.h 00:02:18.680 TEST_HEADER include/spdk/config.h 00:02:18.680 TEST_HEADER include/spdk/crc16.h 00:02:18.680 TEST_HEADER include/spdk/crc32.h 00:02:18.680 TEST_HEADER include/spdk/crc64.h 00:02:18.680 TEST_HEADER include/spdk/dif.h 00:02:18.680 CC examples/interrupt_tgt/interrupt_tgt.o 00:02:18.680 TEST_HEADER include/spdk/dma.h 00:02:18.680 TEST_HEADER include/spdk/endian.h 00:02:18.680 TEST_HEADER include/spdk/env_dpdk.h 00:02:18.680 TEST_HEADER include/spdk/env.h 00:02:18.680 TEST_HEADER include/spdk/event.h 00:02:18.680 TEST_HEADER include/spdk/fd.h 00:02:18.680 TEST_HEADER include/spdk/fd_group.h 00:02:18.680 TEST_HEADER include/spdk/ftl.h 00:02:18.680 TEST_HEADER include/spdk/file.h 00:02:18.680 TEST_HEADER include/spdk/gpt_spec.h 00:02:18.680 TEST_HEADER include/spdk/histogram_data.h 00:02:18.680 TEST_HEADER include/spdk/hexlify.h 00:02:18.680 TEST_HEADER include/spdk/idxd_spec.h 00:02:18.680 TEST_HEADER include/spdk/idxd.h 00:02:18.680 TEST_HEADER include/spdk/init.h 00:02:18.680 TEST_HEADER include/spdk/ioat.h 00:02:18.680 TEST_HEADER include/spdk/ioat_spec.h 00:02:18.680 TEST_HEADER include/spdk/iscsi_spec.h 00:02:18.680 TEST_HEADER include/spdk/json.h 00:02:18.680 CC app/nvmf_tgt/nvmf_main.o 00:02:18.680 CC app/spdk_dd/spdk_dd.o 00:02:18.680 TEST_HEADER include/spdk/jsonrpc.h 00:02:18.680 TEST_HEADER include/spdk/keyring.h 00:02:18.680 TEST_HEADER 
include/spdk/keyring_module.h 00:02:18.680 TEST_HEADER include/spdk/likely.h 00:02:18.680 TEST_HEADER include/spdk/log.h 00:02:18.680 TEST_HEADER include/spdk/lvol.h 00:02:18.680 TEST_HEADER include/spdk/mmio.h 00:02:18.680 TEST_HEADER include/spdk/memory.h 00:02:18.680 TEST_HEADER include/spdk/nbd.h 00:02:18.680 TEST_HEADER include/spdk/notify.h 00:02:18.680 TEST_HEADER include/spdk/nvme.h 00:02:18.680 TEST_HEADER include/spdk/nvme_ocssd.h 00:02:18.680 TEST_HEADER include/spdk/nvme_ocssd_spec.h 00:02:18.680 TEST_HEADER include/spdk/nvme_intel.h 00:02:18.680 TEST_HEADER include/spdk/nvme_spec.h 00:02:18.680 TEST_HEADER include/spdk/nvme_zns.h 00:02:18.680 TEST_HEADER include/spdk/nvmf_cmd.h 00:02:18.680 CC app/iscsi_tgt/iscsi_tgt.o 00:02:18.680 TEST_HEADER include/spdk/nvmf_fc_spec.h 00:02:18.680 TEST_HEADER include/spdk/nvmf.h 00:02:18.680 TEST_HEADER include/spdk/nvmf_spec.h 00:02:18.680 TEST_HEADER include/spdk/nvmf_transport.h 00:02:18.680 TEST_HEADER include/spdk/opal_spec.h 00:02:18.680 TEST_HEADER include/spdk/opal.h 00:02:18.680 TEST_HEADER include/spdk/pci_ids.h 00:02:18.680 CC app/vhost/vhost.o 00:02:18.680 TEST_HEADER include/spdk/pipe.h 00:02:18.680 TEST_HEADER include/spdk/queue.h 00:02:18.680 TEST_HEADER include/spdk/rpc.h 00:02:18.680 TEST_HEADER include/spdk/reduce.h 00:02:18.680 TEST_HEADER include/spdk/scheduler.h 00:02:18.680 TEST_HEADER include/spdk/sock.h 00:02:18.680 TEST_HEADER include/spdk/scsi.h 00:02:18.680 TEST_HEADER include/spdk/scsi_spec.h 00:02:18.680 TEST_HEADER include/spdk/stdinc.h 00:02:18.680 TEST_HEADER include/spdk/thread.h 00:02:18.680 TEST_HEADER include/spdk/string.h 00:02:18.680 TEST_HEADER include/spdk/trace.h 00:02:18.680 TEST_HEADER include/spdk/trace_parser.h 00:02:18.680 TEST_HEADER include/spdk/ublk.h 00:02:18.680 TEST_HEADER include/spdk/tree.h 00:02:18.680 TEST_HEADER include/spdk/util.h 00:02:18.680 TEST_HEADER include/spdk/uuid.h 00:02:18.680 TEST_HEADER include/spdk/version.h 00:02:18.680 TEST_HEADER 
include/spdk/vfio_user_pci.h 00:02:18.680 TEST_HEADER include/spdk/vfio_user_spec.h 00:02:18.680 CC app/spdk_tgt/spdk_tgt.o 00:02:18.680 TEST_HEADER include/spdk/vmd.h 00:02:18.680 TEST_HEADER include/spdk/vhost.h 00:02:18.680 TEST_HEADER include/spdk/xor.h 00:02:18.680 TEST_HEADER include/spdk/zipf.h 00:02:18.680 CXX test/cpp_headers/accel.o 00:02:18.943 CXX test/cpp_headers/accel_module.o 00:02:18.943 CXX test/cpp_headers/barrier.o 00:02:18.943 CXX test/cpp_headers/assert.o 00:02:18.943 CXX test/cpp_headers/base64.o 00:02:18.943 CXX test/cpp_headers/bdev.o 00:02:18.943 CXX test/cpp_headers/bdev_zone.o 00:02:18.943 CXX test/cpp_headers/bdev_module.o 00:02:18.943 CXX test/cpp_headers/bit_array.o 00:02:18.943 CXX test/cpp_headers/bit_pool.o 00:02:18.943 CXX test/cpp_headers/blob_bdev.o 00:02:18.943 CXX test/cpp_headers/blobfs_bdev.o 00:02:18.943 CXX test/cpp_headers/blobfs.o 00:02:18.943 CXX test/cpp_headers/blob.o 00:02:18.943 CXX test/cpp_headers/conf.o 00:02:18.943 CXX test/cpp_headers/config.o 00:02:18.943 CXX test/cpp_headers/crc16.o 00:02:18.943 CXX test/cpp_headers/cpuset.o 00:02:18.943 CXX test/cpp_headers/crc32.o 00:02:18.943 CXX test/cpp_headers/crc64.o 00:02:18.943 CXX test/cpp_headers/dif.o 00:02:18.943 CC examples/nvme/reconnect/reconnect.o 00:02:18.943 CC examples/nvme/abort/abort.o 00:02:18.943 CC examples/nvme/hello_world/hello_world.o 00:02:18.943 CC examples/nvme/cmb_copy/cmb_copy.o 00:02:18.943 CC examples/nvme/pmr_persistence/pmr_persistence.o 00:02:18.943 CC test/nvme/e2edp/nvme_dp.o 00:02:18.943 CC test/nvme/aer/aer.o 00:02:18.943 CC examples/nvme/arbitration/arbitration.o 00:02:18.943 CC examples/nvme/hotplug/hotplug.o 00:02:18.943 CC examples/nvme/nvme_manage/nvme_manage.o 00:02:18.943 CC test/nvme/overhead/overhead.o 00:02:18.943 CC test/nvme/simple_copy/simple_copy.o 00:02:18.943 CC examples/util/zipf/zipf.o 00:02:18.943 CC test/app/histogram_perf/histogram_perf.o 00:02:18.943 CC examples/sock/hello_world/hello_sock.o 00:02:18.943 CC 
test/event/reactor/reactor.o 00:02:18.943 CC examples/ioat/perf/perf.o 00:02:18.943 CC examples/vmd/led/led.o 00:02:18.943 CC examples/ioat/verify/verify.o 00:02:18.943 CC examples/accel/perf/accel_perf.o 00:02:18.943 CC test/nvme/reset/reset.o 00:02:18.943 CC test/nvme/reserve/reserve.o 00:02:18.943 CC examples/idxd/perf/perf.o 00:02:18.943 CXX test/cpp_headers/dma.o 00:02:18.943 CC examples/vmd/lsvmd/lsvmd.o 00:02:18.943 CC test/nvme/sgl/sgl.o 00:02:18.943 CC test/env/vtophys/vtophys.o 00:02:18.943 CC test/nvme/boot_partition/boot_partition.o 00:02:18.943 CC test/nvme/doorbell_aers/doorbell_aers.o 00:02:18.943 CC test/event/event_perf/event_perf.o 00:02:18.943 CC test/nvme/fused_ordering/fused_ordering.o 00:02:18.943 CC test/nvme/startup/startup.o 00:02:18.943 CC test/app/stub/stub.o 00:02:18.943 CC test/nvme/cuse/cuse.o 00:02:18.943 CC test/event/reactor_perf/reactor_perf.o 00:02:18.943 CC test/event/app_repeat/app_repeat.o 00:02:18.943 CC test/app/jsoncat/jsoncat.o 00:02:18.943 CC test/thread/poller_perf/poller_perf.o 00:02:18.943 CC test/env/memory/memory_ut.o 00:02:18.943 CC app/fio/nvme/fio_plugin.o 00:02:18.943 CC test/nvme/err_injection/err_injection.o 00:02:18.943 CC test/env/env_dpdk_post_init/env_dpdk_post_init.o 00:02:18.943 CC test/nvme/compliance/nvme_compliance.o 00:02:18.943 CC test/nvme/fdp/fdp.o 00:02:18.943 CC test/env/pci/pci_ut.o 00:02:18.943 CC test/dma/test_dma/test_dma.o 00:02:18.943 CC examples/blob/hello_world/hello_blob.o 00:02:18.943 CC test/nvme/connect_stress/connect_stress.o 00:02:18.943 CC examples/blob/cli/blobcli.o 00:02:18.943 CC test/bdev/bdevio/bdevio.o 00:02:18.943 CC test/app/bdev_svc/bdev_svc.o 00:02:18.943 CC examples/bdev/bdevperf/bdevperf.o 00:02:18.943 CC examples/nvmf/nvmf/nvmf.o 00:02:18.943 CC test/blobfs/mkfs/mkfs.o 00:02:18.943 CC test/event/scheduler/scheduler.o 00:02:18.943 CC examples/bdev/hello_world/hello_bdev.o 00:02:18.943 CC test/accel/dif/dif.o 00:02:18.943 CC examples/thread/thread/thread_ex.o 00:02:18.943 
CC app/fio/bdev/fio_plugin.o 00:02:19.202 LINK spdk_lspci 00:02:19.203 LINK rpc_client_test 00:02:19.203 LINK interrupt_tgt 00:02:19.203 LINK spdk_nvme_discover 00:02:19.203 CC test/lvol/esnap/esnap.o 00:02:19.203 CC test/env/mem_callbacks/mem_callbacks.o 00:02:19.203 LINK spdk_trace_record 00:02:19.203 CC test/app/fuzz/nvme_fuzz/nvme_fuzz.o 00:02:19.203 LINK vhost 00:02:19.462 LINK lsvmd 00:02:19.462 LINK histogram_perf 00:02:19.462 LINK zipf 00:02:19.462 LINK nvmf_tgt 00:02:19.462 LINK iscsi_tgt 00:02:19.462 LINK vtophys 00:02:19.462 LINK spdk_tgt 00:02:19.462 LINK jsoncat 00:02:19.462 LINK led 00:02:19.462 LINK reactor_perf 00:02:19.462 LINK reactor 00:02:19.462 LINK pmr_persistence 00:02:19.462 LINK app_repeat 00:02:19.462 LINK stub 00:02:19.462 CXX test/cpp_headers/endian.o 00:02:19.462 LINK doorbell_aers 00:02:19.462 CXX test/cpp_headers/env_dpdk.o 00:02:19.462 LINK bdev_svc 00:02:19.462 CXX test/cpp_headers/env.o 00:02:19.462 LINK verify 00:02:19.462 CXX test/cpp_headers/event.o 00:02:19.462 LINK hello_world 00:02:19.462 CXX test/cpp_headers/fd_group.o 00:02:19.462 LINK mkfs 00:02:19.462 LINK event_perf 00:02:19.462 LINK ioat_perf 00:02:19.462 LINK simple_copy 00:02:19.462 LINK poller_perf 00:02:19.462 LINK cmb_copy 00:02:19.462 LINK env_dpdk_post_init 00:02:19.462 CXX test/cpp_headers/fd.o 00:02:19.462 LINK boot_partition 00:02:19.726 LINK startup 00:02:19.726 CXX test/cpp_headers/file.o 00:02:19.726 CXX test/cpp_headers/ftl.o 00:02:19.726 CXX test/cpp_headers/gpt_spec.o 00:02:19.726 LINK fused_ordering 00:02:19.726 LINK err_injection 00:02:19.726 CXX test/cpp_headers/hexlify.o 00:02:19.726 LINK hello_sock 00:02:19.726 CXX test/cpp_headers/histogram_data.o 00:02:19.726 LINK reset 00:02:19.726 LINK aer 00:02:19.726 LINK reserve 00:02:19.726 LINK connect_stress 00:02:19.726 LINK hotplug 00:02:19.726 CC test/app/fuzz/iscsi_fuzz/iscsi_fuzz.o 00:02:19.726 CXX test/cpp_headers/idxd.o 00:02:19.726 LINK overhead 00:02:19.726 CXX test/cpp_headers/idxd_spec.o 
00:02:19.726 CXX test/cpp_headers/init.o 00:02:19.726 CXX test/cpp_headers/ioat.o 00:02:19.726 CXX test/cpp_headers/ioat_spec.o 00:02:19.726 LINK reconnect 00:02:19.726 CXX test/cpp_headers/iscsi_spec.o 00:02:19.726 CC test/app/fuzz/vhost_fuzz/vhost_fuzz_rpc.o 00:02:19.726 CXX test/cpp_headers/json.o 00:02:19.726 CXX test/cpp_headers/jsonrpc.o 00:02:19.726 LINK sgl 00:02:19.726 CC test/app/fuzz/vhost_fuzz/vhost_fuzz.o 00:02:19.726 CXX test/cpp_headers/keyring.o 00:02:19.726 CXX test/cpp_headers/keyring_module.o 00:02:19.726 LINK hello_blob 00:02:19.726 LINK test_dma 00:02:19.726 LINK nvme_dp 00:02:19.726 LINK hello_bdev 00:02:19.726 LINK thread 00:02:19.726 LINK scheduler 00:02:19.726 LINK nvmf 00:02:19.726 CXX test/cpp_headers/likely.o 00:02:19.726 LINK spdk_dd 00:02:19.988 LINK spdk_trace 00:02:19.988 LINK nvme_compliance 00:02:19.988 LINK arbitration 00:02:19.988 LINK fdp 00:02:19.988 CXX test/cpp_headers/log.o 00:02:19.988 CXX test/cpp_headers/lvol.o 00:02:19.988 LINK idxd_perf 00:02:19.988 CXX test/cpp_headers/memory.o 00:02:19.988 CXX test/cpp_headers/mmio.o 00:02:19.988 LINK abort 00:02:19.988 CXX test/cpp_headers/nbd.o 00:02:19.988 CXX test/cpp_headers/notify.o 00:02:19.988 CXX test/cpp_headers/nvme.o 00:02:19.988 CXX test/cpp_headers/nvme_intel.o 00:02:19.988 CXX test/cpp_headers/nvme_ocssd.o 00:02:19.988 CXX test/cpp_headers/nvme_ocssd_spec.o 00:02:19.988 CXX test/cpp_headers/nvme_zns.o 00:02:19.988 CXX test/cpp_headers/nvme_spec.o 00:02:19.988 CXX test/cpp_headers/nvmf_cmd.o 00:02:19.988 CXX test/cpp_headers/nvmf_fc_spec.o 00:02:19.988 CXX test/cpp_headers/nvmf.o 00:02:19.988 CXX test/cpp_headers/nvmf_spec.o 00:02:19.988 CXX test/cpp_headers/nvmf_transport.o 00:02:19.988 LINK pci_ut 00:02:19.988 CXX test/cpp_headers/opal.o 00:02:19.988 CXX test/cpp_headers/opal_spec.o 00:02:19.988 CXX test/cpp_headers/pci_ids.o 00:02:19.988 CXX test/cpp_headers/pipe.o 00:02:19.988 LINK bdevio 00:02:19.988 CXX test/cpp_headers/queue.o 00:02:19.988 CXX 
test/cpp_headers/reduce.o 00:02:19.988 CXX test/cpp_headers/rpc.o 00:02:20.247 CXX test/cpp_headers/scheduler.o 00:02:20.247 CXX test/cpp_headers/scsi.o 00:02:20.247 CXX test/cpp_headers/scsi_spec.o 00:02:20.247 LINK nvme_manage 00:02:20.247 CXX test/cpp_headers/sock.o 00:02:20.247 CXX test/cpp_headers/string.o 00:02:20.247 CXX test/cpp_headers/stdinc.o 00:02:20.247 CXX test/cpp_headers/trace_parser.o 00:02:20.247 CXX test/cpp_headers/trace.o 00:02:20.247 CXX test/cpp_headers/thread.o 00:02:20.247 CXX test/cpp_headers/tree.o 00:02:20.247 LINK dif 00:02:20.247 CXX test/cpp_headers/ublk.o 00:02:20.247 CXX test/cpp_headers/util.o 00:02:20.247 CXX test/cpp_headers/uuid.o 00:02:20.247 CXX test/cpp_headers/version.o 00:02:20.247 CXX test/cpp_headers/vfio_user_pci.o 00:02:20.247 CXX test/cpp_headers/vfio_user_spec.o 00:02:20.247 CXX test/cpp_headers/vhost.o 00:02:20.247 CXX test/cpp_headers/vmd.o 00:02:20.247 CXX test/cpp_headers/xor.o 00:02:20.247 CXX test/cpp_headers/zipf.o 00:02:20.247 LINK accel_perf 00:02:20.247 LINK blobcli 00:02:20.247 LINK nvme_fuzz 00:02:20.247 LINK spdk_bdev 00:02:20.247 LINK spdk_nvme 00:02:20.505 LINK vhost_fuzz 00:02:20.505 LINK spdk_nvme_perf 00:02:20.505 LINK mem_callbacks 00:02:20.505 LINK spdk_nvme_identify 00:02:20.505 LINK bdevperf 00:02:20.763 LINK spdk_top 00:02:21.022 LINK memory_ut 00:02:21.280 LINK cuse 00:02:21.538 LINK iscsi_fuzz 00:02:25.737 LINK esnap 00:02:25.737 00:02:25.737 real 1m24.992s 00:02:25.737 user 18m3.836s 00:02:25.737 sys 4m50.151s 00:02:25.737 15:41:30 make -- common/autotest_common.sh@1125 -- $ xtrace_disable 00:02:25.737 15:41:30 make -- common/autotest_common.sh@10 -- $ set +x 00:02:25.737 ************************************ 00:02:25.737 END TEST make 00:02:25.737 ************************************ 00:02:25.737 15:41:30 -- spdk/autobuild.sh@1 -- $ stop_monitor_resources 00:02:25.737 15:41:30 -- pm/common@29 -- $ signal_monitor_resources TERM 00:02:25.737 15:41:30 -- pm/common@40 -- $ local monitor pid pids 
signal=TERM 00:02:25.737 15:41:30 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:25.737 15:41:30 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:02:25.737 15:41:30 -- pm/common@44 -- $ pid=2458134 00:02:25.737 15:41:30 -- pm/common@50 -- $ kill -TERM 2458134 00:02:25.737 15:41:30 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:25.737 15:41:30 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:02:25.737 15:41:30 -- pm/common@44 -- $ pid=2458135 00:02:25.737 15:41:30 -- pm/common@50 -- $ kill -TERM 2458135 00:02:25.737 15:41:30 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:25.737 15:41:30 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:02:25.737 15:41:30 -- pm/common@44 -- $ pid=2458137 00:02:25.737 15:41:30 -- pm/common@50 -- $ kill -TERM 2458137 00:02:25.737 15:41:30 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:02:25.737 15:41:30 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:02:25.737 15:41:30 -- pm/common@44 -- $ pid=2458163 00:02:25.737 15:41:30 -- pm/common@50 -- $ sudo -E kill -TERM 2458163 00:02:25.737 15:41:31 -- spdk/autotest.sh@25 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:02:25.737 15:41:31 -- nvmf/common.sh@7 -- # uname -s 00:02:25.737 15:41:31 -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:02:25.737 15:41:31 -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:02:25.737 15:41:31 -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:02:25.737 15:41:31 -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:02:25.737 15:41:31 -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:02:25.737 15:41:31 -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:02:25.737 15:41:31 -- 
nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:02:25.737 15:41:31 -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:02:25.737 15:41:31 -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:02:25.737 15:41:31 -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:02:25.737 15:41:31 -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:02:25.737 15:41:31 -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:02:25.737 15:41:31 -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:02:25.737 15:41:31 -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:02:25.737 15:41:31 -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:02:25.737 15:41:31 -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:02:25.737 15:41:31 -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:02:25.737 15:41:31 -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:02:25.737 15:41:31 -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:02:25.737 15:41:31 -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:02:25.737 15:41:31 -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:25.737 15:41:31 -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:25.737 15:41:31 -- paths/export.sh@4 -- # 
PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:25.737 15:41:31 -- paths/export.sh@5 -- # export PATH 00:02:25.737 15:41:31 -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:02:25.737 15:41:31 -- nvmf/common.sh@47 -- # : 0 00:02:25.737 15:41:31 -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:02:25.737 15:41:31 -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:02:25.737 15:41:31 -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:02:25.737 15:41:31 -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:02:25.737 15:41:31 -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:02:25.737 15:41:31 -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:02:25.737 15:41:31 -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:02:25.737 15:41:31 -- nvmf/common.sh@51 -- # have_pci_nics=0 00:02:25.737 15:41:31 -- spdk/autotest.sh@27 -- # '[' 0 -ne 0 ']' 00:02:25.737 15:41:31 -- spdk/autotest.sh@32 -- # uname -s 00:02:25.737 15:41:31 -- spdk/autotest.sh@32 -- # '[' Linux = Linux ']' 00:02:25.737 15:41:31 -- spdk/autotest.sh@33 -- # old_core_pattern='|/usr/lib/systemd/systemd-coredump %P %u %g %s %t %c %h' 00:02:25.737 15:41:31 -- spdk/autotest.sh@34 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 00:02:25.737 15:41:31 -- spdk/autotest.sh@39 -- # echo '|/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/core-collector.sh %P %s %t' 00:02:25.737 15:41:31 -- spdk/autotest.sh@40 -- # echo /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/coredumps 
00:02:25.737 15:41:31 -- spdk/autotest.sh@44 -- # modprobe nbd 00:02:25.737 15:41:31 -- spdk/autotest.sh@46 -- # type -P udevadm 00:02:25.737 15:41:31 -- spdk/autotest.sh@46 -- # udevadm=/usr/sbin/udevadm 00:02:25.737 15:41:31 -- spdk/autotest.sh@48 -- # udevadm_pid=2527227 00:02:25.737 15:41:31 -- spdk/autotest.sh@53 -- # start_monitor_resources 00:02:25.737 15:41:31 -- spdk/autotest.sh@47 -- # /usr/sbin/udevadm monitor --property 00:02:25.737 15:41:31 -- pm/common@17 -- # local monitor 00:02:25.737 15:41:31 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:25.737 15:41:31 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:25.737 15:41:31 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:25.737 15:41:31 -- pm/common@21 -- # date +%s 00:02:25.737 15:41:31 -- pm/common@19 -- # for monitor in "${MONITOR_RESOURCES[@]}" 00:02:25.737 15:41:31 -- pm/common@21 -- # date +%s 00:02:25.737 15:41:31 -- pm/common@25 -- # sleep 1 00:02:25.737 15:41:31 -- pm/common@21 -- # date +%s 00:02:25.737 15:41:31 -- pm/common@21 -- # date +%s 00:02:25.737 15:41:31 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1718026891 00:02:25.737 15:41:31 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1718026891 00:02:25.737 15:41:31 -- pm/common@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autotest.sh.1718026891 00:02:25.737 15:41:31 -- pm/common@21 -- # sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p 
monitor.autotest.sh.1718026891 00:02:25.737 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1718026891_collect-cpu-temp.pm.log 00:02:25.737 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1718026891_collect-vmstat.pm.log 00:02:25.737 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1718026891_collect-cpu-load.pm.log 00:02:25.737 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autotest.sh.1718026891_collect-bmc-pm.bmc.pm.log 00:02:26.733 15:41:32 -- spdk/autotest.sh@55 -- # trap 'autotest_cleanup || :; exit 1' SIGINT SIGTERM EXIT 00:02:26.733 15:41:32 -- spdk/autotest.sh@57 -- # timing_enter autotest 00:02:26.733 15:41:32 -- common/autotest_common.sh@723 -- # xtrace_disable 00:02:26.733 15:41:32 -- common/autotest_common.sh@10 -- # set +x 00:02:26.733 15:41:32 -- spdk/autotest.sh@59 -- # create_test_list 00:02:26.733 15:41:32 -- common/autotest_common.sh@747 -- # xtrace_disable 00:02:26.733 15:41:32 -- common/autotest_common.sh@10 -- # set +x 00:02:26.733 15:41:32 -- spdk/autotest.sh@61 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/autotest.sh 00:02:26.733 15:41:32 -- spdk/autotest.sh@61 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:26.733 15:41:32 -- spdk/autotest.sh@61 -- # src=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:26.733 15:41:32 -- spdk/autotest.sh@62 -- # out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:02:26.733 15:41:32 -- spdk/autotest.sh@63 -- # cd /var/jenkins/workspace/crypto-phy-autotest/spdk 00:02:26.733 15:41:32 -- spdk/autotest.sh@65 -- # freebsd_update_contigmem_mod 00:02:26.733 15:41:32 -- common/autotest_common.sh@1454 -- # uname 00:02:26.733 15:41:32 -- common/autotest_common.sh@1454 -- # '[' Linux = FreeBSD ']' 00:02:26.733 15:41:32 -- spdk/autotest.sh@66 -- # freebsd_set_maxsock_buf 
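Each of the `collect-cpu-load` / `collect-vmstat` / `collect-cpu-temp` monitors launched above appends periodic samples to a prefixed `.pm.log` under the output directory, printing a "Redirecting to …" line first. A rough, self-contained sketch of that pattern (the sampling command is a cheap stand-in, not the real SPDK perf script):

```shell
# Periodically append resource samples to <dir>/<prefix>_collect-vmstat.pm.log,
# mirroring the -d/-l/-p convention seen in the trace. /proc/loadavg stands in
# for a real vmstat sample so the sketch needs no extra tooling.
monitor_resource() {
    local dir="$1" prefix="$2" samples="${3:-3}" interval="${4:-1}"
    local log="$dir/${prefix}_collect-vmstat.pm.log"
    mkdir -p "$dir"
    echo "Redirecting to $log"
    for _ in $(seq "$samples"); do
        cat /proc/loadavg >> "$log"   # one sample per line
        sleep "$interval"
    done
}
```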
00:02:26.733 15:41:32 -- common/autotest_common.sh@1474 -- # uname 00:02:26.733 15:41:32 -- common/autotest_common.sh@1474 -- # [[ Linux = FreeBSD ]] 00:02:26.733 15:41:32 -- spdk/autotest.sh@71 -- # grep CC_TYPE mk/cc.mk 00:02:26.733 15:41:32 -- spdk/autotest.sh@71 -- # CC_TYPE=CC_TYPE=gcc 00:02:26.733 15:41:32 -- spdk/autotest.sh@72 -- # hash lcov 00:02:26.733 15:41:32 -- spdk/autotest.sh@72 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:02:26.733 15:41:32 -- spdk/autotest.sh@80 -- # export 'LCOV_OPTS= 00:02:26.733 --rc lcov_branch_coverage=1 00:02:26.733 --rc lcov_function_coverage=1 00:02:26.733 --rc genhtml_branch_coverage=1 00:02:26.733 --rc genhtml_function_coverage=1 00:02:26.733 --rc genhtml_legend=1 00:02:26.733 --rc geninfo_all_blocks=1 00:02:26.733 ' 00:02:26.733 15:41:32 -- spdk/autotest.sh@80 -- # LCOV_OPTS=' 00:02:26.733 --rc lcov_branch_coverage=1 00:02:26.733 --rc lcov_function_coverage=1 00:02:26.733 --rc genhtml_branch_coverage=1 00:02:26.733 --rc genhtml_function_coverage=1 00:02:26.733 --rc genhtml_legend=1 00:02:26.733 --rc geninfo_all_blocks=1 00:02:26.733 ' 00:02:26.733 15:41:32 -- spdk/autotest.sh@81 -- # export 'LCOV=lcov 00:02:26.733 --rc lcov_branch_coverage=1 00:02:26.733 --rc lcov_function_coverage=1 00:02:26.733 --rc genhtml_branch_coverage=1 00:02:26.733 --rc genhtml_function_coverage=1 00:02:26.733 --rc genhtml_legend=1 00:02:26.733 --rc geninfo_all_blocks=1 00:02:26.733 --no-external' 00:02:26.733 15:41:32 -- spdk/autotest.sh@81 -- # LCOV='lcov 00:02:26.733 --rc lcov_branch_coverage=1 00:02:26.733 --rc lcov_function_coverage=1 00:02:26.733 --rc genhtml_branch_coverage=1 00:02:26.733 --rc genhtml_function_coverage=1 00:02:26.733 --rc genhtml_legend=1 00:02:26.733 --rc geninfo_all_blocks=1 00:02:26.733 --no-external' 00:02:26.733 15:41:32 -- spdk/autotest.sh@83 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 
--no-external -v 00:02:26.733 lcov: LCOV version 1.14 00:02:26.733 15:41:32 -- spdk/autotest.sh@85 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -i -t Baseline -d /var/jenkins/workspace/crypto-phy-autotest/spdk -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info 00:02:41.618 /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno:no functions found 00:02:41.618 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/nvme/nvme_stubs.gcno 00:02:59.711 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno:no functions found 00:02:59.711 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel_module.gcno 00:02:59.711 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno:no functions found 00:02:59.711 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/barrier.gcno 00:02:59.711 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno:no functions found 00:02:59.711 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/assert.gcno 00:02:59.711 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno:no functions found 00:02:59.711 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/accel.gcno 00:02:59.711 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno:no functions found 00:02:59.711 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_zone.gcno 00:02:59.711 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno:no functions found 00:02:59.711 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev_module.gcno 00:02:59.711 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno:no functions found 00:02:59.711 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/base64.gcno 00:02:59.711 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno:no functions found 00:02:59.711 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_array.gcno 00:02:59.711 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno:no functions found 00:02:59.711 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bit_pool.gcno 00:02:59.711 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno:no functions found 00:02:59.711 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs_bdev.gcno 00:02:59.711 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno:no functions found 00:02:59.711 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blobfs.gcno 00:02:59.711 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno:no functions found 00:02:59.711 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/bdev.gcno 00:02:59.711 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/blob_bdev.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc16.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/conf.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/cpuset.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/config.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc32.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/crc64.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dif.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/dma.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/endian.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env_dpdk.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/event.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/env.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd_group.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/fd.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ftl.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/gpt_spec.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/file.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/hexlify.gcno 00:02:59.712 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno:no functions found 00:02:59.712 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/histogram_data.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/idxd_spec.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/init.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not 
produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ioat_spec.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/iscsi_spec.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/json.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/jsonrpc.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/keyring_module.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/likely.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno:no functions found 
00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/log.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/lvol.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/mmio.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/memory.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nbd.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/notify.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno:no functions 
found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_ocssd_spec.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_zns.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_spec.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvme_intel.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_fc_spec.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_cmd.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_transport.gcno 00:02:59.972 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/nvmf_spec.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/opal_spec.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pipe.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/pci_ids.gcno 00:02:59.972 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno:no functions found 00:02:59.972 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/queue.gcno 00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/reduce.gcno 00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/rpc.gcno 
00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scheduler.gcno 00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi_spec.gcno 00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/scsi.gcno 00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/string.gcno 00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/stdinc.gcno 00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/sock.gcno 00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace.gcno 00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not produce any data for 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/thread.gcno 00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/trace_parser.gcno 00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/tree.gcno 00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/ublk.gcno 00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/util.gcno 00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/version.gcno 00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_pci.gcno 00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/uuid.gcno 00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not 
produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vhost.gcno 00:03:00.231 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno:no functions found 00:03:00.231 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/xor.gcno 00:03:00.232 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno:no functions found 00:03:00.232 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vfio_user_spec.gcno 00:03:00.232 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno:no functions found 00:03:00.232 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/zipf.gcno 00:03:00.232 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno:no functions found 00:03:00.232 geninfo: WARNING: GCOV did not produce any data for /var/jenkins/workspace/crypto-phy-autotest/spdk/test/cpp_headers/vmd.gcno 00:03:02.767 15:42:07 -- spdk/autotest.sh@89 -- # timing_enter pre_cleanup 00:03:02.767 15:42:07 -- common/autotest_common.sh@723 -- # xtrace_disable 00:03:02.767 15:42:07 -- common/autotest_common.sh@10 -- # set +x 00:03:02.767 15:42:07 -- spdk/autotest.sh@91 -- # rm -f 00:03:02.767 15:42:07 -- spdk/autotest.sh@94 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:06.057 0000:5f:00.0 (1b96 2600): Already using the nvme driver 00:03:06.057 0000:5e:00.0 (8086 0a54): Already using the nvme driver 00:03:06.057 0000:00:04.7 (8086 2021): Already using the ioatdma driver 00:03:06.057 0000:00:04.6 (8086 2021): Already using the ioatdma driver 00:03:06.057 0000:00:04.5 (8086 2021): Already using the ioatdma driver 00:03:06.057 0000:00:04.4 (8086 2021): Already using the ioatdma driver 00:03:06.057 0000:00:04.3 (8086 2021): Already using the ioatdma driver 
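The coverage baseline step traced earlier (`autotest.sh@85`) runs `lcov` with `-c -i` to record zero-count "initial" data before any test executes, which is why geninfo emits the long stream of "no functions found" warnings for header stubs above. A hedged sketch of that invocation; the `DRY_RUN` switch is an addition here purely for illustration, and paths are placeholders:

```shell
# Capture an initial (zero-count) lcov baseline so post-test coverage data
# can later be merged against it. Requires lcov when actually executed.
capture_baseline() {
    local src="$1" out="$2"
    local cmd=(lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1
               --no-external -q -c -i -d "$src" -t Baseline -o "$out")
    if [ -n "${DRY_RUN:-}" ]; then
        echo "${cmd[*]}"      # show the command instead of running it
    else
        "${cmd[@]}"
    fi
}

# capture_baseline ./spdk ./output/cov_base.info
```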
00:03:06.057 0000:00:04.2 (8086 2021): Already using the ioatdma driver 00:03:06.057 0000:00:04.1 (8086 2021): Already using the ioatdma driver 00:03:06.057 0000:00:04.0 (8086 2021): Already using the ioatdma driver 00:03:06.057 0000:80:04.7 (8086 2021): Already using the ioatdma driver 00:03:06.057 0000:80:04.6 (8086 2021): Already using the ioatdma driver 00:03:06.057 0000:80:04.5 (8086 2021): Already using the ioatdma driver 00:03:06.057 0000:80:04.4 (8086 2021): Already using the ioatdma driver 00:03:06.057 0000:80:04.3 (8086 2021): Already using the ioatdma driver 00:03:06.057 0000:80:04.2 (8086 2021): Already using the ioatdma driver 00:03:06.057 0000:80:04.1 (8086 2021): Already using the ioatdma driver 00:03:06.057 0000:80:04.0 (8086 2021): Already using the ioatdma driver 00:03:06.057 15:42:11 -- spdk/autotest.sh@96 -- # get_zoned_devs 00:03:06.057 15:42:11 -- common/autotest_common.sh@1668 -- # zoned_devs=() 00:03:06.057 15:42:11 -- common/autotest_common.sh@1668 -- # local -gA zoned_devs 00:03:06.057 15:42:11 -- common/autotest_common.sh@1669 -- # local nvme bdf 00:03:06.057 15:42:11 -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme* 00:03:06.057 15:42:11 -- common/autotest_common.sh@1672 -- # is_block_zoned nvme0n1 00:03:06.057 15:42:11 -- common/autotest_common.sh@1661 -- # local device=nvme0n1 00:03:06.057 15:42:11 -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:06.057 15:42:11 -- common/autotest_common.sh@1664 -- # [[ none != none ]] 00:03:06.057 15:42:11 -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme* 00:03:06.057 15:42:11 -- common/autotest_common.sh@1672 -- # is_block_zoned nvme0n2 00:03:06.057 15:42:11 -- common/autotest_common.sh@1661 -- # local device=nvme0n2 00:03:06.057 15:42:11 -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme0n2/queue/zoned ]] 00:03:06.057 15:42:11 -- common/autotest_common.sh@1664 -- # [[ host-managed != none ]] 00:03:06.057 15:42:11 -- 
common/autotest_common.sh@1673 -- # zoned_devs["${nvme##*/}"]=0000:5f:00.0 00:03:06.057 15:42:11 -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme* 00:03:06.057 15:42:11 -- common/autotest_common.sh@1672 -- # is_block_zoned nvme1n1 00:03:06.057 15:42:11 -- common/autotest_common.sh@1661 -- # local device=nvme1n1 00:03:06.057 15:42:11 -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme1n1/queue/zoned ]] 00:03:06.057 15:42:11 -- common/autotest_common.sh@1664 -- # [[ none != none ]] 00:03:06.057 15:42:11 -- spdk/autotest.sh@98 -- # (( 1 > 0 )) 00:03:06.058 15:42:11 -- spdk/autotest.sh@103 -- # export PCI_BLOCKED=0000:5f:00.0 00:03:06.058 15:42:11 -- spdk/autotest.sh@103 -- # PCI_BLOCKED=0000:5f:00.0 00:03:06.058 15:42:11 -- spdk/autotest.sh@104 -- # export PCI_ZONED=0000:5f:00.0 00:03:06.058 15:42:11 -- spdk/autotest.sh@104 -- # PCI_ZONED=0000:5f:00.0 00:03:06.058 15:42:11 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:06.058 15:42:11 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:06.058 15:42:11 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme0n1 00:03:06.058 15:42:11 -- scripts/common.sh@378 -- # local block=/dev/nvme0n1 pt 00:03:06.058 15:42:11 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme0n1 00:03:06.317 No valid GPT data, bailing 00:03:06.317 15:42:11 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme0n1 00:03:06.317 15:42:11 -- scripts/common.sh@391 -- # pt= 00:03:06.317 15:42:11 -- scripts/common.sh@392 -- # return 1 00:03:06.317 15:42:11 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme0n1 bs=1M count=1 00:03:06.317 1+0 records in 00:03:06.317 1+0 records out 00:03:06.317 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00569314 s, 184 MB/s 00:03:06.317 15:42:11 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:06.317 15:42:11 -- spdk/autotest.sh@112 -- # [[ -z 0000:5f:00.0 ]] 00:03:06.317 15:42:11 -- spdk/autotest.sh@112 -- 
# continue 00:03:06.317 15:42:11 -- spdk/autotest.sh@110 -- # for dev in /dev/nvme*n!(*p*) 00:03:06.317 15:42:11 -- spdk/autotest.sh@112 -- # [[ -z '' ]] 00:03:06.317 15:42:11 -- spdk/autotest.sh@113 -- # block_in_use /dev/nvme1n1 00:03:06.317 15:42:11 -- scripts/common.sh@378 -- # local block=/dev/nvme1n1 pt 00:03:06.317 15:42:11 -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py /dev/nvme1n1 00:03:06.317 No valid GPT data, bailing 00:03:06.317 15:42:11 -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme1n1 00:03:06.317 15:42:11 -- scripts/common.sh@391 -- # pt= 00:03:06.317 15:42:11 -- scripts/common.sh@392 -- # return 1 00:03:06.317 15:42:11 -- spdk/autotest.sh@114 -- # dd if=/dev/zero of=/dev/nvme1n1 bs=1M count=1 00:03:06.317 1+0 records in 00:03:06.317 1+0 records out 00:03:06.317 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00394794 s, 266 MB/s 00:03:06.317 15:42:11 -- spdk/autotest.sh@118 -- # sync 00:03:06.317 15:42:11 -- spdk/autotest.sh@120 -- # xtrace_disable_per_cmd reap_spdk_processes 00:03:06.317 15:42:11 -- common/autotest_common.sh@22 -- # eval 'reap_spdk_processes 12> /dev/null' 00:03:06.317 15:42:11 -- common/autotest_common.sh@22 -- # reap_spdk_processes 00:03:11.634 15:42:16 -- spdk/autotest.sh@124 -- # uname -s 00:03:11.634 15:42:16 -- spdk/autotest.sh@124 -- # '[' Linux = Linux ']' 00:03:11.634 15:42:16 -- spdk/autotest.sh@125 -- # run_test setup.sh /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:11.634 15:42:16 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:11.634 15:42:16 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:11.634 15:42:16 -- common/autotest_common.sh@10 -- # set +x 00:03:11.634 ************************************ 00:03:11.634 START TEST setup.sh 00:03:11.634 ************************************ 00:03:11.634 15:42:16 setup.sh -- common/autotest_common.sh@1124 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/test-setup.sh 00:03:11.634 * Looking for test storage... 00:03:11.634 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:11.634 15:42:16 setup.sh -- setup/test-setup.sh@10 -- # uname -s 00:03:11.634 15:42:16 setup.sh -- setup/test-setup.sh@10 -- # [[ Linux == Linux ]] 00:03:11.634 15:42:16 setup.sh -- setup/test-setup.sh@12 -- # run_test acl /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:11.634 15:42:16 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:11.634 15:42:16 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:11.634 15:42:16 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:11.634 ************************************ 00:03:11.634 START TEST acl 00:03:11.634 ************************************ 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/acl.sh 00:03:11.634 * Looking for test storage... 
00:03:11.634 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:11.634 15:42:16 setup.sh.acl -- setup/acl.sh@10 -- # get_zoned_devs 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1668 -- # zoned_devs=() 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1668 -- # local -gA zoned_devs 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1669 -- # local nvme bdf 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme* 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1672 -- # is_block_zoned nvme0n1 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1661 -- # local device=nvme0n1 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ none != none ]] 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme* 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1672 -- # is_block_zoned nvme0n2 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1661 -- # local device=nvme0n2 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme0n2/queue/zoned ]] 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ host-managed != none ]] 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1673 -- # zoned_devs["${nvme##*/}"]=0000:5f:00.0 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme* 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1672 -- # is_block_zoned nvme1n1 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1661 -- # local device=nvme1n1 00:03:11.634 15:42:16 setup.sh.acl -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme1n1/queue/zoned ]] 00:03:11.634 15:42:16 
setup.sh.acl -- common/autotest_common.sh@1664 -- # [[ none != none ]] 00:03:11.634 15:42:16 setup.sh.acl -- setup/acl.sh@12 -- # devs=() 00:03:11.634 15:42:16 setup.sh.acl -- setup/acl.sh@12 -- # declare -a devs 00:03:11.634 15:42:16 setup.sh.acl -- setup/acl.sh@13 -- # drivers=() 00:03:11.634 15:42:16 setup.sh.acl -- setup/acl.sh@13 -- # declare -A drivers 00:03:11.634 15:42:16 setup.sh.acl -- setup/acl.sh@51 -- # setup reset 00:03:11.634 15:42:16 setup.sh.acl -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:11.634 15:42:16 setup.sh.acl -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:14.922 15:42:20 setup.sh.acl -- setup/acl.sh@52 -- # collect_setup_devs 00:03:14.922 15:42:20 setup.sh.acl -- setup/acl.sh@16 -- # local dev driver 00:03:14.922 15:42:20 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:14.922 15:42:20 setup.sh.acl -- setup/acl.sh@15 -- # setup output status 00:03:14.922 15:42:20 setup.sh.acl -- setup/common.sh@9 -- # [[ output == output ]] 00:03:14.922 15:42:20 setup.sh.acl -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status 00:03:18.218 Hugepages 00:03:18.218 node hugesize free / total 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 1048576kB == *:*:*.* ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.218 00:03:18.218 
Type BDF Vendor Device NUMA Driver Device Block devices 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 2048kB == *:*:*.* ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # continue 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.0 == *:*:*.* ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.1 == *:*:*.* ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.2 == *:*:*.* ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.3 == *:*:*.* ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.4 == *:*:*.* ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.5 
== *:*:*.* ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.6 == *:*:*.* ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:00:04.7 == *:*:*.* ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:18.218 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5e:00.0 == *:*:*.* ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@21 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@22 -- # devs+=("$dev") 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@22 -- # drivers["$dev"]=nvme 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:5f:00.0 == *:*:*.* ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ nvme == nvme ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@21 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\f\:\0\0\.\0* ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@21 -- # continue 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.0 == *:*:*.* ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # 
[[ ioatdma == nvme ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.1 == *:*:*.* ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.2 == *:*:*.* ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.3 == *:*:*.* ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.4 == *:*:*.* ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.5 == *:*:*.* ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.6 == *:*:*.* ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:18.477 15:42:23 
setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@19 -- # [[ 0000:80:04.7 == *:*:*.* ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # [[ ioatdma == nvme ]] 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@20 -- # continue 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@18 -- # read -r _ dev _ _ _ driver _ 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@24 -- # (( 1 > 0 )) 00:03:18.477 15:42:23 setup.sh.acl -- setup/acl.sh@54 -- # run_test denied denied 00:03:18.477 15:42:23 setup.sh.acl -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:18.477 15:42:23 setup.sh.acl -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:18.477 15:42:23 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:18.477 ************************************ 00:03:18.477 START TEST denied 00:03:18.477 ************************************ 00:03:18.478 15:42:23 setup.sh.acl.denied -- common/autotest_common.sh@1124 -- # denied 00:03:18.478 15:42:23 setup.sh.acl.denied -- setup/acl.sh@38 -- # PCI_BLOCKED='0000:5f:00.0 0000:5e:00.0' 00:03:18.478 15:42:23 setup.sh.acl.denied -- setup/acl.sh@38 -- # setup output config 00:03:18.478 15:42:23 setup.sh.acl.denied -- setup/acl.sh@39 -- # grep 'Skipping denied controller at 0000:5e:00.0' 00:03:18.478 15:42:23 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ output == output ]] 00:03:18.478 15:42:23 setup.sh.acl.denied -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:22.667 0000:5e:00.0 (8086 0a54): Skipping denied controller at 0000:5e:00.0 00:03:22.667 15:42:27 setup.sh.acl.denied -- setup/acl.sh@40 -- # verify 0000:5e:00.0 00:03:22.667 15:42:27 setup.sh.acl.denied -- setup/acl.sh@28 -- # local dev driver 00:03:22.667 15:42:27 setup.sh.acl.denied -- setup/acl.sh@30 -- # for dev in "$@" 00:03:22.667 15:42:27 
setup.sh.acl.denied -- setup/acl.sh@31 -- # [[ -e /sys/bus/pci/devices/0000:5e:00.0 ]] 00:03:22.667 15:42:27 setup.sh.acl.denied -- setup/acl.sh@32 -- # readlink -f /sys/bus/pci/devices/0000:5e:00.0/driver 00:03:22.667 15:42:27 setup.sh.acl.denied -- setup/acl.sh@32 -- # driver=/sys/bus/pci/drivers/nvme 00:03:22.667 15:42:27 setup.sh.acl.denied -- setup/acl.sh@33 -- # [[ nvme == \n\v\m\e ]] 00:03:22.667 15:42:27 setup.sh.acl.denied -- setup/acl.sh@41 -- # setup reset 00:03:22.667 15:42:27 setup.sh.acl.denied -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:22.667 15:42:27 setup.sh.acl.denied -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:26.858 00:03:26.858 real 0m8.179s 00:03:26.858 user 0m2.743s 00:03:26.858 sys 0m4.717s 00:03:26.858 15:42:32 setup.sh.acl.denied -- common/autotest_common.sh@1125 -- # xtrace_disable 00:03:26.858 15:42:32 setup.sh.acl.denied -- common/autotest_common.sh@10 -- # set +x 00:03:26.858 ************************************ 00:03:26.858 END TEST denied 00:03:26.858 ************************************ 00:03:26.858 15:42:32 setup.sh.acl -- setup/acl.sh@55 -- # run_test allowed allowed 00:03:26.858 15:42:32 setup.sh.acl -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:26.858 15:42:32 setup.sh.acl -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:26.858 15:42:32 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:26.858 ************************************ 00:03:26.858 START TEST allowed 00:03:26.858 ************************************ 00:03:26.858 15:42:32 setup.sh.acl.allowed -- common/autotest_common.sh@1124 -- # allowed 00:03:26.858 15:42:32 setup.sh.acl.allowed -- setup/acl.sh@45 -- # PCI_ALLOWED=0000:5e:00.0 00:03:26.858 15:42:32 setup.sh.acl.allowed -- setup/acl.sh@45 -- # setup output config 00:03:26.858 15:42:32 setup.sh.acl.allowed -- setup/acl.sh@46 -- # grep -E '0000:5e:00.0 .*: nvme -> .*' 00:03:26.858 15:42:32 setup.sh.acl.allowed -- 
setup/common.sh@9 -- # [[ output == output ]] 00:03:26.858 15:42:32 setup.sh.acl.allowed -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:03:31.048 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:03:31.048 15:42:36 setup.sh.acl.allowed -- setup/acl.sh@47 -- # verify 00:03:31.048 15:42:36 setup.sh.acl.allowed -- setup/acl.sh@28 -- # local dev driver 00:03:31.048 15:42:36 setup.sh.acl.allowed -- setup/acl.sh@48 -- # setup reset 00:03:31.048 15:42:36 setup.sh.acl.allowed -- setup/common.sh@9 -- # [[ reset == output ]] 00:03:31.048 15:42:36 setup.sh.acl.allowed -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:03:35.249 00:03:35.249 real 0m7.962s 00:03:35.249 user 0m2.638s 00:03:35.249 sys 0m4.491s 00:03:35.249 15:42:40 setup.sh.acl.allowed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:03:35.249 15:42:40 setup.sh.acl.allowed -- common/autotest_common.sh@10 -- # set +x 00:03:35.249 ************************************ 00:03:35.249 END TEST allowed 00:03:35.249 ************************************ 00:03:35.249 00:03:35.249 real 0m23.332s 00:03:35.249 user 0m8.064s 00:03:35.249 sys 0m13.911s 00:03:35.249 15:42:40 setup.sh.acl -- common/autotest_common.sh@1125 -- # xtrace_disable 00:03:35.249 15:42:40 setup.sh.acl -- common/autotest_common.sh@10 -- # set +x 00:03:35.249 ************************************ 00:03:35.249 END TEST acl 00:03:35.249 ************************************ 00:03:35.249 15:42:40 setup.sh -- setup/test-setup.sh@13 -- # run_test hugepages /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:35.249 15:42:40 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:35.249 15:42:40 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:35.249 15:42:40 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:03:35.249 ************************************ 00:03:35.249 START TEST hugepages 
00:03:35.249 ************************************ 00:03:35.249 15:42:40 setup.sh.hugepages -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/hugepages.sh 00:03:35.249 * Looking for test storage... 00:03:35.249 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:03:35.249 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@10 -- # nodes_sys=() 00:03:35.249 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@10 -- # declare -a nodes_sys 00:03:35.249 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@12 -- # declare -i default_hugepages=0 00:03:35.249 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@13 -- # declare -i no_nodes=0 00:03:35.249 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@14 -- # declare -i nr_hugepages=0 00:03:35.249 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@16 -- # get_meminfo Hugepagesize 00:03:35.249 15:42:40 setup.sh.hugepages -- setup/common.sh@17 -- # local get=Hugepagesize 00:03:35.249 15:42:40 setup.sh.hugepages -- setup/common.sh@18 -- # local node= 00:03:35.249 15:42:40 setup.sh.hugepages -- setup/common.sh@19 -- # local var val 00:03:35.249 15:42:40 setup.sh.hugepages -- setup/common.sh@20 -- # local mem_f mem 00:03:35.249 15:42:40 setup.sh.hugepages -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:35.249 15:42:40 setup.sh.hugepages -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:35.249 15:42:40 setup.sh.hugepages -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:35.249 15:42:40 setup.sh.hugepages -- setup/common.sh@28 -- # mapfile -t mem 00:03:35.249 15:42:40 setup.sh.hugepages -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:35.249 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.249 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 75411000 
kB' 'MemAvailable: 78838140 kB' 'Buffers: 2696 kB' 'Cached: 10063720 kB' 'SwapCached: 0 kB' 'Active: 7038224 kB' 'Inactive: 3520928 kB' 'Active(anon): 6647180 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 496496 kB' 'Mapped: 164216 kB' 'Shmem: 6154444 kB' 'KReclaimable: 222792 kB' 'Slab: 688500 kB' 'SReclaimable: 222792 kB' 'SUnreclaim: 465708 kB' 'KernelStack: 19712 kB' 'PageTables: 8528 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 52952932 kB' 'Committed_AS: 8062544 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220896 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 2048' 'HugePages_Free: 2048' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 4194304 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ 
MemAvailable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # 
continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ 
Writeback == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- 
setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # 
continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.250 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- 
setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': 
' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 
setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # continue 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # IFS=': ' 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@31 -- # read -r var val _ 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@32 -- # [[ Hugepagesize == \H\u\g\e\p\a\g\e\s\i\z\e ]] 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@33 -- # echo 2048 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/common.sh@33 -- # return 0 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@16 -- # 
default_hugepages=2048 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@17 -- # default_huge_nr=/sys/kernel/mm/hugepages/hugepages-2048kB/nr_hugepages 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@18 -- # global_huge_nr=/proc/sys/vm/nr_hugepages 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@21 -- # unset -v HUGE_EVEN_ALLOC 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@22 -- # unset -v HUGEMEM 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@23 -- # unset -v HUGENODE 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@24 -- # unset -v NRHUGE 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@207 -- # get_nodes 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@27 -- # local node 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@208 -- # clear_hp 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in 
"/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:35.251 15:42:40 setup.sh.hugepages -- setup/hugepages.sh@210 -- # run_test default_setup default_setup 00:03:35.251 15:42:40 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:35.251 15:42:40 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:35.251 15:42:40 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:35.251 ************************************ 00:03:35.251 START TEST default_setup 00:03:35.251 ************************************ 00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1124 -- # default_setup 00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@136 -- # get_test_nr_hugepages 2097152 0 00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@49 -- # local size=2097152 00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@51 -- # shift 00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # node_ids=('0') 
00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@52 -- # local node_ids 00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@62 -- # local user_nodes 00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:35.251 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:35.252 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@73 -- # return 0 00:03:35.252 15:42:40 setup.sh.hugepages.default_setup -- setup/hugepages.sh@137 -- # setup output 00:03:35.252 15:42:40 setup.sh.hugepages.default_setup -- setup/common.sh@9 -- # [[ output == output ]] 00:03:35.252 15:42:40 setup.sh.hugepages.default_setup -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:37.822 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:38.390 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:38.390 0000:00:04.6 
(8086 2021): ioatdma -> vfio-pci 00:03:38.390 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:38.390 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:38.390 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:38.390 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:38.390 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:38.390 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:38.390 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci 00:03:38.390 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci 00:03:38.390 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci 00:03:38.390 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci 00:03:38.390 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci 00:03:38.390 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci 00:03:38.390 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci 00:03:38.390 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci 00:03:39.335 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@138 -- # verify_nr_hugepages 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@89 -- # local node 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@90 -- # local sorted_t 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@91 -- # local sorted_s 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@92 -- # local surp 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@93 -- # local resv 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@94 -- # local anon 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:39.335 15:42:44 
setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77530304 kB' 'MemAvailable: 80957064 kB' 'Buffers: 2696 kB' 'Cached: 10063832 kB' 'SwapCached: 0 kB' 'Active: 7056360 kB' 'Inactive: 3520928 kB' 'Active(anon): 6665316 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 513608 kB' 'Mapped: 164500 kB' 'Shmem: 6154556 kB' 'KReclaimable: 222032 kB' 'Slab: 686552 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 464520 kB' 'KernelStack: 19776 kB' 'PageTables: 8476 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8079996 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220800 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 
kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 
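The long run of `[[ key == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] / continue` records above is `get_meminfo` in setup/common.sh scanning a captured `/proc/meminfo` snapshot field by field: `IFS=': '` splits each `Key: value kB` line, non-matching keys fall through to `continue`, and the first match echoes its value. A hedged standalone re-implementation (the function name is borrowed from the log for illustration; the sample input is a fabricated snippet, and stdin replaces `/proc/meminfo` so the example is self-contained):

```shell
#!/usr/bin/env bash
# Sketch of get_meminfo's scan loop: split each "Key: value kB"
# line on ':' and spaces, compare the key against the requested
# one, and print the value of the first match.
get_meminfo_sketch() {
    local get=$1 var val _
    while IFS=': ' read -r var val _; do
        # Skip lines until the requested key appears — this is the
        # [[ $var == $get ]] / continue pattern in the trace.
        [[ $var == "$get" ]] || continue
        echo "$val"
        return 0
    done
    return 1
}

sample='MemTotal: 93322964 kB
HugePages_Total: 1024
Hugepagesize: 2048 kB'

get_meminfo_sketch Hugepagesize <<<"$sample"   # prints 2048
```

The `echo 2048` / `return 0` pair at the end of the Hugepagesize scan in the log corresponds to the match branch here; the trailing `kB` unit lands in the throwaway `_` variable.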
00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.335 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 
-- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.336 
00:03:39.336 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31-32 -- # [scan continues: CommitLimit, Committed_AS, VmallocTotal, VmallocUsed, VmallocChunk, Percpu, HardwareCorrupted != AnonHugePages; each non-matching key skipped via continue]
00:03:39.337 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:39.337 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0
00:03:39.337 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:03:39.337 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@97 -- # anon=0
00:03:39.337 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:39.337 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:39.337 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=
00:03:39.337 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val
00:03:39.337 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem
00:03:39.337 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:39.337 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:39.337 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:39.337 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem
00:03:39.337 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:39.337 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:03:39.337 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:03:39.337 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77531876 kB' 'MemAvailable: 80958636 kB' 'Buffers: 2696 kB' 'Cached: 10063836 kB' 'SwapCached: 0 kB' 'Active: 7056612 kB' 'Inactive: 3520928 kB' 'Active(anon): 6665568 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 514376 kB' 'Mapped: 164336 kB' 'Shmem: 6154560 kB' 'KReclaimable: 222032 kB' 'Slab: 686544 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 464512 kB' 'KernelStack: 19776 kB' 'PageTables: 8484 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8081132 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220784 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB'
00:03:39.337 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31-32 -- # [scan: MemTotal through HugePages_Rsvd != HugePages_Surp; each non-matching key skipped via continue]
00:03:39.339 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:39.339 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0
00:03:39.339 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0
00:03:39.339 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@99 -- # surp=0
00:03:39.339 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd
00:03:39.339 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Rsvd
00:03:39.339 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=
00:03:39.339 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val
00:03:39.339 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem
00:03:39.339 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:39.339 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:39.339 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:39.339 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem
00:03:39.339 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:39.339 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': '
00:03:39.339 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _
00:03:39.339 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77533284 kB' 'MemAvailable: 80960044 kB' 'Buffers: 2696 kB' 'Cached: 10063856 kB' 'SwapCached: 0 kB' 'Active: 7056744 kB' 'Inactive: 3520928 kB' 'Active(anon): 6665700 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 514460 kB' 'Mapped: 164344 kB' 'Shmem: 6154580 kB' 'KReclaimable: 222032 kB' 'Slab: 686544 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 464512 kB' 'KernelStack: 19728 kB' 'PageTables: 8364 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8081152 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220800 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB'
00:03:39.339 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31-32 -- # [scan: MemTotal through PageTables != HugePages_Rsvd; log truncated mid-scan]
IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.340 
15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.340 15:42:44 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- 
setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.340 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r 
var val _ 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@100 -- # resv=0 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:39.341 nr_hugepages=1024 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:39.341 resv_hugepages=0 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:39.341 surplus_hugepages=0 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:39.341 anon_hugepages=0 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- 
setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node= 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.341 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77534224 kB' 'MemAvailable: 80960984 kB' 'Buffers: 2696 kB' 'Cached: 10063876 kB' 'SwapCached: 0 kB' 'Active: 7056884 kB' 'Inactive: 3520928 kB' 'Active(anon): 6665840 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 514452 kB' 'Mapped: 164344 kB' 'Shmem: 6154600 kB' 'KReclaimable: 222032 kB' 'Slab: 686544 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 464512 kB' 'KernelStack: 19888 kB' 'PageTables: 8680 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8081176 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220848 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:39.341 15:42:44
[... identical xtrace iterations elided: each /proc/meminfo key, MemTotal through Unaccepted, fails the [[ $var == HugePages_Total ]] test at setup/common.sh@32 and the loop continues ...]
00:03:39.342 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:39.342 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 1024 00:03:39.342 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:39.342 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:39.342 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@112 -- # get_nodes 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@27 -- # local node 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@33 -- # ((
no_nodes > 0 )) 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@18 -- # local node=0 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@19 -- # local var val 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@20 -- # local mem_f mem 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@28 -- # mapfile -t mem 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 28375988 kB' 'MemUsed: 4258640 kB' 'SwapCached: 0 kB' 'Active: 1216784 kB' 'Inactive: 60292 kB' 'Active(anon): 1062892 kB' 'Inactive(anon): 0 kB' 'Active(file): 153892 kB' 'Inactive(file): 60292 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1143004 kB' 'Mapped: 97764 kB' 'AnonPages: 137300 kB' 'Shmem: 
928820 kB' 'KernelStack: 9880 kB' 'PageTables: 3184 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 70284 kB' 'Slab: 280284 kB' 'SReclaimable: 70284 kB' 'SUnreclaim: 210000 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 
setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KernelStack 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.343 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.343 15:42:44 
setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup 
-- setup/common.sh@31 -- # read -r var val _ 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- 
setup/common.sh@31 -- # IFS=': ' 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.344 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # continue 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # IFS=': ' 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@31 -- # read -r var val _ 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # echo 0 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- setup/common.sh@33 -- # return 0 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- 
setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:39.604 node0=1024 expecting 1024 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:39.604 00:03:39.604 real 0m4.444s 00:03:39.604 user 0m1.503s 00:03:39.604 sys 0m2.234s 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- common/autotest_common.sh@1125 -- # xtrace_disable 00:03:39.604 15:42:44 setup.sh.hugepages.default_setup -- common/autotest_common.sh@10 -- # set +x 00:03:39.604 ************************************ 00:03:39.604 END TEST default_setup 00:03:39.604 ************************************ 00:03:39.604 15:42:44 setup.sh.hugepages -- setup/hugepages.sh@211 -- # run_test per_node_1G_alloc per_node_1G_alloc 00:03:39.604 15:42:44 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:39.604 15:42:44 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:39.604 15:42:44 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:39.604 ************************************ 00:03:39.604 START TEST per_node_1G_alloc 00:03:39.604 ************************************ 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1124 -- # per_node_1G_alloc 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@143 -- # local IFS=, 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@145 -- # get_test_nr_hugepages 1048576 0 1 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@50 -- # (( 3 > 1 )) 00:03:39.604 15:42:44 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@51 -- # shift 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # node_ids=('0' '1') 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 1 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0' '1') 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@69 -- # (( 2 > 0 )) 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=512 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@73 -- # return 0 00:03:39.604 15:42:44 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # NRHUGE=512 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # HUGENODE=0,1 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@146 -- # setup output 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:39.604 15:42:44 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:42.900 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:42.900 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:42.900 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:42.900 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:42.900 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:42.900 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:42.900 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:42.900 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:42.900 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:42.900 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:42.900 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:42.900 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:42.900 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:42.900 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:42.900 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:42.900 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:42.900 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:42.900 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # nr_hugepages=1024 00:03:42.900 15:42:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@147 -- # verify_nr_hugepages 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@89 -- # local node 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- 
# IFS=': ' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77545908 kB' 'MemAvailable: 80972668 kB' 'Buffers: 2696 kB' 'Cached: 10063976 kB' 'SwapCached: 0 kB' 'Active: 7055576 kB' 'Inactive: 3520928 kB' 'Active(anon): 6664532 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512516 kB' 'Mapped: 163280 kB' 'Shmem: 6154700 kB' 'KReclaimable: 222032 kB' 'Slab: 686416 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 464384 kB' 'KernelStack: 19728 kB' 'PageTables: 8276 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8070600 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221008 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.900 
15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.900 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 
15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile 
-t mem 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.901 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77546436 kB' 'MemAvailable: 80973196 kB' 'Buffers: 2696 kB' 'Cached: 10063980 kB' 'SwapCached: 0 kB' 'Active: 7055248 kB' 'Inactive: 3520928 kB' 'Active(anon): 6664204 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512216 kB' 'Mapped: 163264 kB' 'Shmem: 6154704 kB' 'KReclaimable: 222032 kB' 'Slab: 686408 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 464376 kB' 'KernelStack: 19696 kB' 'PageTables: 8160 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8070616 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221008 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 
15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.902 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.903 
15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:42.903 15:42:48
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.903 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.904 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77546884 kB' 'MemAvailable: 80973644 kB' 'Buffers: 2696 kB' 'Cached: 10063984 kB' 'SwapCached: 0 kB' 'Active: 7054288 kB' 'Inactive: 3520928 kB' 'Active(anon): 6663244 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 
511712 kB' 'Mapped: 163184 kB' 'Shmem: 6154708 kB' 'KReclaimable: 222032 kB' 'Slab: 686380 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 464348 kB' 'KernelStack: 19696 kB' 'PageTables: 8136 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8070640 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221008 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:42.904 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.904 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.904 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.904 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.904 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.904 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.904 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.904 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.904 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.904 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.904 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:42.904 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.905 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[
HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.905 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.905 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.905 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.905 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:42.905 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:42.905 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:42.905 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:42.905 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:42.905 nr_hugepages=1024 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:42.906 resv_hugepages=0 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:42.906 surplus_hugepages=0 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:42.906 anon_hugepages=0 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node= 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@19 -- # local var val 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77546884 kB' 'MemAvailable: 80973644 kB' 'Buffers: 2696 kB' 'Cached: 10064040 kB' 'SwapCached: 0 kB' 'Active: 7054304 kB' 'Inactive: 3520928 kB' 'Active(anon): 6663260 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511616 kB' 'Mapped: 163184 kB' 'Shmem: 6154764 kB' 'KReclaimable: 222032 kB' 'Slab: 686380 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 464348 kB' 'KernelStack: 19680 kB' 'PageTables: 8088 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8070664 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221008 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 
'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 
15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.906 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 
15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 1024 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@27 -- # local node 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:42.907 15:42:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:42.907 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=0 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 29418564 kB' 'MemUsed: 3216064 kB' 'SwapCached: 0 kB' 'Active: 1215756 kB' 'Inactive: 60292 kB' 'Active(anon): 1061864 kB' 'Inactive(anon): 0 kB' 'Active(file): 153892 kB' 'Inactive(file): 60292 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1143148 kB' 'Mapped: 96876 kB' 'AnonPages: 135992 kB' 'Shmem: 928964 kB' 'KernelStack: 9864 kB' 'PageTables: 3132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 70284 kB' 'Slab: 280340 kB' 'SReclaimable: 70284 kB' 'SUnreclaim: 210056 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 
00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ 
Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:42.908 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:42.908 15:42:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:42.908 15:42:48 [... identical xtrace iterations elided: setup/common.sh@32 tests each remaining node0 meminfo field (Dirty, Writeback, FilePages, Mapped, AnonPages, Shmem, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, KReclaimable, Slab, SReclaimable, SUnreclaim, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, Unaccepted, HugePages_Total, HugePages_Free) against HugePages_Surp and hits 'continue' ...] 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@18 -- # local node=1 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@19 -- # local var val 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- 
setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60688336 kB' 'MemFree: 48128320 kB' 'MemUsed: 12560016 kB' 'SwapCached: 0 kB' 'Active: 5838568 kB' 'Inactive: 3460636 kB' 'Active(anon): 5601416 kB' 'Inactive(anon): 0 kB' 'Active(file): 237152 kB' 'Inactive(file): 3460636 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8923612 kB' 'Mapped: 66308 kB' 'AnonPages: 375620 kB' 'Shmem: 5225824 kB' 'KernelStack: 9816 kB' 'PageTables: 4956 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 151748 kB' 'Slab: 406040 kB' 'SReclaimable: 151748 kB' 'SUnreclaim: 254292 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:43.170 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.170 15:42:48 
setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # continue 00:03:43.170 15:42:48 [... identical xtrace iterations elided: setup/common.sh@32 tests each remaining node1 meminfo field (MemUsed, SwapCached, Active, Inactive, Active(anon), Inactive(anon), Active(file), Inactive(file), Unevictable, Mlocked, Dirty, Writeback, FilePages, Mapped, AnonPages, Shmem, KernelStack, PageTables, SecPageTables, NFS_Unstable, Bounce, WritebackTmp, KReclaimable, Slab, SReclaimable, SUnreclaim, AnonHugePages, ShmemHugePages, ShmemPmdMapped, FileHugePages, FilePmdMapped, Unaccepted, HugePages_Total, HugePages_Free) against HugePages_Surp and hits 'continue' ...] 00:03:43.171 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:43.172 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # echo 0 00:03:43.172 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/common.sh@33 -- # return 0 00:03:43.172 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:43.172 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:43.172 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:43.172 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:43.172 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:43.172 node0=512 expecting 512 00:03:43.172 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@126 -- 
# for node in "${!nodes_test[@]}" 00:03:43.172 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:43.172 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:43.172 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:43.172 node1=512 expecting 512 00:03:43.172 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:43.172 00:03:43.172 real 0m3.532s 00:03:43.172 user 0m1.477s 00:03:43.172 sys 0m2.123s 00:03:43.172 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:03:43.172 15:42:48 setup.sh.hugepages.per_node_1G_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:43.172 ************************************ 00:03:43.172 END TEST per_node_1G_alloc 00:03:43.172 ************************************ 00:03:43.172 15:42:48 setup.sh.hugepages -- setup/hugepages.sh@212 -- # run_test even_2G_alloc even_2G_alloc 00:03:43.172 15:42:48 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:43.172 15:42:48 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:43.172 15:42:48 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:43.172 ************************************ 00:03:43.172 START TEST even_2G_alloc 00:03:43.172 ************************************ 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1124 -- # even_2G_alloc 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@152 -- # get_test_nr_hugepages 2097152 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 512 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 
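The xtrace above shows get_test_nr_hugepages_per_node splitting the 1024 requested 2M pages evenly across the 2 NUMA nodes, 512 each, by walking `_no_nodes` down to zero. A minimal sketch of that even split, with simplified stand-in names (this is an illustration, not the SPDK hugepages.sh source):

```shell
#!/usr/bin/env bash
# Illustrative sketch of the even per-node hugepage split traced above.
# Names mirror hugepages.sh but are simplified stand-ins.
nr_hugepages=1024   # total 2M pages requested (NRHUGE)
no_nodes=2          # NUMA nodes present
declare -a nodes_test

node=$no_nodes
while (( node > 0 )); do
  # Each node gets an equal share, matching 'nodes_test[_no_nodes - 1]=512' above
  nodes_test[node - 1]=$(( nr_hugepages / no_nodes ))
  (( node-- ))
done

echo "node0=${nodes_test[0]} expecting 512"
echo "node1=${nodes_test[1]} expecting 512"
```

With these inputs the loop leaves `nodes_test=(512 512)`, which is what the `node0=512 expecting 512` / `node1=512 expecting 512` lines in the log verify.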
00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # NRHUGE=1024 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # HUGE_EVEN_ALLOC=yes 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@153 -- # setup output 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:43.172 15:42:48 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:46.468 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:46.468 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:46.468 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:46.468 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:46.468 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:46.468 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:46.468 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:46.468 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:46.468 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:46.468 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:46.468 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:46.468 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:46.468 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:46.468 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:46.468 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:46.468 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:46.468 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:46.468 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@154 -- # verify_nr_hugepages 00:03:46.468 
15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@89 -- # local node 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77543188 kB' 'MemAvailable: 
80969948 kB' 'Buffers: 2696 kB' 'Cached: 10064136 kB' 'SwapCached: 0 kB' 'Active: 7060792 kB' 'Inactive: 3520928 kB' 'Active(anon): 6669748 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 518228 kB' 'Mapped: 163828 kB' 'Shmem: 6154860 kB' 'KReclaimable: 222032 kB' 'Slab: 685792 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 463760 kB' 'KernelStack: 20000 kB' 'PageTables: 8460 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8079820 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221124 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.468 15:42:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' […repetitive scan trace elided: the read loop walks every remaining /proc/meminfo key (MemAvailable through HardwareCorrupted) and hits `continue` on each non-match…] 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:46.469 15:42:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77550836 kB' 'MemAvailable: 80977596 kB' 'Buffers: 2696 kB' 'Cached: 10064136 kB' 'SwapCached: 0 kB' 'Active: 7056020 kB' 'Inactive: 3520928 kB' 'Active(anon): 6664976 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 
'Writeback: 0 kB' 'AnonPages: 513344 kB' 'Mapped: 163212 kB' 'Shmem: 6154860 kB' 'KReclaimable: 222032 kB' 'Slab: 685796 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 463764 kB' 'KernelStack: 19968 kB' 'PageTables: 8432 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8073716 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221072 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:46.469 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ […repetitive scan trace elided: the loop again walks every /proc/meminfo key, this time comparing against HugePages_Surp, and hits `continue` on each non-match…] 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.470 
15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:46.470 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77548724 kB' 'MemAvailable: 80975484 kB' 'Buffers: 2696 kB' 'Cached: 10064156 kB' 'SwapCached: 0 kB' 'Active: 7055276 kB' 'Inactive: 3520928 kB' 'Active(anon): 6664232 kB' 'Inactive(anon): 0 kB' 
'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512572 kB' 'Mapped: 163148 kB' 'Shmem: 6154880 kB' 'KReclaimable: 222032 kB' 'Slab: 685804 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 463772 kB' 'KernelStack: 19824 kB' 'PageTables: 8232 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8073740 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220992 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 
-- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.471 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- 
# [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:46.472 15:42:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:46.472 nr_hugepages=1024 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:46.472 resv_hugepages=0 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:46.472 surplus_hugepages=0 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:46.472 anon_hugepages=0 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node= 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node 
+([0-9]) }") 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77547484 kB' 'MemAvailable: 80974244 kB' 'Buffers: 2696 kB' 'Cached: 10064180 kB' 'SwapCached: 0 kB' 'Active: 7054808 kB' 'Inactive: 3520928 kB' 'Active(anon): 6663764 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512112 kB' 'Mapped: 163208 kB' 'Shmem: 6154904 kB' 'KReclaimable: 222032 kB' 'Slab: 685944 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 463912 kB' 'KernelStack: 19744 kB' 'PageTables: 8628 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8073760 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220976 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.472 
15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.472 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue [... identical skip cycle (continue / IFS=': ' / read -r var val _) repeated for each remaining non-matching /proc/meminfo key, MemAvailable through Unaccepted ...] 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 1024 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- 
setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@27 -- # local node 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=0 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 29402672 kB' 'MemUsed: 3231956 kB' 'SwapCached: 0 kB' 'Active: 1216356 kB' 'Inactive: 60292 kB' 'Active(anon): 1062464 kB' 'Inactive(anon): 0 kB' 'Active(file): 153892 kB' 'Inactive(file): 60292 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1143252 kB' 'Mapped: 96900 kB' 'AnonPages: 136592 kB' 'Shmem: 929068 kB' 'KernelStack: 9864 kB' 'PageTables: 3200 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 70284 kB' 'Slab: 279600 kB' 'SReclaimable: 70284 kB' 'SUnreclaim: 209316 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.473 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ [... identical skip cycle repeated for each non-matching node0 meminfo key, MemUsed through ShmemPmdMapped ...] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@18 -- # local node=1 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@19 -- # local var val 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@29 -- # 
mem=("${mem[@]#Node +([0-9]) }") 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60688336 kB' 'MemFree: 48143324 kB' 'MemUsed: 12545012 kB' 'SwapCached: 0 kB' 'Active: 5838692 kB' 'Inactive: 3460636 kB' 'Active(anon): 5601540 kB' 'Inactive(anon): 0 kB' 'Active(file): 237152 kB' 'Inactive(file): 3460636 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8923648 kB' 'Mapped: 66316 kB' 'AnonPages: 376044 kB' 'Shmem: 5225860 kB' 'KernelStack: 10088 kB' 'PageTables: 5716 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 151748 kB' 'Slab: 406344 kB' 'SReclaimable: 151748 kB' 'SUnreclaim: 254596 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.474 15:42:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:46.474 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 
00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 
setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ 
SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # continue 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:46.475 
15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # echo 0 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/common.sh@33 -- # return 0 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:46.475 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:46.735 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:46.735 node0=512 expecting 512 00:03:46.735 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:46.735 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:46.735 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:46.735 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@128 -- # echo 'node1=512 expecting 512' 00:03:46.735 node1=512 expecting 512 00:03:46.735 15:42:51 setup.sh.hugepages.even_2G_alloc -- setup/hugepages.sh@130 -- # [[ 512 == \5\1\2 ]] 00:03:46.735 00:03:46.735 real 0m3.468s 00:03:46.735 user 0m1.399s 00:03:46.735 sys 0m2.135s 00:03:46.735 15:42:51 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:03:46.735 15:42:51 setup.sh.hugepages.even_2G_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:46.735 ************************************ 00:03:46.735 END TEST 
even_2G_alloc 00:03:46.735 ************************************ 00:03:46.735 15:42:51 setup.sh.hugepages -- setup/hugepages.sh@213 -- # run_test odd_alloc odd_alloc 00:03:46.735 15:42:51 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:46.735 15:42:52 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:46.735 15:42:52 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:46.735 ************************************ 00:03:46.735 START TEST odd_alloc 00:03:46.735 ************************************ 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1124 -- # odd_alloc 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@159 -- # get_test_nr_hugepages 2098176 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@49 -- # local size=2098176 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1025 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1025 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 
00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=512 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 513 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=513 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGEMEM=2049 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # HUGE_EVEN_ALLOC=yes 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@160 -- # setup output 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:46.735 15:42:52 setup.sh.hugepages.odd_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:49.270 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:49.529 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:49.529 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:49.529 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:49.529 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:49.529 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:49.529 0000:00:04.3 (8086 2021): Already using the 
vfio-pci driver 00:03:49.529 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:49.529 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:49.529 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:49.529 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:49.529 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:49.530 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:49.530 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:49.530 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:49.530 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:49.530 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:49.530 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@161 -- # verify_nr_hugepages 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@89 -- # local node 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var 
val 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.792 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.793 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77544524 kB' 'MemAvailable: 80971284 kB' 'Buffers: 2696 kB' 'Cached: 10064296 kB' 'SwapCached: 0 kB' 'Active: 7055532 kB' 'Inactive: 3520928 kB' 'Active(anon): 6664488 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512176 kB' 'Mapped: 163296 kB' 'Shmem: 6155020 kB' 'KReclaimable: 222032 kB' 'Slab: 685736 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 463704 kB' 'KernelStack: 19712 kB' 'PageTables: 8204 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54000484 kB' 'Committed_AS: 8071640 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220944 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 
'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:49.793 [repetitive trace condensed: setup/common.sh@31-32 scans each /proc/meminfo field in turn with "read -r var val _", hitting "continue" for every field from MemTotal through HardwareCorrupted until the requested AnonHugePages field is reached] 00:03:49.794
15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77544276 kB' 'MemAvailable: 80971036 kB' 'Buffers: 2696 kB' 'Cached: 10064296 kB' 'SwapCached: 0 kB' 'Active: 7055232 kB' 'Inactive: 3520928 kB' 'Active(anon): 6664188 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 
kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512364 kB' 'Mapped: 163212 kB' 'Shmem: 6155020 kB' 'KReclaimable: 222032 kB' 'Slab: 685744 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 463712 kB' 'KernelStack: 19696 kB' 'PageTables: 8148 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54000484 kB' 'Committed_AS: 8071656 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220912 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.794 15:42:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:49.794 [repetitive trace condensed: the same field-by-field scan repeats for the HugePages_Surp lookup, hitting "continue" for each field from MemTotal through ShmemPmdMapped; trace truncated here]
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.795 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.795 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.795 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.795 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.795 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.795 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.795 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.795 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.795 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.795 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.795 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.795 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.795 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:49.796 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77544584 kB' 'MemAvailable: 80971344 kB' 'Buffers: 2696 kB' 'Cached: 10064316 kB' 'SwapCached: 0 kB' 'Active: 7055256 kB' 'Inactive: 3520928 kB' 'Active(anon): 6664212 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512400 kB' 'Mapped: 163212 kB' 'Shmem: 6155040 kB' 'KReclaimable: 222032 kB' 'Slab: 685744 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 463712 kB' 'KernelStack: 19712 kB' 'PageTables: 8196 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54000484 kB' 'Committed_AS: 8071676 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220912 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 
'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # 
continue 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.796 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 
15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.797 
15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.797 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1025 00:03:49.798 nr_hugepages=1025 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:49.798 resv_hugepages=0 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:49.798 surplus_hugepages=0 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:49.798 anon_hugepages=0 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@107 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@109 -- # (( 1025 == nr_hugepages )) 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node= 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ 
-e /sys/devices/system/node/node/meminfo ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77544768 kB' 'MemAvailable: 80971528 kB' 'Buffers: 2696 kB' 'Cached: 10064356 kB' 'SwapCached: 0 kB' 'Active: 7054896 kB' 'Inactive: 3520928 kB' 'Active(anon): 6663852 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511988 kB' 'Mapped: 163212 kB' 'Shmem: 6155080 kB' 'KReclaimable: 222032 kB' 'Slab: 685744 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 463712 kB' 'KernelStack: 19696 kB' 'PageTables: 8148 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54000484 kB' 'Committed_AS: 8071696 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220912 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1025' 'HugePages_Free: 1025' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2099200 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapFree == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.798 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocTotal 
== \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ 
AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 1025 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@110 -- # (( 1025 == nr_hugepages + surp + resv )) 00:03:49.799 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@27 -- # local node 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 
00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=513 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=0 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 29409384 kB' 'MemUsed: 3225244 kB' 'SwapCached: 0 kB' 'Active: 1216524 kB' 
'Inactive: 60292 kB' 'Active(anon): 1062632 kB' 'Inactive(anon): 0 kB' 'Active(file): 153892 kB' 'Inactive(file): 60292 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1143384 kB' 'Mapped: 96908 kB' 'AnonPages: 136552 kB' 'Shmem: 929200 kB' 'KernelStack: 9832 kB' 'PageTables: 3148 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 70284 kB' 'Slab: 279428 kB' 'SReclaimable: 70284 kB' 'SUnreclaim: 209144 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.060 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.060 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 
15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:50.061 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- 
setup/common.sh@17 -- # local get=HugePages_Surp 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@18 -- # local node=1 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@19 -- # local var val 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60688336 kB' 'MemFree: 48137404 kB' 'MemUsed: 12550932 kB' 'SwapCached: 0 kB' 'Active: 5838620 kB' 'Inactive: 3460636 kB' 'Active(anon): 5601468 kB' 'Inactive(anon): 0 kB' 'Active(file): 237152 kB' 'Inactive(file): 3460636 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8923672 kB' 'Mapped: 66304 kB' 'AnonPages: 375680 kB' 'Shmem: 5225884 kB' 'KernelStack: 9864 kB' 'PageTables: 5000 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 151748 kB' 'Slab: 406316 kB' 'SReclaimable: 151748 kB' 'SUnreclaim: 254568 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 513' 'HugePages_Free: 513' 'HugePages_Surp: 0' 00:03:50.062 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.062 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.062 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.063 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.063 15:42:55 
setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # continue 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # echo 0 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/common.sh@33 -- # return 0 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 513' 00:03:50.063 node0=512 expecting 513 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@128 -- # echo 'node1=513 expecting 512' 00:03:50.063 node1=513 expecting 512 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- setup/hugepages.sh@130 -- # [[ 512 513 == \5\1\2\ \5\1\3 ]] 00:03:50.063 00:03:50.063 real 0m3.324s 00:03:50.063 user 0m1.322s 00:03:50.063 sys 0m2.056s 00:03:50.063 15:42:55 
setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:03:50.063 15:42:55 setup.sh.hugepages.odd_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:50.063 ************************************ 00:03:50.063 END TEST odd_alloc 00:03:50.063 ************************************ 00:03:50.063 15:42:55 setup.sh.hugepages -- setup/hugepages.sh@214 -- # run_test custom_alloc custom_alloc 00:03:50.063 15:42:55 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:50.063 15:42:55 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:50.063 15:42:55 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:50.063 ************************************ 00:03:50.063 START TEST custom_alloc 00:03:50.063 ************************************ 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1124 -- # custom_alloc 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@167 -- # local IFS=, 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@169 -- # local node 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # nodes_hp=() 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@170 -- # local nodes_hp 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@172 -- # local nr_hugepages=0 _nr_hugepages=0 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@174 -- # get_test_nr_hugepages 1048576 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=1048576 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=512 00:03:50.063 
15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=512 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 0 > 0 )) 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 256 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 1 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@82 -- # nodes_test[_no_nodes - 1]=256 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@83 -- # : 0 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@84 -- # : 0 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@81 -- # (( _no_nodes > 0 )) 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@175 -- # nodes_hp[0]=512 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@176 -- # (( 2 > 1 )) 
00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@177 -- # get_test_nr_hugepages 2097152 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@50 -- # (( 1 > 1 )) 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 1 > 0 )) 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@178 -- # nodes_hp[1]=1024 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 
00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@181 -- # for node in "${!nodes_hp[@]}" 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@182 -- # HUGENODE+=("nodes_hp[$node]=${nodes_hp[node]}") 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@183 -- # (( _nr_hugepages += nodes_hp[node] )) 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@186 -- # get_test_nr_hugepages_per_node 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # user_nodes=() 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:50.063 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:50.064 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:50.064 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:50.064 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@69 -- # (( 0 > 0 )) 00:03:50.064 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@74 -- # (( 2 > 0 )) 00:03:50.064 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:50.064 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # nodes_test[_no_nodes]=512 00:03:50.064 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@75 -- # for _no_nodes in "${!nodes_hp[@]}" 00:03:50.064 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@76 -- # 
nodes_test[_no_nodes]=1024 00:03:50.064 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@78 -- # return 0 00:03:50.064 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # HUGENODE='nodes_hp[0]=512,nodes_hp[1]=1024' 00:03:50.064 15:42:55 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@187 -- # setup output 00:03:50.064 15:42:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:50.064 15:42:55 setup.sh.hugepages.custom_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:53.358 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:53.358 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:53.358 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:53.358 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:53.358 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:53.358 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:53.358 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:53.358 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:53.358 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:53.358 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:53.358 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:53.358 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:53.358 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:53.358 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:53.358 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:53.358 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:53.358 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:53.358 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 
-- # nr_hugepages=1536 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@188 -- # verify_nr_hugepages 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@89 -- # local node 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.358 15:42:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 76512556 kB' 'MemAvailable: 79939316 kB' 'Buffers: 2696 kB' 'Cached: 10064452 kB' 'SwapCached: 0 kB' 'Active: 7056464 kB' 'Inactive: 3520928 kB' 'Active(anon): 6665420 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 513040 kB' 'Mapped: 163312 kB' 'Shmem: 6155176 kB' 'KReclaimable: 222032 kB' 'Slab: 686000 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 463968 kB' 'KernelStack: 19696 kB' 'PageTables: 8172 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53477220 kB' 'Committed_AS: 8073240 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220960 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:53.358 15:42:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:53.358 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]]
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0
00:03:53.360
15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@97 -- # anon=0
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]]
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]]
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }")
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 76513488 kB' 'MemAvailable: 79940248 kB' 'Buffers: 2696 kB' 'Cached: 10064456 kB' 'SwapCached: 0 kB' 'Active: 7056504 kB' 'Inactive: 3520928 kB' 'Active(anon): 6665460 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 513172 kB' 'Mapped: 163348 kB' 'Shmem: 6155180 kB' 'KReclaimable: 222032 kB' 'Slab: 686000 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 463968 kB' 'KernelStack: 19648 kB' 'PageTables: 8028 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53477220 kB' 'Committed_AS: 8073740 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220832 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB'
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:53.360
15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:53.360 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]]
00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue
00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': '
00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _
00:03:53.361 15:42:58
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.361 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # 
return 0 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 76513040 kB' 'MemAvailable: 79939800 kB' 'Buffers: 2696 kB' 'Cached: 10064476 kB' 'SwapCached: 0 kB' 'Active: 7056408 kB' 'Inactive: 3520928 kB' 'Active(anon): 6665364 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512992 kB' 'Mapped: 163348 kB' 'Shmem: 6155200 kB' 'KReclaimable: 222032 kB' 'Slab: 685936 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 
463904 kB' 'KernelStack: 19664 kB' 'PageTables: 8188 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53477220 kB' 'Committed_AS: 8075352 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220960 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.362 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.363 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1536 00:03:53.364 nr_hugepages=1536 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:53.364 resv_hugepages=0 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@104 -- 
# echo surplus_hugepages=0 00:03:53.364 surplus_hugepages=0 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:53.364 anon_hugepages=0 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@107 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@109 -- # (( 1536 == nr_hugepages )) 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node= 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 76513192 kB' 'MemAvailable: 79939952 kB' 'Buffers: 2696 kB' 'Cached: 10064496 kB' 'SwapCached: 0 kB' 'Active: 7056780 kB' 'Inactive: 3520928 kB' 'Active(anon): 6665736 kB' 'Inactive(anon): 0 kB' 'Active(file): 
391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 513316 kB' 'Mapped: 163356 kB' 'Shmem: 6155220 kB' 'KReclaimable: 222032 kB' 'Slab: 685936 kB' 'SReclaimable: 222032 kB' 'SUnreclaim: 463904 kB' 'KernelStack: 19920 kB' 'PageTables: 8424 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 53477220 kB' 'Committed_AS: 8073776 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 221008 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1536' 'HugePages_Free: 1536' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 3145728 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.364 15:42:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.364 15:42:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.364 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ 
ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.365 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 
15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 1536 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@110 -- # (( 1536 == nr_hugepages + surp + resv )) 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@27 -- # local node 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=512 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- 
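The trace above is `get_meminfo HugePages_Total` walking every key in `/proc/meminfo` until it hits the one requested, then echoing its value (1536 here). A minimal, self-contained sketch of that lookup pattern follows; the function name and the `MEMINFO_F` override are illustrative for testability only, they are not SPDK's actual `setup/common.sh` interface.

```shell
# Sketch of the meminfo lookup the trace exercises: read the file, strip the
# "Node <N> " prefix that per-node files carry, split each line on ': ', and
# echo the value once the requested key matches.
shopt -s extglob  # needed for the +([0-9]) prefix-stripping pattern below

get_meminfo_sketch() {
    local get=$1 mem line var val _
    # MEMINFO_F is our hypothetical override hook; the real script switches to
    # /sys/devices/system/node/node<N>/meminfo when a node argument is given.
    mapfile -t mem <"${MEMINFO_F:-/proc/meminfo}"
    # Per-node files prefix every line with "Node <N> "; strip it as the trace does.
    mem=("${mem[@]#Node +([0-9]) }")
    for line in "${mem[@]}"; do
        IFS=': ' read -r var val _ <<<"$line"
        [[ $var == "$get" ]] && { echo "$val"; return 0; }
    done
    return 1
}
```

On the box in this log, `get_meminfo_sketch HugePages_Total` would print `1536`, matching the `echo 1536` / `return 0` at the end of the loop above.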
setup/hugepages.sh@32 -- # no_nodes=2 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=0 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 29423632 kB' 'MemUsed: 3210996 kB' 'SwapCached: 0 kB' 'Active: 1217012 kB' 'Inactive: 60292 kB' 'Active(anon): 1063120 kB' 'Inactive(anon): 0 kB' 'Active(file): 153892 kB' 'Inactive(file): 60292 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1143504 kB' 'Mapped: 97000 kB' 'AnonPages: 136488 
kB' 'Shmem: 929320 kB' 'KernelStack: 9864 kB' 'PageTables: 3228 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 70284 kB' 'Slab: 279604 kB' 'SReclaimable: 70284 kB' 'SUnreclaim: 209320 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 512' 'HugePages_Free: 512' 'HugePages_Surp: 0' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.366 
15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.366 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 
15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:53.367 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:53.627 
15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 1 00:03:53.627 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:53.627 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@18 -- # local node=1 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@19 -- # local var val 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node1/meminfo ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node1/meminfo 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 60688336 kB' 'MemFree: 47087004 kB' 'MemUsed: 13601332 kB' 'SwapCached: 0 kB' 'Active: 5840160 kB' 'Inactive: 3460636 kB' 'Active(anon): 5603008 kB' 'Inactive(anon): 0 kB' 'Active(file): 237152 kB' 'Inactive(file): 3460636 kB' 'Unevictable: 0 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 8923708 kB' 'Mapped: 66348 kB' 'AnonPages: 377204 kB' 'Shmem: 5225920 kB' 'KernelStack: 10024 kB' 'PageTables: 5700 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 151748 kB' 'Slab: 406300 kB' 'SReclaimable: 151748 kB' 'SUnreclaim: 254552 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 
'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 
00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePages == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- 
setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.628 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.629 15:42:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.629 15:42:58 
setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # continue 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # echo 0 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/common.sh@33 -- # return 0 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node0=512 expecting 512' 00:03:53.629 node0=512 expecting 512 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@126 -- 
# for node in "${!nodes_test[@]}" 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@128 -- # echo 'node1=1024 expecting 1024' 00:03:53.629 node1=1024 expecting 1024 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- setup/hugepages.sh@130 -- # [[ 512,1024 == \5\1\2\,\1\0\2\4 ]] 00:03:53.629 00:03:53.629 real 0m3.469s 00:03:53.629 user 0m1.423s 00:03:53.629 sys 0m2.109s 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:03:53.629 15:42:58 setup.sh.hugepages.custom_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:53.629 ************************************ 00:03:53.629 END TEST custom_alloc 00:03:53.629 ************************************ 00:03:53.629 15:42:58 setup.sh.hugepages -- setup/hugepages.sh@215 -- # run_test no_shrink_alloc no_shrink_alloc 00:03:53.629 15:42:58 setup.sh.hugepages -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:03:53.629 15:42:58 setup.sh.hugepages -- common/autotest_common.sh@1106 -- # xtrace_disable 00:03:53.629 15:42:58 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:53.629 ************************************ 00:03:53.629 START TEST no_shrink_alloc 00:03:53.629 ************************************ 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1124 -- # no_shrink_alloc 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@195 -- # get_test_nr_hugepages 2097152 0 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@49 -- # local size=2097152 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@50 -- # (( 2 > 1 )) 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- 
setup/hugepages.sh@51 -- # shift 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # node_ids=('0') 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@52 -- # local node_ids 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@55 -- # (( size >= default_hugepages )) 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@57 -- # nr_hugepages=1024 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@58 -- # get_test_nr_hugepages_per_node 0 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # user_nodes=('0') 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@62 -- # local user_nodes 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@64 -- # local _nr_hugepages=1024 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@65 -- # local _no_nodes=2 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # nodes_test=() 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@67 -- # local -g nodes_test 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@69 -- # (( 1 > 0 )) 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@70 -- # for _no_nodes in "${user_nodes[@]}" 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@71 -- # nodes_test[_no_nodes]=1024 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@73 -- # return 0 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@198 -- # setup output 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:53.629 15:42:58 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:56.201 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:56.464 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:56.464 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:56.464 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:56.464 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:56.464 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:56.464 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:56.464 0000:00:04.2 (8086 2021): Already using the vfio-pci driver 00:03:56.464 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:56.464 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:56.464 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:56.464 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:56.464 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:56.464 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:56.464 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:56.464 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:56.464 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:56.464 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@199 -- # verify_nr_hugepages 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 
00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77576644 kB' 'MemAvailable: 81003388 kB' 'Buffers: 2696 kB' 'Cached: 10064612 kB' 'SwapCached: 0 kB' 'Active: 7055204 kB' 'Inactive: 3520928 kB' 'Active(anon): 6664160 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 
kB' 'Writeback: 0 kB' 'AnonPages: 511592 kB' 'Mapped: 163308 kB' 'Shmem: 6155336 kB' 'KReclaimable: 222000 kB' 'Slab: 686480 kB' 'SReclaimable: 222000 kB' 'SUnreclaim: 464480 kB' 'KernelStack: 19760 kB' 'PageTables: 8324 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8073024 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220912 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == 
\A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.464 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 
15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 
15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:56.465 15:43:01 
setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.465 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77576276 kB' 'MemAvailable: 81003020 kB' 'Buffers: 2696 kB' 'Cached: 10064616 kB' 'SwapCached: 0 kB' 'Active: 7054880 kB' 'Inactive: 3520928 kB' 'Active(anon): 6663836 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511804 kB' 'Mapped: 163292 kB' 'Shmem: 6155340 kB' 'KReclaimable: 222000 kB' 'Slab: 686560 kB' 'SReclaimable: 222000 kB' 
'SUnreclaim: 464560 kB' 'KernelStack: 19760 kB' 'PageTables: 8336 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8073044 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220880 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == 
\H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.466 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.467 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77575268 kB' 'MemAvailable: 81002012 kB' 'Buffers: 2696 kB' 'Cached: 10064632 kB' 'SwapCached: 0 kB' 'Active: 7054916 kB' 'Inactive: 3520928 kB' 'Active(anon): 6663872 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511808 kB' 'Mapped: 163292 kB' 'Shmem: 6155356 kB' 'KReclaimable: 222000 kB' 'Slab: 686560 kB' 'SReclaimable: 222000 kB' 'SUnreclaim: 464560 kB' 'KernelStack: 19760 kB' 'PageTables: 8336 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8073064 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220880 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 
15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 
15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 
15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.468 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.729 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.729 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.729 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.729 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.729 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.729 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.729 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.729 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.729 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.729 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.729 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.729 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.729 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 
00:03:56.729 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.729 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.729 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.729 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.729 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 
15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:56.730 nr_hugepages=1024 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:56.730 resv_hugepages=0 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:56.730 surplus_hugepages=0 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:56.730 anon_hugepages=0 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 
00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.730 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77575268 kB' 'MemAvailable: 81002012 kB' 'Buffers: 2696 kB' 'Cached: 10064672 kB' 'SwapCached: 0 kB' 'Active: 7054580 kB' 'Inactive: 3520928 kB' 'Active(anon): 6663536 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511400 kB' 'Mapped: 163292 kB' 'Shmem: 6155396 kB' 'KReclaimable: 222000 kB' 'Slab: 686560 kB' 'SReclaimable: 222000 kB' 'SUnreclaim: 464560 kB' 'KernelStack: 19744 kB' 'PageTables: 8288 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8073088 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220880 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 
2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:01 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:01 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.731 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- 
# read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # 
no_nodes=2 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node=0 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:56.732 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 28395660 kB' 'MemUsed: 4238968 kB' 'SwapCached: 0 kB' 'Active: 1215420 kB' 'Inactive: 60292 kB' 'Active(anon): 1061528 kB' 'Inactive(anon): 0 kB' 'Active(file): 153892 kB' 'Inactive(file): 60292 kB' 'Unevictable: 
3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1143556 kB' 'Mapped: 96928 kB' 'AnonPages: 135268 kB' 'Shmem: 929372 kB' 'KernelStack: 9880 kB' 'PageTables: 3240 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 70252 kB' 'Slab: 280252 kB' 'SReclaimable: 70252 kB' 'SUnreclaim: 210000 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 
15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 
15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.733 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.734 15:43:02 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@33 -- # echo 0 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 'node0=1024 expecting 1024' 00:03:56.734 node0=1024 expecting 1024 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # CLEAR_HUGE=no 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # NRHUGE=512 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@202 -- # setup output 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@9 -- # [[ output == output ]] 00:03:56.734 15:43:02 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:03:59.270 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:03:59.529 0000:00:04.7 (8086 2021): Already using the vfio-pci driver 00:03:59.529 0000:5e:00.0 (8086 0a54): Already using the vfio-pci driver 00:03:59.529 0000:00:04.6 (8086 2021): Already using the vfio-pci driver 00:03:59.529 0000:00:04.5 (8086 2021): Already using the vfio-pci driver 00:03:59.529 0000:00:04.4 (8086 2021): Already using the vfio-pci driver 00:03:59.529 0000:00:04.3 (8086 2021): Already using the vfio-pci driver 00:03:59.529 0000:00:04.2 (8086 2021): Already using the 
vfio-pci driver 00:03:59.529 0000:00:04.1 (8086 2021): Already using the vfio-pci driver 00:03:59.529 0000:00:04.0 (8086 2021): Already using the vfio-pci driver 00:03:59.529 0000:80:04.7 (8086 2021): Already using the vfio-pci driver 00:03:59.529 0000:80:04.6 (8086 2021): Already using the vfio-pci driver 00:03:59.529 0000:80:04.5 (8086 2021): Already using the vfio-pci driver 00:03:59.529 0000:80:04.4 (8086 2021): Already using the vfio-pci driver 00:03:59.529 0000:80:04.3 (8086 2021): Already using the vfio-pci driver 00:03:59.529 0000:80:04.2 (8086 2021): Already using the vfio-pci driver 00:03:59.529 0000:80:04.1 (8086 2021): Already using the vfio-pci driver 00:03:59.529 0000:80:04.0 (8086 2021): Already using the vfio-pci driver 00:03:59.529 INFO: Requested 512 hugepages but 1024 already allocated on node0 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@204 -- # verify_nr_hugepages 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@89 -- # local node 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@90 -- # local sorted_t 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@91 -- # local sorted_s 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@92 -- # local surp 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@93 -- # local resv 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@94 -- # local anon 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@96 -- # [[ always [madvise] never != *\[\n\e\v\e\r\]* ]] 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # get_meminfo AnonHugePages 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=AnonHugePages 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:59.793 
15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77580312 kB' 'MemAvailable: 81007056 kB' 'Buffers: 2696 kB' 'Cached: 10064736 kB' 'SwapCached: 0 kB' 'Active: 7055032 kB' 'Inactive: 3520928 kB' 'Active(anon): 6663988 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511724 kB' 'Mapped: 163236 kB' 'Shmem: 6155460 kB' 'KReclaimable: 222000 kB' 'Slab: 685708 kB' 'SReclaimable: 222000 kB' 'SUnreclaim: 463708 kB' 'KernelStack: 19712 kB' 'PageTables: 8132 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8073452 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220896 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 
1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.793 15:43:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.793 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 
15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@32 -- # [[ KReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.794 15:43:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.794 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 
00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \A\n\o\n\H\u\g\e\P\a\g\e\s ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@97 -- # anon=0 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # get_meminfo HugePages_Surp 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:59.795 15:43:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77580524 kB' 'MemAvailable: 81007268 kB' 'Buffers: 2696 kB' 'Cached: 10064740 kB' 'SwapCached: 0 kB' 'Active: 7055216 kB' 'Inactive: 3520928 kB' 'Active(anon): 6664172 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511928 kB' 'Mapped: 163236 kB' 'Shmem: 6155464 kB' 'KReclaimable: 222000 kB' 'Slab: 685764 kB' 'SReclaimable: 222000 kB' 'SUnreclaim: 463764 kB' 'KernelStack: 19696 kB' 'PageTables: 8084 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8073604 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220880 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 
'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 
00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.795 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 
00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p 
]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 
15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped 
== \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.796 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r 
var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@33 -- # return 0 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@99 -- # surp=0 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # get_meminfo HugePages_Rsvd 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Rsvd 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77580016 kB' 'MemAvailable: 81006760 kB' 'Buffers: 2696 kB' 'Cached: 10064756 kB' 'SwapCached: 0 kB' 'Active: 7055140 kB' 'Inactive: 3520928 kB' 'Active(anon): 6664096 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 511840 kB' 'Mapped: 163180 kB' 'Shmem: 6155480 kB' 'KReclaimable: 
222000 kB' 'Slab: 685764 kB' 'SReclaimable: 222000 kB' 'SUnreclaim: 463764 kB' 'KernelStack: 19696 kB' 'PageTables: 8072 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8073628 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220848 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.797 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 
00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 
-- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed 
== \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.798 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Rsvd == \H\u\g\e\P\a\g\e\s\_\R\s\v\d ]] 00:03:59.799 15:43:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@100 -- # resv=0 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@102 -- # echo nr_hugepages=1024 00:03:59.799 nr_hugepages=1024 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@103 -- # echo resv_hugepages=0 00:03:59.799 resv_hugepages=0 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@104 -- # echo surplus_hugepages=0 00:03:59.799 surplus_hugepages=0 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@105 -- # echo anon_hugepages=0 00:03:59.799 anon_hugepages=0 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@107 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@109 -- # (( 1024 == nr_hugepages )) 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # get_meminfo HugePages_Total 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Total 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@18 -- # local node= 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node/meminfo ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@25 -- # [[ -n '' ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 
-- # mapfile -t mem 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 93322964 kB' 'MemFree: 77580856 kB' 'MemAvailable: 81007600 kB' 'Buffers: 2696 kB' 'Cached: 10064792 kB' 'SwapCached: 0 kB' 'Active: 7055464 kB' 'Inactive: 3520928 kB' 'Active(anon): 6664420 kB' 'Inactive(anon): 0 kB' 'Active(file): 391044 kB' 'Inactive(file): 3520928 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'SwapTotal: 8388604 kB' 'SwapFree: 8388604 kB' 'Zswap: 0 kB' 'Zswapped: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'AnonPages: 512236 kB' 'Mapped: 163180 kB' 'Shmem: 6155516 kB' 'KReclaimable: 222000 kB' 'Slab: 685760 kB' 'SReclaimable: 222000 kB' 'SUnreclaim: 463760 kB' 'KernelStack: 19744 kB' 'PageTables: 8284 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'CommitLimit: 54001508 kB' 'Committed_AS: 8074020 kB' 'VmallocTotal: 34359738367 kB' 'VmallocUsed: 220848 kB' 'VmallocChunk: 0 kB' 'Percpu: 66432 kB' 'HardwareCorrupted: 0 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'CmaTotal: 0 kB' 'CmaFree: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Rsvd: 0' 'HugePages_Surp: 0' 'Hugepagesize: 2048 kB' 'Hugetlb: 2097152 kB' 'DirectMap4k: 2106324 kB' 'DirectMap2M: 14350336 kB' 'DirectMap1G: 85983232 kB' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemAvailable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Buffers == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Cached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.799 15:43:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.799 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswap == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Zswapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # 
continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == 
\H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ PageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # 
read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CommitLimit == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Committed_AS == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocUsed == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ VmallocChunk == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Percpu == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HardwareCorrupted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.800 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaTotal == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ CmaFree == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 
00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\T\o\t\a\l ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 1024 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@110 -- # (( 1024 == nr_hugepages + surp + resv )) 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@112 -- # get_nodes 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@27 -- # local node 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=1024 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@29 -- # for node in /sys/devices/system/node/node+([0-9]) 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@30 -- # nodes_sys[${node##*node}]=0 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@32 -- # no_nodes=2 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@33 -- # (( no_nodes > 0 )) 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@115 -- # for node in "${!nodes_test[@]}" 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@116 -- # (( nodes_test[node] += resv )) 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # get_meminfo HugePages_Surp 0 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@17 -- # local get=HugePages_Surp 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@18 -- # local node=0 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@19 -- # local var val 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@20 -- # local mem_f mem 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@22 -- # mem_f=/proc/meminfo 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@23 -- # [[ -e /sys/devices/system/node/node0/meminfo ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@24 -- # mem_f=/sys/devices/system/node/node0/meminfo 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@28 -- # mapfile -t mem 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@29 -- # mem=("${mem[@]#Node +([0-9]) }") 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@16 -- # printf '%s\n' 'MemTotal: 32634628 kB' 'MemFree: 28398156 kB' 'MemUsed: 4236472 kB' 'SwapCached: 0 kB' 'Active: 1215520 kB' 'Inactive: 60292 kB' 'Active(anon): 1061628 kB' 'Inactive(anon): 0 kB' 'Active(file): 153892 kB' 'Inactive(file): 60292 kB' 'Unevictable: 3072 kB' 'Mlocked: 0 kB' 'Dirty: 0 kB' 'Writeback: 0 kB' 'FilePages: 1143624 kB' 'Mapped: 96936 kB' 'AnonPages: 135328 kB' 'Shmem: 929440 kB' 'KernelStack: 9848 kB' 'PageTables: 3152 kB' 'SecPageTables: 0 kB' 'NFS_Unstable: 0 kB' 'Bounce: 0 kB' 'WritebackTmp: 0 kB' 'KReclaimable: 70252 kB' 'Slab: 280128 kB' 'SReclaimable: 70252 kB' 'SUnreclaim: 209876 kB' 'AnonHugePages: 0 kB' 'ShmemHugePages: 0 kB' 'ShmemPmdMapped: 0 kB' 'FileHugePages: 0 kB' 'FilePmdMapped: 0 kB' 'Unaccepted: 0 kB' 'HugePages_Total: 1024' 'HugePages_Free: 1024' 'HugePages_Surp: 0' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
MemTotal == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemFree == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ MemUsed == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SwapCached == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var 
val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(anon) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Active(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Inactive(file) == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unevictable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mlocked == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Dirty == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.801 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Writeback == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc 
-- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Mapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonPages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Shmem == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KernelStack == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ 
PageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SecPageTables == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ NFS_Unstable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Bounce == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ WritebackTmp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 
-- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ KReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Slab == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SReclaimable == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ SUnreclaim == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ AnonHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- 
setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ ShmemPmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FileHugePages == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ FilePmdMapped == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ Unaccepted == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 
setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Total == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Free == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # continue 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # IFS=': ' 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@31 -- # read -r var val _ 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@32 -- # [[ HugePages_Surp == \H\u\g\e\P\a\g\e\s\_\S\u\r\p ]] 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # echo 0 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/common.sh@33 -- # return 0 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@117 -- # (( nodes_test[node] += 0 )) 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@126 -- # for node in "${!nodes_test[@]}" 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_t[nodes_test[node]]=1 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@127 -- # sorted_s[nodes_sys[node]]=1 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@128 -- # echo 
'node0=1024 expecting 1024' 00:03:59.802 node0=1024 expecting 1024 00:03:59.802 15:43:05 setup.sh.hugepages.no_shrink_alloc -- setup/hugepages.sh@130 -- # [[ 1024 == \1\0\2\4 ]] 00:03:59.802 00:03:59.803 real 0m6.293s 00:03:59.803 user 0m2.442s 00:03:59.803 sys 0m3.778s 00:03:59.803 15:43:05 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:03:59.803 15:43:05 setup.sh.hugepages.no_shrink_alloc -- common/autotest_common.sh@10 -- # set +x 00:03:59.803 ************************************ 00:03:59.803 END TEST no_shrink_alloc 00:03:59.803 ************************************ 00:03:59.803 15:43:05 setup.sh.hugepages -- setup/hugepages.sh@217 -- # clear_hp 00:03:59.803 15:43:05 setup.sh.hugepages -- setup/hugepages.sh@37 -- # local node hp 00:03:59.803 15:43:05 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:59.803 15:43:05 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:59.803 15:43:05 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:59.803 15:43:05 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:59.803 15:43:05 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:59.803 15:43:05 setup.sh.hugepages -- setup/hugepages.sh@39 -- # for node in "${!nodes_sys[@]}" 00:03:59.803 15:43:05 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:59.803 15:43:05 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:59.803 15:43:05 setup.sh.hugepages -- setup/hugepages.sh@40 -- # for hp in "/sys/devices/system/node/node$node/hugepages/hugepages-"* 00:03:59.803 15:43:05 setup.sh.hugepages -- setup/hugepages.sh@41 -- # echo 0 00:03:59.803 15:43:05 setup.sh.hugepages -- setup/hugepages.sh@45 -- # export CLEAR_HUGE=yes 00:03:59.803 15:43:05 setup.sh.hugepages -- 
setup/hugepages.sh@45 -- # CLEAR_HUGE=yes 00:03:59.803 00:03:59.803 real 0m25.074s 00:03:59.803 user 0m9.782s 00:03:59.803 sys 0m14.794s 00:03:59.803 15:43:05 setup.sh.hugepages -- common/autotest_common.sh@1125 -- # xtrace_disable 00:03:59.803 15:43:05 setup.sh.hugepages -- common/autotest_common.sh@10 -- # set +x 00:03:59.803 ************************************ 00:03:59.803 END TEST hugepages 00:03:59.803 ************************************ 00:04:00.062 15:43:05 setup.sh -- setup/test-setup.sh@14 -- # run_test driver /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:00.062 15:43:05 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:04:00.062 15:43:05 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable 00:04:00.062 15:43:05 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:00.062 ************************************ 00:04:00.062 START TEST driver 00:04:00.062 ************************************ 00:04:00.062 15:43:05 setup.sh.driver -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/driver.sh 00:04:00.062 * Looking for test storage... 
00:04:00.062 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:00.062 15:43:05 setup.sh.driver -- setup/driver.sh@68 -- # setup reset 00:04:00.062 15:43:05 setup.sh.driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:00.062 15:43:05 setup.sh.driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:04.269 15:43:09 setup.sh.driver -- setup/driver.sh@69 -- # run_test guess_driver guess_driver 00:04:04.269 15:43:09 setup.sh.driver -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:04:04.269 15:43:09 setup.sh.driver -- common/autotest_common.sh@1106 -- # xtrace_disable 00:04:04.269 15:43:09 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:04.269 ************************************ 00:04:04.269 START TEST guess_driver 00:04:04.269 ************************************ 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- common/autotest_common.sh@1124 -- # guess_driver 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@46 -- # local driver setup_driver marker 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@47 -- # local fail=0 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # pick_driver 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@36 -- # vfio 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@21 -- # local iommu_grups 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@22 -- # local unsafe_vfio 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@24 -- # [[ -e /sys/module/vfio/parameters/enable_unsafe_noiommu_mode ]] 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@25 -- # unsafe_vfio=N 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@27 -- # iommu_groups=(/sys/kernel/iommu_groups/*) 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@29 -- 
# (( 220 > 0 )) 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # is_driver vfio_pci 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@14 -- # mod vfio_pci 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # dep vfio_pci 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@11 -- # modprobe --show-depends vfio_pci 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@12 -- # [[ insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/virt/lib/irqbypass.ko.xz 00:04:04.269 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:04.269 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:04.269 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/iommu/iommufd/iommufd.ko.xz 00:04:04.269 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio.ko.xz 00:04:04.269 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/vfio_iommu_type1.ko.xz 00:04:04.269 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci-core.ko.xz 00:04:04.269 insmod /lib/modules/6.7.0-68.fc38.x86_64/kernel/drivers/vfio/pci/vfio-pci.ko.xz == *\.\k\o* ]] 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@30 -- # return 0 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@37 -- # echo vfio-pci 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@49 -- # driver=vfio-pci 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@51 -- # [[ vfio-pci == \N\o\ \v\a\l\i\d\ \d\r\i\v\e\r\ \f\o\u\n\d ]] 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@56 -- # echo 'Looking for driver=vfio-pci' 00:04:04.269 Looking for driver=vfio-pci 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/driver.sh@45 -- # setup output 
config 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ output == output ]] 00:04:04.269 15:43:09 setup.sh.driver.guess_driver -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ denied == \-\> ]] 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # continue 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci 
== vfio-pci ]] 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:07.559 15:43:12 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:07.559 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:07.559 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:07.559 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:07.559 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:07.559 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:07.559 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:07.559 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:07.559 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:07.559 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker 
setup_driver 00:04:07.559 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:07.559 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:07.559 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:07.559 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:07.559 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:07.559 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:07.818 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:07.819 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:07.819 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:07.819 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:07.819 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:07.819 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:07.819 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:07.819 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:07.819 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:08.387 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@58 -- # [[ -> == \-\> ]] 00:04:08.387 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@61 -- # [[ vfio-pci == vfio-pci ]] 00:04:08.387 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@57 -- # read -r _ _ _ _ marker setup_driver 00:04:08.646 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@64 -- # (( fail == 0 )) 
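The guess_driver trace above reduces to a two-part check: the kernel must expose IOMMU groups (driver.sh@29, `(( 220 > 0 ))` on this node) and `modprobe --show-depends vfio_pci` must resolve to loadable `.ko` modules (driver.sh@30, the `== *\.\k\o*` test). A hedged sketch of that selection logic; `pick_driver` here is reconstructed from the trace, not the SPDK implementation:

```shell
#!/usr/bin/env bash
# Reconstructed sketch of the vfio-pci probe traced above; an illustration of
# the logic, not the setup/driver.sh source.
pick_driver() {
    local groups
    shopt -s nullglob
    groups=(/sys/kernel/iommu_groups/*)   # driver.sh@27 in the trace
    shopt -u nullglob
    # driver.sh@29/@30: IOMMU groups present and vfio_pci resolvable to .ko files
    if ((${#groups[@]} > 0)) \
        && modprobe --show-depends vfio_pci 2>/dev/null | grep -q '\.ko'; then
        echo vfio-pci
    else
        echo 'No valid driver found'
    fi
}

pick_driver
```

When the check fails, the harness falls back to the `No valid driver found` branch tested at driver.sh@51 in the trace; here it found 220 IOMMU groups and selected vfio-pci.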
00:04:08.646 15:43:13 setup.sh.driver.guess_driver -- setup/driver.sh@65 -- # setup reset 00:04:08.646 15:43:13 setup.sh.driver.guess_driver -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:08.646 15:43:13 setup.sh.driver.guess_driver -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:13.920 00:04:13.920 real 0m8.938s 00:04:13.920 user 0m2.763s 00:04:13.920 sys 0m4.624s 00:04:13.920 15:43:18 setup.sh.driver.guess_driver -- common/autotest_common.sh@1125 -- # xtrace_disable 00:04:13.920 15:43:18 setup.sh.driver.guess_driver -- common/autotest_common.sh@10 -- # set +x 00:04:13.920 ************************************ 00:04:13.920 END TEST guess_driver 00:04:13.920 ************************************ 00:04:13.920 00:04:13.920 real 0m13.271s 00:04:13.920 user 0m3.962s 00:04:13.920 sys 0m6.926s 00:04:13.920 15:43:18 setup.sh.driver -- common/autotest_common.sh@1125 -- # xtrace_disable 00:04:13.920 15:43:18 setup.sh.driver -- common/autotest_common.sh@10 -- # set +x 00:04:13.920 ************************************ 00:04:13.920 END TEST driver 00:04:13.920 ************************************ 00:04:13.920 15:43:18 setup.sh -- setup/test-setup.sh@15 -- # run_test devices /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:13.920 15:43:18 setup.sh -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:04:13.920 15:43:18 setup.sh -- common/autotest_common.sh@1106 -- # xtrace_disable 00:04:13.920 15:43:18 setup.sh -- common/autotest_common.sh@10 -- # set +x 00:04:13.920 ************************************ 00:04:13.920 START TEST devices 00:04:13.920 ************************************ 00:04:13.920 15:43:18 setup.sh.devices -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/devices.sh 00:04:13.920 * Looking for test storage... 
00:04:13.920 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup 00:04:13.920 15:43:18 setup.sh.devices -- setup/devices.sh@190 -- # trap cleanup EXIT 00:04:13.920 15:43:18 setup.sh.devices -- setup/devices.sh@192 -- # setup reset 00:04:13.920 15:43:18 setup.sh.devices -- setup/common.sh@9 -- # [[ reset == output ]] 00:04:13.920 15:43:18 setup.sh.devices -- setup/common.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:17.211 15:43:22 setup.sh.devices -- setup/devices.sh@194 -- # get_zoned_devs 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1668 -- # zoned_devs=() 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1668 -- # local -gA zoned_devs 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1669 -- # local nvme bdf 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme* 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1672 -- # is_block_zoned nvme0n1 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1661 -- # local device=nvme0n1 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme0n1/queue/zoned ]] 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ none != none ]] 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme* 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1672 -- # is_block_zoned nvme0n2 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1661 -- # local device=nvme0n2 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme0n2/queue/zoned ]] 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ host-managed != none ]] 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1673 -- # 
zoned_devs["${nvme##*/}"]=0000:5f:00.0 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1671 -- # for nvme in /sys/block/nvme* 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1672 -- # is_block_zoned nvme1n1 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1661 -- # local device=nvme1n1 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1663 -- # [[ -e /sys/block/nvme1n1/queue/zoned ]] 00:04:17.211 15:43:22 setup.sh.devices -- common/autotest_common.sh@1664 -- # [[ none != none ]] 00:04:17.211 15:43:22 setup.sh.devices -- setup/devices.sh@196 -- # blocks=() 00:04:17.211 15:43:22 setup.sh.devices -- setup/devices.sh@196 -- # declare -a blocks 00:04:17.211 15:43:22 setup.sh.devices -- setup/devices.sh@197 -- # blocks_to_pci=() 00:04:17.211 15:43:22 setup.sh.devices -- setup/devices.sh@197 -- # declare -A blocks_to_pci 00:04:17.211 15:43:22 setup.sh.devices -- setup/devices.sh@198 -- # min_disk_size=3221225472 00:04:17.211 15:43:22 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:17.211 15:43:22 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n1 00:04:17.211 15:43:22 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:17.211 15:43:22 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5f:00.0 00:04:17.211 15:43:22 setup.sh.devices -- setup/devices.sh@203 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\f\:\0\0\.\0* ]] 00:04:17.211 15:43:22 setup.sh.devices -- setup/devices.sh@203 -- # continue 00:04:17.212 15:43:22 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:17.212 15:43:22 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0n2 00:04:17.212 15:43:22 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme0 00:04:17.212 15:43:22 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5f:00.0 00:04:17.212 15:43:22 setup.sh.devices -- setup/devices.sh@203 -- # [[ 0000:5f:00.0 == 
*\0\0\0\0\:\5\f\:\0\0\.\0* ]] 00:04:17.212 15:43:22 setup.sh.devices -- setup/devices.sh@203 -- # continue 00:04:17.212 15:43:22 setup.sh.devices -- setup/devices.sh@200 -- # for block in "/sys/block/nvme"!(*c*) 00:04:17.212 15:43:22 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme1n1 00:04:17.212 15:43:22 setup.sh.devices -- setup/devices.sh@201 -- # ctrl=nvme1 00:04:17.212 15:43:22 setup.sh.devices -- setup/devices.sh@202 -- # pci=0000:5e:00.0 00:04:17.212 15:43:22 setup.sh.devices -- setup/devices.sh@203 -- # [[ 0000:5f:00.0 == *\0\0\0\0\:\5\e\:\0\0\.\0* ]] 00:04:17.212 15:43:22 setup.sh.devices -- setup/devices.sh@204 -- # block_in_use nvme1n1 00:04:17.212 15:43:22 setup.sh.devices -- scripts/common.sh@378 -- # local block=nvme1n1 pt 00:04:17.212 15:43:22 setup.sh.devices -- scripts/common.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/spdk-gpt.py nvme1n1 00:04:17.212 No valid GPT data, bailing 00:04:17.212 15:43:22 setup.sh.devices -- scripts/common.sh@391 -- # blkid -s PTTYPE -o value /dev/nvme1n1 00:04:17.212 15:43:22 setup.sh.devices -- scripts/common.sh@391 -- # pt= 00:04:17.212 15:43:22 setup.sh.devices -- scripts/common.sh@392 -- # return 1 00:04:17.212 15:43:22 setup.sh.devices -- setup/devices.sh@204 -- # sec_size_to_bytes nvme1n1 00:04:17.212 15:43:22 setup.sh.devices -- setup/common.sh@76 -- # local dev=nvme1n1 00:04:17.212 15:43:22 setup.sh.devices -- setup/common.sh@78 -- # [[ -e /sys/block/nvme1n1 ]] 00:04:17.212 15:43:22 setup.sh.devices -- setup/common.sh@80 -- # echo 1000204886016 00:04:17.212 15:43:22 setup.sh.devices -- setup/devices.sh@204 -- # (( 1000204886016 >= min_disk_size )) 00:04:17.212 15:43:22 setup.sh.devices -- setup/devices.sh@205 -- # blocks+=("${block##*/}") 00:04:17.212 15:43:22 setup.sh.devices -- setup/devices.sh@206 -- # blocks_to_pci["${block##*/}"]=0000:5e:00.0 00:04:17.212 15:43:22 setup.sh.devices -- setup/devices.sh@209 -- # (( 1 > 0 )) 00:04:17.212 15:43:22 setup.sh.devices -- 
setup/devices.sh@211 -- # declare -r test_disk=nvme1n1 00:04:17.212 15:43:22 setup.sh.devices -- setup/devices.sh@213 -- # run_test nvme_mount nvme_mount 00:04:17.212 15:43:22 setup.sh.devices -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:04:17.212 15:43:22 setup.sh.devices -- common/autotest_common.sh@1106 -- # xtrace_disable 00:04:17.212 15:43:22 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:17.212 ************************************ 00:04:17.212 START TEST nvme_mount 00:04:17.212 ************************************ 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1124 -- # nvme_mount 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- setup/devices.sh@95 -- # nvme_disk=nvme1n1 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- setup/devices.sh@96 -- # nvme_disk_p=nvme1n1p1 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- setup/devices.sh@97 -- # nvme_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- setup/devices.sh@98 -- # nvme_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- setup/devices.sh@101 -- # partition_drive nvme1n1 1 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- setup/common.sh@39 -- # local disk=nvme1n1 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- setup/common.sh@40 -- # local part_no=1 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # parts=() 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- setup/common.sh@44 -- # local parts 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:17.212 15:43:22 
setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme1n1 --zap-all 00:04:17.212 15:43:22 setup.sh.devices.nvme_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme1n1p1 00:04:17.780 Creating new GPT entries in memory. 00:04:17.780 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:17.780 other utilities. 00:04:17.780 15:43:23 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:17.780 15:43:23 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:17.780 15:43:23 setup.sh.devices.nvme_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:17.780 15:43:23 setup.sh.devices.nvme_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:17.780 15:43:23 setup.sh.devices.nvme_mount -- setup/common.sh@60 -- # flock /dev/nvme1n1 sgdisk /dev/nvme1n1 --new=1:2048:2099199 00:04:18.719 Creating new GPT entries in memory. 00:04:18.719 The operation has completed successfully. 
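The sgdisk call above (`--new=1:2048:2099199`) follows directly from the sector arithmetic in common.sh@51 and @58-59: the 1 GiB partition size is converted to 512-byte sectors, the first partition starts at sector 2048, and the end sector is inclusive. The arithmetic reproduced as a standalone sketch:

```shell
# 1 GiB partition expressed in 512-byte sectors, starting at sector 2048.
size=1073741824                            # bytes (common.sh@41)
(( size /= 512 ))                          # 2097152 sectors (common.sh@51)
(( part_start = 2048 ))                    # default aligned start sector
(( part_end = part_start + size - 1 ))     # inclusive end sector
echo "--new=1:${part_start}:${part_end}"   # prints: --new=1:2048:2099199
# The real script zaps the table first and serializes sgdisk with flock:
#   sgdisk /dev/nvme1n1 --zap-all
#   flock /dev/nvme1n1 sgdisk /dev/nvme1n1 "--new=1:${part_start}:${part_end}"
```

The `- 1` matters because sgdisk's `--new=<num>:<start>:<end>` takes an inclusive end sector; omitting it would make the partition one sector too large.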
00:04:18.719 15:43:24 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:18.719 15:43:24 setup.sh.devices.nvme_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:18.719 15:43:24 setup.sh.devices.nvme_mount -- setup/common.sh@62 -- # wait 2564939 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/devices.sh@102 -- # mkfs /dev/nvme1n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme1n1p1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size= 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ -e /dev/nvme1n1p1 ]] 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme1n1p1 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme1n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/devices.sh@105 -- # verify 0000:5e:00.0 nvme1n1:nvme1n1p1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme1n1:nvme1n1p1 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:18.981 
15:43:24 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:18.981 15:43:24 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:21.584 15:43:26 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.584 15:43:26 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme1n1:nvme1n1p1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\1\n\1\:\n\v\m\e\1\n\1\p\1* ]] 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.844 15:43:27 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r 
pci _ _ status 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:21.844 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.104 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:22.104 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:22.104 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:22.104 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:22.104 15:43:27 
setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:22.104 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@110 -- # cleanup_nvme 00:04:22.104 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:22.104 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@21 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:22.104 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme1n1p1 ]] 00:04:22.104 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@25 -- # wipefs --all /dev/nvme1n1p1 00:04:22.104 /dev/nvme1n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:22.104 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme1n1 ]] 00:04:22.104 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme1n1 00:04:22.363 /dev/nvme1n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:22.363 /dev/nvme1n1: 8 bytes were erased at offset 0xe8e0db5e00 (gpt): 45 46 49 20 50 41 52 54 00:04:22.363 /dev/nvme1n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:22.363 /dev/nvme1n1: calling ioctl to re-read partition table: Success 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@113 -- # mkfs /dev/nvme1n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 1024M 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/common.sh@66 -- # local dev=/dev/nvme1n1 mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount size=1024M 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/common.sh@70 -- # [[ 
-e /dev/nvme1n1 ]] 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/nvme1n1 1024M 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/common.sh@72 -- # mount /dev/nvme1n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@116 -- # verify 0000:5e:00.0 nvme1n1:nvme1n1 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=nvme1n1:nvme1n1 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@56 -- # : 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:22.363 15:43:27 setup.sh.devices.nvme_mount -- 
setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:25.654 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:25.654 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.654 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: mount@nvme1n1:nvme1n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\1\n\1\:\n\v\m\e\1\n\1* ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.655 
15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # 
read -r pci _ _ status 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount/test_nvme 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@123 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@125 -- # verify 0000:5e:00.0 data@nvme1n1 '' '' 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@49 -- # local mounts=data@nvme1n1 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@50 -- # local mount_point= 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@51 -- # local test_file= 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@53 -- # local found=0 00:04:25.655 
15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@59 -- # local pci status 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/devices.sh@47 -- # setup output config 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:25.655 15:43:30 setup.sh.devices.nvme_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ Active devices: data@nvme1n1, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\d\a\t\a\@\n\v\m\e\1\n\1* ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@63 -- # found=1 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 
0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.191 15:43:33 
setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@68 -- # return 0 00:04:28.191 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@128 -- # cleanup_nvme 00:04:28.192 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@20 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:28.192 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@24 -- # [[ -b /dev/nvme1n1p1 ]] 00:04:28.192 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@27 -- # [[ -b /dev/nvme1n1 ]] 00:04:28.192 15:43:33 setup.sh.devices.nvme_mount -- setup/devices.sh@28 -- # wipefs --all /dev/nvme1n1 00:04:28.192 /dev/nvme1n1: 2 bytes were erased at offset 
0x00000438 (ext4): 53 ef 00:04:28.192 00:04:28.192 real 0m11.519s 00:04:28.192 user 0m3.227s 00:04:28.192 sys 0m5.676s 00:04:28.192 15:43:33 setup.sh.devices.nvme_mount -- common/autotest_common.sh@1125 -- # xtrace_disable 00:04:28.192 15:43:33 setup.sh.devices.nvme_mount -- common/autotest_common.sh@10 -- # set +x 00:04:28.192 ************************************ 00:04:28.192 END TEST nvme_mount 00:04:28.192 ************************************ 00:04:28.451 15:43:33 setup.sh.devices -- setup/devices.sh@214 -- # run_test dm_mount dm_mount 00:04:28.451 15:43:33 setup.sh.devices -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:04:28.451 15:43:33 setup.sh.devices -- common/autotest_common.sh@1106 -- # xtrace_disable 00:04:28.451 15:43:33 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:28.451 ************************************ 00:04:28.451 START TEST dm_mount 00:04:28.451 ************************************ 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- common/autotest_common.sh@1124 -- # dm_mount 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/devices.sh@144 -- # pv=nvme1n1 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/devices.sh@145 -- # pv0=nvme1n1p1 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/devices.sh@146 -- # pv1=nvme1n1p2 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/devices.sh@148 -- # partition_drive nvme1n1 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/common.sh@39 -- # local disk=nvme1n1 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/common.sh@40 -- # local part_no=2 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/common.sh@41 -- # local size=1073741824 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/common.sh@43 -- # local part part_start=0 part_end=0 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # parts=() 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/common.sh@44 -- # local parts 00:04:28.451 15:43:33 
setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part = 1 )) 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/common.sh@47 -- # parts+=("${disk}p$part") 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part++ )) 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/common.sh@46 -- # (( part <= part_no )) 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/common.sh@51 -- # (( size /= 512 )) 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/common.sh@56 -- # sgdisk /dev/nvme1n1 --zap-all 00:04:28.451 15:43:33 setup.sh.devices.dm_mount -- setup/common.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/sync_dev_uevents.sh block/partition nvme1n1p1 nvme1n1p2 00:04:29.389 Creating new GPT entries in memory. 00:04:29.389 GPT data structures destroyed! You may now partition the disk using fdisk or 00:04:29.389 other utilities. 00:04:29.389 15:43:34 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part = 1 )) 00:04:29.389 15:43:34 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:29.389 15:43:34 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:29.389 15:43:34 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:29.389 15:43:34 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme1n1 sgdisk /dev/nvme1n1 --new=1:2048:2099199 00:04:30.327 Creating new GPT entries in memory. 00:04:30.327 The operation has completed successfully. 
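The `--new=1:2048:2099199` argument above (and `--new=2:2099200:4196351` for the second partition) follows from the sector arithmetic visible in the `setup/common.sh@58`/`@59` trace lines: the 1 GiB partition size is divided down to 512-byte sectors, the first partition starts at sector 2048, and each later partition starts one sector past the previous end. A standalone sketch of that arithmetic (`partition_ranges` is a hypothetical helper name, not part of the script):

```shell
# partition_ranges N BYTES: print the sgdisk --new arguments for N equal
# partitions, mirroring the part_start/part_end arithmetic in the trace.
partition_ranges() {
  local part_no=$1 size=$2 part part_start=0 part_end=0
  (( size /= 512 ))   # bytes -> 512-byte sectors (1 GiB -> 2097152)
  for (( part = 1; part <= part_no; part++ )); do
    (( part_start = part_start == 0 ? 2048 : part_end + 1 ))
    (( part_end = part_start + size - 1 ))
    echo "--new=$part:$part_start:$part_end"
  done
}

partition_ranges 2 1073741824
# --new=1:2048:2099199
# --new=2:2099200:4196351
```

These are exactly the two ranges handed to `sgdisk` in the log.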
00:04:30.327 15:43:35 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:30.327 15:43:35 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:30.327 15:43:35 setup.sh.devices.dm_mount -- setup/common.sh@58 -- # (( part_start = part_start == 0 ? 2048 : part_end + 1 )) 00:04:30.327 15:43:35 setup.sh.devices.dm_mount -- setup/common.sh@59 -- # (( part_end = part_start + size - 1 )) 00:04:30.327 15:43:35 setup.sh.devices.dm_mount -- setup/common.sh@60 -- # flock /dev/nvme1n1 sgdisk /dev/nvme1n1 --new=2:2099200:4196351 00:04:31.704 The operation has completed successfully. 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part++ )) 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/common.sh@57 -- # (( part <= part_no )) 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/common.sh@62 -- # wait 2569638 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@150 -- # dm_name=nvme_dm_test 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@151 -- # dm_mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@152 -- # dm_dummy_test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@155 -- # dmsetup create nvme_dm_test 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@160 -- # for t in {1..5} 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@161 -- # break 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@164 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@165 -- # readlink -f /dev/mapper/nvme_dm_test 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- 
setup/devices.sh@165 -- # dm=/dev/dm-0 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@166 -- # dm=dm-0 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@168 -- # [[ -e /sys/class/block/nvme1n1p1/holders/dm-0 ]] 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@169 -- # [[ -e /sys/class/block/nvme1n1p2/holders/dm-0 ]] 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@171 -- # mkfs /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/common.sh@66 -- # local dev=/dev/mapper/nvme_dm_test mount=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount size= 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/common.sh@68 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/common.sh@70 -- # [[ -e /dev/mapper/nvme_dm_test ]] 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/common.sh@71 -- # mkfs.ext4 -qF /dev/mapper/nvme_dm_test 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/common.sh@72 -- # mount /dev/mapper/nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@174 -- # verify 0000:5e:00.0 nvme1n1:nvme_dm_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=nvme1n1:nvme_dm_test 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # 
local test_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@56 -- # : 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:31.704 15:43:36 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:34.239 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:34.239 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme1n1p1:dm-0,holder@nvme1n1p2:dm-0,mount@nvme1n1:nvme_dm_test, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\n\v\m\e\1\n\1\:\n\v\m\e\_\d\m\_\t\e\s\t* ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:34.499 15:43:39 
setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:34.499 15:43:39 
setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@71 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@73 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm ]] 00:04:34.499 15:43:39 
setup.sh.devices.dm_mount -- setup/devices.sh@74 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount/test_dm 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@182 -- # umount /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@184 -- # verify 0000:5e:00.0 holder@nvme1n1p1:dm-0,holder@nvme1n1p2:dm-0 '' '' 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@48 -- # local dev=0000:5e:00.0 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@49 -- # local mounts=holder@nvme1n1p1:dm-0,holder@nvme1n1p2:dm-0 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@50 -- # local mount_point= 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@51 -- # local test_file= 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@53 -- # local found=0 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@55 -- # [[ -n '' ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@59 -- # local pci status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # PCI_ALLOWED=0000:5e:00.0 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/devices.sh@47 -- # setup output config 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/common.sh@9 -- # [[ output == output ]] 00:04:34.499 15:43:39 setup.sh.devices.dm_mount -- setup/common.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh config 00:04:37.789 15:43:42 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5f:00.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.789 15:43:42 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:5e:00.0 == 
\0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ Active devices: holder@nvme1n1p1:dm-0,holder@nvme1n1p2:dm-0, so not binding PCI dev == *\A\c\t\i\v\e\ \d\e\v\i\c\e\s\:\ *\h\o\l\d\e\r\@\n\v\m\e\1\n\1\p\1\:\d\m\-\0\,\h\o\l\d\e\r\@\n\v\m\e\1\n\1\p\2\:\d\m\-\0* ]] 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@63 -- # found=1 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.789 15:43:43 
setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:00:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.7 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.6 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.5 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.4 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.3 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.2 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.1 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@62 -- # [[ 0000:80:04.0 == \0\0\0\0\:\5\e\:\0\0\.\0 ]] 00:04:37.789 15:43:43 
setup.sh.devices.dm_mount -- setup/devices.sh@60 -- # read -r pci _ _ status 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@66 -- # (( found == 1 )) 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # [[ -n '' ]] 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@68 -- # return 0 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@187 -- # cleanup_dm 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:37.789 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@37 -- # dmsetup remove --force nvme_dm_test 00:04:38.048 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@39 -- # [[ -b /dev/nvme1n1p1 ]] 00:04:38.048 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@40 -- # wipefs --all /dev/nvme1n1p1 00:04:38.048 /dev/nvme1n1p1: 2 bytes were erased at offset 0x00000438 (ext4): 53 ef 00:04:38.048 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@42 -- # [[ -b /dev/nvme1n1p2 ]] 00:04:38.048 15:43:43 setup.sh.devices.dm_mount -- setup/devices.sh@43 -- # wipefs --all /dev/nvme1n1p2 00:04:38.048 00:04:38.048 real 0m9.556s 00:04:38.048 user 0m2.391s 00:04:38.048 sys 0m4.058s 00:04:38.048 15:43:43 setup.sh.devices.dm_mount -- common/autotest_common.sh@1125 -- # xtrace_disable 00:04:38.048 15:43:43 setup.sh.devices.dm_mount -- common/autotest_common.sh@10 -- # set +x 00:04:38.048 ************************************ 00:04:38.048 END TEST dm_mount 00:04:38.048 ************************************ 00:04:38.048 15:43:43 setup.sh.devices -- setup/devices.sh@1 -- # cleanup 00:04:38.048 15:43:43 setup.sh.devices -- setup/devices.sh@11 -- # cleanup_nvme 00:04:38.048 15:43:43 setup.sh.devices -- setup/devices.sh@20 -- # mountpoint -q 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/nvme_mount 00:04:38.048 15:43:43 setup.sh.devices -- setup/devices.sh@24 -- # [[ -b /dev/nvme1n1p1 ]] 00:04:38.048 15:43:43 setup.sh.devices -- setup/devices.sh@25 -- # wipefs --all /dev/nvme1n1p1 00:04:38.048 15:43:43 setup.sh.devices -- setup/devices.sh@27 -- # [[ -b /dev/nvme1n1 ]] 00:04:38.048 15:43:43 setup.sh.devices -- setup/devices.sh@28 -- # wipefs --all /dev/nvme1n1 00:04:38.307 /dev/nvme1n1: 8 bytes were erased at offset 0x00000200 (gpt): 45 46 49 20 50 41 52 54 00:04:38.307 /dev/nvme1n1: 8 bytes were erased at offset 0xe8e0db5e00 (gpt): 45 46 49 20 50 41 52 54 00:04:38.307 /dev/nvme1n1: 2 bytes were erased at offset 0x000001fe (PMBR): 55 aa 00:04:38.307 /dev/nvme1n1: calling ioctl to re-read partition table: Success 00:04:38.307 15:43:43 setup.sh.devices -- setup/devices.sh@12 -- # cleanup_dm 00:04:38.307 15:43:43 setup.sh.devices -- setup/devices.sh@33 -- # mountpoint -q /var/jenkins/workspace/crypto-phy-autotest/spdk/test/setup/dm_mount 00:04:38.307 15:43:43 setup.sh.devices -- setup/devices.sh@36 -- # [[ -L /dev/mapper/nvme_dm_test ]] 00:04:38.307 15:43:43 setup.sh.devices -- setup/devices.sh@39 -- # [[ -b /dev/nvme1n1p1 ]] 00:04:38.307 15:43:43 setup.sh.devices -- setup/devices.sh@42 -- # [[ -b /dev/nvme1n1p2 ]] 00:04:38.307 15:43:43 setup.sh.devices -- setup/devices.sh@14 -- # [[ -b /dev/nvme1n1 ]] 00:04:38.307 15:43:43 setup.sh.devices -- setup/devices.sh@15 -- # wipefs --all /dev/nvme1n1 00:04:38.307 00:04:38.307 real 0m24.933s 00:04:38.307 user 0m6.965s 00:04:38.307 sys 0m12.000s 00:04:38.307 15:43:43 setup.sh.devices -- common/autotest_common.sh@1125 -- # xtrace_disable 00:04:38.307 15:43:43 setup.sh.devices -- common/autotest_common.sh@10 -- # set +x 00:04:38.307 ************************************ 00:04:38.307 END TEST devices 00:04:38.307 ************************************ 00:04:38.307 00:04:38.307 real 1m26.982s 00:04:38.307 user 0m28.929s 00:04:38.307 sys 0m47.876s 00:04:38.307 
15:43:43 setup.sh -- common/autotest_common.sh@1125 -- # xtrace_disable
00:04:38.307 15:43:43 setup.sh -- common/autotest_common.sh@10 -- # set +x
00:04:38.307 ************************************
00:04:38.307 END TEST setup.sh
00:04:38.307 ************************************
00:04:38.307 15:43:43 -- spdk/autotest.sh@128 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh status
00:04:41.641 Hugepages
00:04:41.641 node hugesize free / total
00:04:41.641 node0 1048576kB 0 / 0
00:04:41.641 node0 2048kB 1024 / 1024
00:04:41.641 node1 1048576kB 0 / 0
00:04:41.641 node1 2048kB 1024 / 1024
00:04:41.641
00:04:41.641 Type BDF Vendor Device NUMA Driver Device Block devices
00:04:41.641 I/OAT 0000:00:04.0 8086 2021 0 ioatdma - -
00:04:41.641 I/OAT 0000:00:04.1 8086 2021 0 ioatdma - -
00:04:41.641 I/OAT 0000:00:04.2 8086 2021 0 ioatdma - -
00:04:41.641 I/OAT 0000:00:04.3 8086 2021 0 ioatdma - -
00:04:41.641 I/OAT 0000:00:04.4 8086 2021 0 ioatdma - -
00:04:41.641 I/OAT 0000:00:04.5 8086 2021 0 ioatdma - -
00:04:41.641 I/OAT 0000:00:04.6 8086 2021 0 ioatdma - -
00:04:41.641 I/OAT 0000:00:04.7 8086 2021 0 ioatdma - -
00:04:41.641 NVMe 0000:5e:00.0 8086 0a54 0 nvme nvme1 nvme1n1
00:04:41.641 NVMe 0000:5f:00.0 1b96 2600 0 nvme nvme0 nvme0n1 nvme0n2
00:04:41.641 I/OAT 0000:80:04.0 8086 2021 1 ioatdma - -
00:04:41.641 I/OAT 0000:80:04.1 8086 2021 1 ioatdma - -
00:04:41.641 I/OAT 0000:80:04.2 8086 2021 1 ioatdma - -
00:04:41.641 I/OAT 0000:80:04.3 8086 2021 1 ioatdma - -
00:04:41.641 I/OAT 0000:80:04.4 8086 2021 1 ioatdma - -
00:04:41.641 I/OAT 0000:80:04.5 8086 2021 1 ioatdma - -
00:04:41.641 I/OAT 0000:80:04.6 8086 2021 1 ioatdma - -
00:04:41.641 I/OAT 0000:80:04.7 8086 2021 1 ioatdma - -
00:04:41.641 15:43:46 -- spdk/autotest.sh@130 -- # uname -s
00:04:41.641 15:43:46 -- spdk/autotest.sh@130 -- # [[ Linux == Linux ]]
00:04:41.641 15:43:46 -- spdk/autotest.sh@132 -- # nvme_namespace_revert
00:04:41.641 15:43:46 -- common/autotest_common.sh@1530 -- #
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh
00:04:44.177 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0
00:04:44.746 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:44.746 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:44.746 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:44.746 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:44.746 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:44.746 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci
00:04:44.746 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci
00:04:44.746 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci
00:04:44.746 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:44.746 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:44.746 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:44.746 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:44.746 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:44.746 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci
00:04:44.746 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci
00:04:44.746 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci
00:04:45.685 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci
00:04:45.685 15:43:51 -- common/autotest_common.sh@1531 -- # sleep 1
00:04:47.065 15:43:52 -- common/autotest_common.sh@1532 -- # bdfs=()
00:04:47.065 15:43:52 -- common/autotest_common.sh@1532 -- # local bdfs
00:04:47.065 15:43:52 -- common/autotest_common.sh@1533 -- # bdfs=($(get_nvme_bdfs))
00:04:47.065 15:43:52 -- common/autotest_common.sh@1533 -- # get_nvme_bdfs
00:04:47.065 15:43:52 -- common/autotest_common.sh@1512 -- # bdfs=()
00:04:47.065 15:43:52 -- common/autotest_common.sh@1512 -- # local bdfs
00:04:47.065 15:43:52 -- common/autotest_common.sh@1513 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))
00:04:47.065 15:43:52 -- common/autotest_common.sh@1513 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh
00:04:47.065 15:43:52 -- common/autotest_common.sh@1513 -- # jq -r
'.config[].params.traddr' 00:04:47.065 15:43:52 -- common/autotest_common.sh@1514 -- # (( 1 == 0 )) 00:04:47.065 15:43:52 -- common/autotest_common.sh@1518 -- # printf '%s\n' 0000:5e:00.0 00:04:47.065 15:43:52 -- common/autotest_common.sh@1535 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh reset 00:04:49.601 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:04:49.860 Waiting for block devices as requested 00:04:50.120 0000:5e:00.0 (8086 0a54): vfio-pci -> nvme 00:04:50.120 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:50.120 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:50.379 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:50.379 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:50.379 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:50.638 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:50.638 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:50.638 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:50.898 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:04:50.898 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:04:50.898 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:04:50.898 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:04:51.157 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:04:51.157 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:04:51.157 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:04:51.417 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:04:51.417 15:43:56 -- common/autotest_common.sh@1537 -- # for bdf in "${bdfs[@]}" 00:04:51.417 15:43:56 -- common/autotest_common.sh@1538 -- # get_nvme_ctrlr_from_bdf 0000:5e:00.0 00:04:51.417 15:43:56 -- common/autotest_common.sh@1501 -- # readlink -f /sys/class/nvme/nvme0 /sys/class/nvme/nvme1 00:04:51.417 15:43:56 -- common/autotest_common.sh@1501 -- # grep 0000:5e:00.0/nvme/nvme 00:04:51.417 15:43:56 -- common/autotest_common.sh@1501 -- # bdf_sysfs_path=/sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme1 00:04:51.417 
15:43:56 -- common/autotest_common.sh@1502 -- # [[ -z /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme1 ]] 00:04:51.417 15:43:56 -- common/autotest_common.sh@1506 -- # basename /sys/devices/pci0000:5d/0000:5d:02.0/0000:5e:00.0/nvme/nvme1 00:04:51.417 15:43:56 -- common/autotest_common.sh@1506 -- # printf '%s\n' nvme1 00:04:51.417 15:43:56 -- common/autotest_common.sh@1538 -- # nvme_ctrlr=/dev/nvme1 00:04:51.417 15:43:56 -- common/autotest_common.sh@1539 -- # [[ -z /dev/nvme1 ]] 00:04:51.417 15:43:56 -- common/autotest_common.sh@1544 -- # nvme id-ctrl /dev/nvme1 00:04:51.417 15:43:56 -- common/autotest_common.sh@1544 -- # grep oacs 00:04:51.417 15:43:56 -- common/autotest_common.sh@1544 -- # cut -d: -f2 00:04:51.417 15:43:56 -- common/autotest_common.sh@1544 -- # oacs=' 0xf' 00:04:51.417 15:43:56 -- common/autotest_common.sh@1545 -- # oacs_ns_manage=8 00:04:51.417 15:43:56 -- common/autotest_common.sh@1547 -- # [[ 8 -ne 0 ]] 00:04:51.417 15:43:56 -- common/autotest_common.sh@1553 -- # nvme id-ctrl /dev/nvme1 00:04:51.417 15:43:56 -- common/autotest_common.sh@1553 -- # cut -d: -f2 00:04:51.417 15:43:56 -- common/autotest_common.sh@1553 -- # grep unvmcap 00:04:51.417 15:43:56 -- common/autotest_common.sh@1553 -- # unvmcap=' 0' 00:04:51.417 15:43:56 -- common/autotest_common.sh@1554 -- # [[ 0 -eq 0 ]] 00:04:51.417 15:43:56 -- common/autotest_common.sh@1556 -- # continue 00:04:51.417 15:43:56 -- spdk/autotest.sh@135 -- # timing_exit pre_cleanup 00:04:51.417 15:43:56 -- common/autotest_common.sh@729 -- # xtrace_disable 00:04:51.417 15:43:56 -- common/autotest_common.sh@10 -- # set +x 00:04:51.417 15:43:56 -- spdk/autotest.sh@138 -- # timing_enter afterboot 00:04:51.417 15:43:56 -- common/autotest_common.sh@723 -- # xtrace_disable 00:04:51.417 15:43:56 -- common/autotest_common.sh@10 -- # set +x 00:04:51.417 15:43:56 -- spdk/autotest.sh@139 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/setup.sh 00:04:53.953 0000:5f:00.0 (1b96 2600): Skipping 
denied controller at 0000:5f:00.0
00:04:54.522 0000:00:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:54.522 0000:00:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:54.522 0000:00:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:54.522 0000:00:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:54.522 0000:00:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:54.522 0000:00:04.2 (8086 2021): ioatdma -> vfio-pci
00:04:54.522 0000:00:04.1 (8086 2021): ioatdma -> vfio-pci
00:04:54.522 0000:00:04.0 (8086 2021): ioatdma -> vfio-pci
00:04:54.522 0000:80:04.7 (8086 2021): ioatdma -> vfio-pci
00:04:54.522 0000:80:04.6 (8086 2021): ioatdma -> vfio-pci
00:04:54.522 0000:80:04.5 (8086 2021): ioatdma -> vfio-pci
00:04:54.522 0000:80:04.4 (8086 2021): ioatdma -> vfio-pci
00:04:54.781 0000:80:04.3 (8086 2021): ioatdma -> vfio-pci
00:04:54.781 0000:80:04.2 (8086 2021): ioatdma -> vfio-pci
00:04:54.781 0000:80:04.1 (8086 2021): ioatdma -> vfio-pci
00:04:54.781 0000:80:04.0 (8086 2021): ioatdma -> vfio-pci
00:04:55.719 0000:5e:00.0 (8086 0a54): nvme -> vfio-pci
00:04:55.719 15:44:01 -- spdk/autotest.sh@140 -- # timing_exit afterboot
00:04:55.719 15:44:01 -- common/autotest_common.sh@729 -- # xtrace_disable
00:04:55.719 15:44:01 -- common/autotest_common.sh@10 -- # set +x
00:04:55.719 15:44:01 -- spdk/autotest.sh@144 -- # opal_revert_cleanup
00:04:55.719 15:44:01 -- common/autotest_common.sh@1590 -- # mapfile -t bdfs
00:04:55.720 15:44:01 -- common/autotest_common.sh@1590 -- # get_nvme_bdfs_by_id 0x0a54
00:04:55.720 15:44:01 -- common/autotest_common.sh@1576 -- # bdfs=()
00:04:55.720 15:44:01 -- common/autotest_common.sh@1576 -- # local bdfs
00:04:55.720 15:44:01 -- common/autotest_common.sh@1578 -- # get_nvme_bdfs
00:04:55.720 15:44:01 -- common/autotest_common.sh@1512 -- # bdfs=()
00:04:55.720 15:44:01 -- common/autotest_common.sh@1512 -- # local bdfs
00:04:55.720 15:44:01 -- common/autotest_common.sh@1513 -- # bdfs=($("$rootdir/scripts/gen_nvme.sh" | jq -r '.config[].params.traddr'))
00:04:55.720 15:44:01 -- common/autotest_common.sh@1513 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh
00:04:55.720 15:44:01 -- common/autotest_common.sh@1513 -- # jq -r '.config[].params.traddr'
00:04:55.720 15:44:01 -- common/autotest_common.sh@1514 -- # (( 1 == 0 ))
00:04:55.720 15:44:01 -- common/autotest_common.sh@1518 -- # printf '%s\n' 0000:5e:00.0
00:04:55.720 15:44:01 -- common/autotest_common.sh@1578 -- # for bdf in $(get_nvme_bdfs)
00:04:55.720 15:44:01 -- common/autotest_common.sh@1579 -- # cat /sys/bus/pci/devices/0000:5e:00.0/device
00:04:55.720 15:44:01 -- common/autotest_common.sh@1579 -- # device=0x0a54
00:04:55.720 15:44:01 -- common/autotest_common.sh@1580 -- # [[ 0x0a54 == \0\x\0\a\5\4 ]]
00:04:55.720 15:44:01 -- common/autotest_common.sh@1581 -- # bdfs+=($bdf)
00:04:55.720 15:44:01 -- common/autotest_common.sh@1585 -- # printf '%s\n' 0000:5e:00.0
00:04:55.720 15:44:01 -- common/autotest_common.sh@1591 -- # [[ -z 0000:5e:00.0 ]]
00:04:55.720 15:44:01 -- common/autotest_common.sh@1596 -- # spdk_tgt_pid=2579619
00:04:55.720 15:44:01 -- common/autotest_common.sh@1597 -- # waitforlisten 2579619
00:04:55.720 15:44:01 -- common/autotest_common.sh@1595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt
00:04:55.720 15:44:01 -- common/autotest_common.sh@830 -- # '[' -z 2579619 ']'
00:04:55.720 15:44:01 -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock
00:04:55.720 15:44:01 -- common/autotest_common.sh@835 -- # local max_retries=100
00:04:55.720 15:44:01 -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...'
00:04:55.720 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...
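The xtrace above shows common/autotest_common.sh filtering controllers by PCI device id (0x0a54) by reading each bdf's `device` file in sysfs. A minimal, hedged sketch of that filter — the helper below is a hypothetical stand-in for `get_nvme_bdfs_by_id`, and a temporary directory replaces /sys/bus/pci/devices so the snippet runs without hardware:

```shell
#!/usr/bin/env bash
# Hypothetical stand-in for get_nvme_bdfs_by_id: keep only the bdfs whose
# sysfs `device` file matches the wanted PCI device id (0x0a54 in the log).
set -euo pipefail

sysfs=$(mktemp -d)                       # fake /sys/bus/pci/devices tree
mkdir -p "$sysfs/0000:5e:00.0"
echo 0x0a54 > "$sysfs/0000:5e:00.0/device"

wanted=0x0a54
bdfs=()
for path in "$sysfs"/*; do
  device=$(cat "$path/device")           # e.g. 0x0a54
  [[ $device == "$wanted" ]] && bdfs+=("$(basename "$path")")
done
printf '%s\n' "${bdfs[@]}"               # -> 0000:5e:00.0
rm -r "$sysfs"
```

The real script collects the candidate bdfs from `gen_nvme.sh | jq` first; the sysfs comparison step is the same.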
00:04:55.720 15:44:01 -- common/autotest_common.sh@839 -- # xtrace_disable
00:04:55.720 15:44:01 -- common/autotest_common.sh@10 -- # set +x
00:04:55.720 [2024-06-10 15:44:01.212317] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization...
00:04:55.720 [2024-06-10 15:44:01.212380] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2579619 ]
00:04:55.978 [2024-06-10 15:44:01.306894] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:04:55.978 [2024-06-10 15:44:01.396691] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:04:56.915 15:44:02 -- common/autotest_common.sh@859 -- # (( i == 0 ))
00:04:56.915 15:44:02 -- common/autotest_common.sh@863 -- # return 0
00:04:56.915 15:44:02 -- common/autotest_common.sh@1599 -- # bdf_id=0
00:04:56.915 15:44:02 -- common/autotest_common.sh@1600 -- # for bdf in "${bdfs[@]}"
00:04:56.915 15:44:02 -- common/autotest_common.sh@1601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_attach_controller -b nvme0 -t pcie -a 0000:5e:00.0
00:05:00.205 nvme0n1
00:05:00.205 15:44:05 -- common/autotest_common.sh@1603 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test
00:05:00.205 [2024-06-10 15:44:05.406138] nvme_opal.c:2063:spdk_opal_cmd_revert_tper: *ERROR*: Error on starting admin SP session with error 18
00:05:00.205 [2024-06-10 15:44:05.406181] vbdev_opal_rpc.c: 134:rpc_bdev_nvme_opal_revert: *ERROR*: Revert TPer failure: 18
00:05:00.205 request:
00:05:00.205 {
00:05:00.205 "nvme_ctrlr_name": "nvme0",
00:05:00.205 "password": "test",
00:05:00.205 "method": "bdev_nvme_opal_revert",
00:05:00.205 "req_id": 1
00:05:00.205 }
00:05:00.205 Got JSON-RPC error response
00:05:00.205 response:
00:05:00.205 {
00:05:00.205 "code": -32603,
00:05:00.205 "message": "Internal error"
00:05:00.205 }
00:05:00.205 15:44:05 -- common/autotest_common.sh@1603 -- # true
00:05:00.205 15:44:05 -- common/autotest_common.sh@1604 -- # (( ++bdf_id ))
00:05:00.205 15:44:05 -- common/autotest_common.sh@1607 -- # killprocess 2579619
00:05:00.205 15:44:05 -- common/autotest_common.sh@949 -- # '[' -z 2579619 ']'
00:05:00.205 15:44:05 -- common/autotest_common.sh@953 -- # kill -0 2579619
00:05:00.205 15:44:05 -- common/autotest_common.sh@954 -- # uname
00:05:00.205 15:44:05 -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']'
00:05:00.205 15:44:05 -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2579619
00:05:00.205 15:44:05 -- common/autotest_common.sh@955 -- # process_name=reactor_0
00:05:00.205 15:44:05 -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']'
00:05:00.205 15:44:05 -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2579619'
00:05:00.205 killing process with pid 2579619
00:05:00.205 15:44:05 -- common/autotest_common.sh@968 -- # kill 2579619
00:05:00.205 15:44:05 -- common/autotest_common.sh@973 -- # wait 2579619
00:05:01.613 15:44:07 -- spdk/autotest.sh@150 -- # '[' 0 -eq 1 ']'
00:05:01.613 15:44:07 -- spdk/autotest.sh@154 -- # '[' 1 -eq 1 ']'
00:05:01.613 15:44:07 -- spdk/autotest.sh@155 -- # [[ 1 -eq 1 ]]
00:05:01.613 15:44:07 -- spdk/autotest.sh@156 -- # [[ 0 -eq 1 ]]
00:05:01.613 15:44:07 -- spdk/autotest.sh@159 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/qat_setup.sh
00:05:02.180 Restarting all devices.
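The revert failure above does not abort the run: the xtrace shows `# true` immediately after the failed RPC, i.e. opal_revert_cleanup swallows the error and moves on. A hedged sketch of that pattern — the function below is a hypothetical stand-in for `scripts/rpc.py bdev_nvme_opal_revert -b nvme0 -p test`, which is what actually failed here:

```shell
#!/usr/bin/env bash
set -e
# Hypothetical stand-in for the failing RPC: emit the JSON-RPC
# internal-error response (-32603) seen in the log and return nonzero.
bdev_nvme_opal_revert() {
  echo '{"code": -32603, "message": "Internal error"}' >&2
  return 1
}
# `|| true` keeps `set -e` from killing the cleanup when the drive
# rejects the TPer revert (error 18 above).
bdev_nvme_opal_revert -b nvme0 -p test || true
echo "cleanup continued"
```

Without the `|| true`, `set -e` would terminate the script at the failed call instead of proceeding to kill the spdk_tgt process.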
00:05:06.370 lstat() error: No such file or directory
00:05:06.370 QAT Error: No GENERAL section found
00:05:06.370 Failed to configure qat_dev0
00:05:06.370 lstat() error: No such file or directory
00:05:06.370 QAT Error: No GENERAL section found
00:05:06.370 Failed to configure qat_dev1
00:05:06.370 lstat() error: No such file or directory
00:05:06.370 QAT Error: No GENERAL section found
00:05:06.370 Failed to configure qat_dev2
00:05:06.370 enable sriov
00:05:06.370 Checking status of all devices.
00:05:06.370 There is 3 QAT acceleration device(s) in the system:
00:05:06.370 qat_dev0 - type: c6xx, inst_id: 0, node_id: 0, bsf: 0000:1a:00.0, #accel: 5 #engines: 10 state: down
00:05:06.370 qat_dev1 - type: c6xx, inst_id: 1, node_id: 0, bsf: 0000:1c:00.0, #accel: 5 #engines: 10 state: down
00:05:06.370 qat_dev2 - type: c6xx, inst_id: 2, node_id: 0, bsf: 0000:1e:00.0, #accel: 5 #engines: 10 state: down
00:05:06.629 0000:1a:00.0 set to 16 VFs
00:05:07.196 0000:1c:00.0 set to 16 VFs
00:05:08.242 0000:1e:00.0 set to 16 VFs
00:05:09.620 Properly configured the qat device with driver uio_pci_generic.
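The "set to 16 VFs" lines correspond to writing the per-device `sriov_numvfs` attribute in sysfs. A hedged sketch of that step, with a temp file standing in for the real attribute so the snippet runs without QAT hardware (real path shown in the comment; the kernel requires writing 0 before changing a nonzero VF count):

```shell
#!/usr/bin/env bash
set -euo pipefail
# Stand-in for /sys/bus/pci/devices/0000:1a:00.0/sriov_numvfs; a temp
# file lets the snippet run on any machine.
numvfs=$(mktemp)
echo 0  > "$numvfs"     # VFs must be disabled before setting a new count
echo 16 > "$numvfs"     # enable 16 virtual functions
vfs=$(cat "$numvfs")
echo "$vfs"             # -> 16
rm -f "$numvfs"
```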
00:05:09.620 15:44:14 -- spdk/autotest.sh@162 -- # timing_enter lib
00:05:09.620 15:44:14 -- common/autotest_common.sh@723 -- # xtrace_disable
00:05:09.620 15:44:14 -- common/autotest_common.sh@10 -- # set +x
00:05:09.620 15:44:14 -- spdk/autotest.sh@164 -- # [[ 0 -eq 1 ]]
00:05:09.620 15:44:14 -- spdk/autotest.sh@168 -- # run_test env /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh
00:05:09.620 15:44:14 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:05:09.620 15:44:14 -- common/autotest_common.sh@1106 -- # xtrace_disable
00:05:09.620 15:44:14 -- common/autotest_common.sh@10 -- # set +x
00:05:09.620 ************************************
00:05:09.620 START TEST env
00:05:09.620 ************************************
00:05:09.620 15:44:14 env -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env.sh
00:05:09.620 * Looking for test storage...
00:05:09.620 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env
00:05:09.620 15:44:14 env -- env/env.sh@10 -- # run_test env_memory /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut
00:05:09.620 15:44:14 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:05:09.620 15:44:14 env -- common/autotest_common.sh@1106 -- # xtrace_disable
00:05:09.620 15:44:14 env -- common/autotest_common.sh@10 -- # set +x
00:05:09.620 ************************************
00:05:09.620 START TEST env_memory
00:05:09.620 ************************************
00:05:09.621 15:44:14 env.env_memory -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/memory/memory_ut
00:05:09.621
00:05:09.621
00:05:09.621 CUnit - A unit testing framework for C - Version 2.1-3
00:05:09.621 http://cunit.sourceforge.net/
00:05:09.621
00:05:09.621
00:05:09.621 Suite: memory
00:05:09.621 Test: alloc and free memory map ...[2024-06-10 15:44:15.019667] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 283:spdk_mem_map_alloc: *ERROR*: Initial mem_map notify failed
00:05:09.621 passed
00:05:09.621 Test: mem map translation ...[2024-06-10 15:44:15.050091] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=2097152 len=1234
00:05:09.621 [2024-06-10 15:44:15.050111] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 590:spdk_mem_map_set_translation: *ERROR*: invalid spdk_mem_map_set_translation parameters, vaddr=1234 len=2097152
00:05:09.621 [2024-06-10 15:44:15.050166] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 584:spdk_mem_map_set_translation: *ERROR*: invalid usermode virtual address 281474976710656
00:05:09.621 [2024-06-10 15:44:15.050175] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 600:spdk_mem_map_set_translation: *ERROR*: could not get 0xffffffe00000 map
00:05:09.621 passed
00:05:09.621 Test: mem map registration ...[2024-06-10 15:44:15.112765] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x200000 len=1234
00:05:09.621 [2024-06-10 15:44:15.112784] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/memory.c: 346:spdk_mem_register: *ERROR*: invalid spdk_mem_register parameters, vaddr=0x4d2 len=2097152
00:05:09.881 passed
00:05:09.881 Test: mem map adjacent registrations ...passed
00:05:09.881
00:05:09.881 Run Summary: Type Total Ran Passed Failed Inactive
00:05:09.881 suites 1 1 n/a 0 0
00:05:09.881 tests 4 4 4 0 0
00:05:09.881 asserts 152 152 152 0 n/a
00:05:09.881
00:05:09.881 Elapsed time = 0.213 seconds
00:05:09.881
00:05:09.881 real 0m0.225s
00:05:09.881 user 0m0.215s
00:05:09.881 sys 0m0.010s
00:05:09.881 15:44:15 env.env_memory -- common/autotest_common.sh@1125 -- # xtrace_disable
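The `vaddr=2097152 len=1234` rejections in the translation test above are consistent with spdk_mem_map tracking translations at 2 MiB granularity, so both the address and the length must be 2 MiB multiples (my reading of the error text, not stated in the log itself). A quick arithmetic check of the logged values:

```shell
#!/usr/bin/env bash
page=$((2 * 1024 * 1024))   # assumed 2 MiB translation granularity
vaddr=2097152               # exactly 1 * 2 MiB -> aligned
len=1234                    # not a multiple of 2 MiB -> rejected
echo $((vaddr % page))      # -> 0
echo $((len % page))        # -> 1234
```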
00:05:09.881 15:44:15 env.env_memory -- common/autotest_common.sh@10 -- # set +x 00:05:09.881 ************************************ 00:05:09.881 END TEST env_memory 00:05:09.881 ************************************ 00:05:09.881 15:44:15 env -- env/env.sh@11 -- # run_test env_vtophys /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:09.881 15:44:15 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:09.881 15:44:15 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:09.881 15:44:15 env -- common/autotest_common.sh@10 -- # set +x 00:05:09.881 ************************************ 00:05:09.881 START TEST env_vtophys 00:05:09.881 ************************************ 00:05:09.881 15:44:15 env.env_vtophys -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/vtophys/vtophys 00:05:09.881 EAL: lib.eal log level changed from notice to debug 00:05:09.881 EAL: Detected lcore 0 as core 0 on socket 0 00:05:09.881 EAL: Detected lcore 1 as core 1 on socket 0 00:05:09.881 EAL: Detected lcore 2 as core 2 on socket 0 00:05:09.881 EAL: Detected lcore 3 as core 3 on socket 0 00:05:09.881 EAL: Detected lcore 4 as core 4 on socket 0 00:05:09.881 EAL: Detected lcore 5 as core 5 on socket 0 00:05:09.881 EAL: Detected lcore 6 as core 6 on socket 0 00:05:09.881 EAL: Detected lcore 7 as core 8 on socket 0 00:05:09.881 EAL: Detected lcore 8 as core 9 on socket 0 00:05:09.881 EAL: Detected lcore 9 as core 10 on socket 0 00:05:09.881 EAL: Detected lcore 10 as core 11 on socket 0 00:05:09.881 EAL: Detected lcore 11 as core 12 on socket 0 00:05:09.881 EAL: Detected lcore 12 as core 13 on socket 0 00:05:09.881 EAL: Detected lcore 13 as core 16 on socket 0 00:05:09.881 EAL: Detected lcore 14 as core 17 on socket 0 00:05:09.881 EAL: Detected lcore 15 as core 18 on socket 0 00:05:09.881 EAL: Detected lcore 16 as core 19 on socket 0 00:05:09.881 EAL: Detected lcore 17 as core 20 on socket 0 00:05:09.881 EAL: 
Detected lcore 18 as core 21 on socket 0 00:05:09.881 EAL: Detected lcore 19 as core 25 on socket 0 00:05:09.881 EAL: Detected lcore 20 as core 26 on socket 0 00:05:09.881 EAL: Detected lcore 21 as core 27 on socket 0 00:05:09.881 EAL: Detected lcore 22 as core 28 on socket 0 00:05:09.881 EAL: Detected lcore 23 as core 29 on socket 0 00:05:09.881 EAL: Detected lcore 24 as core 0 on socket 1 00:05:09.881 EAL: Detected lcore 25 as core 1 on socket 1 00:05:09.881 EAL: Detected lcore 26 as core 2 on socket 1 00:05:09.881 EAL: Detected lcore 27 as core 3 on socket 1 00:05:09.881 EAL: Detected lcore 28 as core 4 on socket 1 00:05:09.881 EAL: Detected lcore 29 as core 5 on socket 1 00:05:09.881 EAL: Detected lcore 30 as core 6 on socket 1 00:05:09.881 EAL: Detected lcore 31 as core 8 on socket 1 00:05:09.881 EAL: Detected lcore 32 as core 9 on socket 1 00:05:09.881 EAL: Detected lcore 33 as core 10 on socket 1 00:05:09.881 EAL: Detected lcore 34 as core 11 on socket 1 00:05:09.881 EAL: Detected lcore 35 as core 12 on socket 1 00:05:09.881 EAL: Detected lcore 36 as core 13 on socket 1 00:05:09.881 EAL: Detected lcore 37 as core 16 on socket 1 00:05:09.881 EAL: Detected lcore 38 as core 17 on socket 1 00:05:09.881 EAL: Detected lcore 39 as core 18 on socket 1 00:05:09.881 EAL: Detected lcore 40 as core 19 on socket 1 00:05:09.881 EAL: Detected lcore 41 as core 20 on socket 1 00:05:09.881 EAL: Detected lcore 42 as core 21 on socket 1 00:05:09.881 EAL: Detected lcore 43 as core 25 on socket 1 00:05:09.881 EAL: Detected lcore 44 as core 26 on socket 1 00:05:09.881 EAL: Detected lcore 45 as core 27 on socket 1 00:05:09.881 EAL: Detected lcore 46 as core 28 on socket 1 00:05:09.881 EAL: Detected lcore 47 as core 29 on socket 1 00:05:09.881 EAL: Detected lcore 48 as core 0 on socket 0 00:05:09.881 EAL: Detected lcore 49 as core 1 on socket 0 00:05:09.881 EAL: Detected lcore 50 as core 2 on socket 0 00:05:09.881 EAL: Detected lcore 51 as core 3 on socket 0 00:05:09.881 EAL: 
Detected lcore 52 as core 4 on socket 0 00:05:09.881 EAL: Detected lcore 53 as core 5 on socket 0 00:05:09.881 EAL: Detected lcore 54 as core 6 on socket 0 00:05:09.881 EAL: Detected lcore 55 as core 8 on socket 0 00:05:09.881 EAL: Detected lcore 56 as core 9 on socket 0 00:05:09.881 EAL: Detected lcore 57 as core 10 on socket 0 00:05:09.881 EAL: Detected lcore 58 as core 11 on socket 0 00:05:09.881 EAL: Detected lcore 59 as core 12 on socket 0 00:05:09.881 EAL: Detected lcore 60 as core 13 on socket 0 00:05:09.881 EAL: Detected lcore 61 as core 16 on socket 0 00:05:09.881 EAL: Detected lcore 62 as core 17 on socket 0 00:05:09.881 EAL: Detected lcore 63 as core 18 on socket 0 00:05:09.881 EAL: Detected lcore 64 as core 19 on socket 0 00:05:09.881 EAL: Detected lcore 65 as core 20 on socket 0 00:05:09.881 EAL: Detected lcore 66 as core 21 on socket 0 00:05:09.881 EAL: Detected lcore 67 as core 25 on socket 0 00:05:09.881 EAL: Detected lcore 68 as core 26 on socket 0 00:05:09.881 EAL: Detected lcore 69 as core 27 on socket 0 00:05:09.881 EAL: Detected lcore 70 as core 28 on socket 0 00:05:09.881 EAL: Detected lcore 71 as core 29 on socket 0 00:05:09.881 EAL: Detected lcore 72 as core 0 on socket 1 00:05:09.881 EAL: Detected lcore 73 as core 1 on socket 1 00:05:09.881 EAL: Detected lcore 74 as core 2 on socket 1 00:05:09.881 EAL: Detected lcore 75 as core 3 on socket 1 00:05:09.881 EAL: Detected lcore 76 as core 4 on socket 1 00:05:09.881 EAL: Detected lcore 77 as core 5 on socket 1 00:05:09.881 EAL: Detected lcore 78 as core 6 on socket 1 00:05:09.881 EAL: Detected lcore 79 as core 8 on socket 1 00:05:09.881 EAL: Detected lcore 80 as core 9 on socket 1 00:05:09.881 EAL: Detected lcore 81 as core 10 on socket 1 00:05:09.881 EAL: Detected lcore 82 as core 11 on socket 1 00:05:09.881 EAL: Detected lcore 83 as core 12 on socket 1 00:05:09.881 EAL: Detected lcore 84 as core 13 on socket 1 00:05:09.881 EAL: Detected lcore 85 as core 16 on socket 1 00:05:09.881 EAL: 
Detected lcore 86 as core 17 on socket 1 00:05:09.881 EAL: Detected lcore 87 as core 18 on socket 1 00:05:09.881 EAL: Detected lcore 88 as core 19 on socket 1 00:05:09.881 EAL: Detected lcore 89 as core 20 on socket 1 00:05:09.881 EAL: Detected lcore 90 as core 21 on socket 1 00:05:09.881 EAL: Detected lcore 91 as core 25 on socket 1 00:05:09.881 EAL: Detected lcore 92 as core 26 on socket 1 00:05:09.881 EAL: Detected lcore 93 as core 27 on socket 1 00:05:09.881 EAL: Detected lcore 94 as core 28 on socket 1 00:05:09.881 EAL: Detected lcore 95 as core 29 on socket 1 00:05:09.881 EAL: Maximum logical cores by configuration: 128 00:05:09.881 EAL: Detected CPU lcores: 96 00:05:09.881 EAL: Detected NUMA nodes: 2 00:05:09.881 EAL: Checking presence of .so 'librte_eal.so.24.1' 00:05:09.881 EAL: Detected shared linkage of DPDK 00:05:09.881 EAL: No shared files mode enabled, IPC will be disabled 00:05:09.881 EAL: No shared files mode enabled, IPC is disabled 00:05:09.881 EAL: PCI driver qat for device 0000:1a:01.0 wants IOVA as 'PA' 00:05:09.881 EAL: PCI driver qat for device 0000:1a:01.1 wants IOVA as 'PA' 00:05:09.881 EAL: PCI driver qat for device 0000:1a:01.2 wants IOVA as 'PA' 00:05:09.881 EAL: PCI driver qat for device 0000:1a:01.3 wants IOVA as 'PA' 00:05:09.881 EAL: PCI driver qat for device 0000:1a:01.4 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1a:01.5 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1a:01.6 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1a:01.7 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1a:02.0 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1a:02.1 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1a:02.2 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1a:02.3 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1a:02.4 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 
0000:1a:02.5 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1a:02.6 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1a:02.7 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1c:01.0 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1c:01.1 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1c:01.2 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1c:01.3 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1c:01.4 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1c:01.5 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1c:01.6 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1c:01.7 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1c:02.0 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1c:02.1 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1c:02.2 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1c:02.3 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1c:02.4 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1c:02.5 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1c:02.6 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1c:02.7 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1e:01.0 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1e:01.1 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1e:01.2 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1e:01.3 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1e:01.4 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1e:01.5 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1e:01.6 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1e:01.7 wants IOVA 
as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1e:02.0 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1e:02.1 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1e:02.2 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1e:02.3 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1e:02.4 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1e:02.5 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1e:02.6 wants IOVA as 'PA' 00:05:09.882 EAL: PCI driver qat for device 0000:1e:02.7 wants IOVA as 'PA' 00:05:09.882 EAL: Bus pci wants IOVA as 'PA' 00:05:09.882 EAL: Bus auxiliary wants IOVA as 'DC' 00:05:09.882 EAL: Bus vdev wants IOVA as 'DC' 00:05:09.882 EAL: Selected IOVA mode 'PA' 00:05:09.882 EAL: Probing VFIO support... 00:05:09.882 EAL: IOMMU type 1 (Type 1) is supported 00:05:09.882 EAL: IOMMU type 7 (sPAPR) is not supported 00:05:09.882 EAL: IOMMU type 8 (No-IOMMU) is not supported 00:05:09.882 EAL: VFIO support initialized 00:05:09.882 EAL: Ask a virtual area of 0x2e000 bytes 00:05:09.882 EAL: Virtual area found at 0x200000000000 (size = 0x2e000) 00:05:09.882 EAL: Setting up physically contiguous memory... 
00:05:09.882 EAL: Setting maximum number of open files to 524288 00:05:09.882 EAL: Detected memory type: socket_id:0 hugepage_sz:2097152 00:05:09.882 EAL: Detected memory type: socket_id:1 hugepage_sz:2097152 00:05:09.882 EAL: Creating 4 segment lists: n_segs:8192 socket_id:0 hugepage_sz:2097152 00:05:09.882 EAL: Ask a virtual area of 0x61000 bytes 00:05:09.882 EAL: Virtual area found at 0x20000002e000 (size = 0x61000) 00:05:09.882 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:09.882 EAL: Ask a virtual area of 0x400000000 bytes 00:05:09.882 EAL: Virtual area found at 0x200000200000 (size = 0x400000000) 00:05:09.882 EAL: VA reserved for memseg list at 0x200000200000, size 400000000 00:05:09.882 EAL: Ask a virtual area of 0x61000 bytes 00:05:09.882 EAL: Virtual area found at 0x200400200000 (size = 0x61000) 00:05:09.882 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:09.882 EAL: Ask a virtual area of 0x400000000 bytes 00:05:09.882 EAL: Virtual area found at 0x200400400000 (size = 0x400000000) 00:05:09.882 EAL: VA reserved for memseg list at 0x200400400000, size 400000000 00:05:09.882 EAL: Ask a virtual area of 0x61000 bytes 00:05:09.882 EAL: Virtual area found at 0x200800400000 (size = 0x61000) 00:05:09.882 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:09.882 EAL: Ask a virtual area of 0x400000000 bytes 00:05:09.882 EAL: Virtual area found at 0x200800600000 (size = 0x400000000) 00:05:09.882 EAL: VA reserved for memseg list at 0x200800600000, size 400000000 00:05:09.882 EAL: Ask a virtual area of 0x61000 bytes 00:05:09.882 EAL: Virtual area found at 0x200c00600000 (size = 0x61000) 00:05:09.882 EAL: Memseg list allocated at socket 0, page size 0x800kB 00:05:09.882 EAL: Ask a virtual area of 0x400000000 bytes 00:05:09.882 EAL: Virtual area found at 0x200c00800000 (size = 0x400000000) 00:05:09.882 EAL: VA reserved for memseg list at 0x200c00800000, size 400000000 00:05:09.882 EAL: Creating 4 segment lists: n_segs:8192 
socket_id:1 hugepage_sz:2097152 00:05:09.882 EAL: Ask a virtual area of 0x61000 bytes 00:05:09.882 EAL: Virtual area found at 0x201000800000 (size = 0x61000) 00:05:09.882 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:09.882 EAL: Ask a virtual area of 0x400000000 bytes 00:05:09.882 EAL: Virtual area found at 0x201000a00000 (size = 0x400000000) 00:05:09.882 EAL: VA reserved for memseg list at 0x201000a00000, size 400000000 00:05:09.882 EAL: Ask a virtual area of 0x61000 bytes 00:05:09.882 EAL: Virtual area found at 0x201400a00000 (size = 0x61000) 00:05:09.882 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:09.882 EAL: Ask a virtual area of 0x400000000 bytes 00:05:09.882 EAL: Virtual area found at 0x201400c00000 (size = 0x400000000) 00:05:09.882 EAL: VA reserved for memseg list at 0x201400c00000, size 400000000 00:05:09.882 EAL: Ask a virtual area of 0x61000 bytes 00:05:09.882 EAL: Virtual area found at 0x201800c00000 (size = 0x61000) 00:05:09.882 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:09.882 EAL: Ask a virtual area of 0x400000000 bytes 00:05:09.882 EAL: Virtual area found at 0x201800e00000 (size = 0x400000000) 00:05:09.882 EAL: VA reserved for memseg list at 0x201800e00000, size 400000000 00:05:09.882 EAL: Ask a virtual area of 0x61000 bytes 00:05:09.882 EAL: Virtual area found at 0x201c00e00000 (size = 0x61000) 00:05:09.882 EAL: Memseg list allocated at socket 1, page size 0x800kB 00:05:09.882 EAL: Ask a virtual area of 0x400000000 bytes 00:05:09.882 EAL: Virtual area found at 0x201c01000000 (size = 0x400000000) 00:05:09.882 EAL: VA reserved for memseg list at 0x201c01000000, size 400000000 00:05:09.882 EAL: Hugepages will be freed exactly as allocated. 
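The `size = 0x400000000` VA reservations above are exactly `n_segs * hugepage_sz` from the "Creating 4 segment lists" entries: 8192 segments of 2 MiB each is 16 GiB per memseg list. Checking the arithmetic with the numbers from the log:

```shell
#!/usr/bin/env bash
n_segs=8192                                # "n_segs:8192"
hugepage_sz=2097152                        # 2 MiB, "hugepage_sz:2097152"
printf '0x%x\n' $((n_segs * hugepage_sz))  # -> 0x400000000 (16 GiB)
```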
00:05:09.882 EAL: No shared files mode enabled, IPC is disabled 00:05:09.882 EAL: No shared files mode enabled, IPC is disabled 00:05:09.882 EAL: TSC frequency is ~2100000 KHz 00:05:09.882 EAL: Main lcore 0 is ready (tid=7f6bbb3e6b00;cpuset=[0]) 00:05:09.882 EAL: Trying to obtain current memory policy. 00:05:09.882 EAL: Setting policy MPOL_PREFERRED for socket 0 00:05:09.882 EAL: Restoring previous memory policy: 0 00:05:09.882 EAL: request: mp_malloc_sync 00:05:09.882 EAL: No shared files mode enabled, IPC is disabled 00:05:09.882 EAL: Heap on socket 0 was expanded by 2MB 00:05:09.882 EAL: PCI device 0000:1a:01.0 on NUMA socket 0 00:05:09.882 EAL: probe driver: 8086:37c9 qat 00:05:09.882 EAL: PCI memory mapped at 0x202001000000 00:05:09.882 EAL: PCI memory mapped at 0x202001001000 00:05:09.882 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.0 (socket 0) 00:05:09.882 EAL: PCI device 0000:1a:01.1 on NUMA socket 0 00:05:09.882 EAL: probe driver: 8086:37c9 qat 00:05:09.882 EAL: PCI memory mapped at 0x202001002000 00:05:09.882 EAL: PCI memory mapped at 0x202001003000 00:05:09.882 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.1 (socket 0) 00:05:09.882 EAL: PCI device 0000:1a:01.2 on NUMA socket 0 00:05:09.882 EAL: probe driver: 8086:37c9 qat 00:05:09.882 EAL: PCI memory mapped at 0x202001004000 00:05:09.882 EAL: PCI memory mapped at 0x202001005000 00:05:09.882 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.2 (socket 0) 00:05:09.882 EAL: PCI device 0000:1a:01.3 on NUMA socket 0 00:05:09.882 EAL: probe driver: 8086:37c9 qat 00:05:09.882 EAL: PCI memory mapped at 0x202001006000 00:05:09.882 EAL: PCI memory mapped at 0x202001007000 00:05:09.882 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.3 (socket 0) 00:05:09.882 EAL: PCI device 0000:1a:01.4 on NUMA socket 0 00:05:09.882 EAL: probe driver: 8086:37c9 qat 00:05:09.882 EAL: PCI memory mapped at 0x202001008000 00:05:09.882 EAL: PCI memory mapped at 0x202001009000 00:05:09.882 EAL: 
Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.4 (socket 0) 00:05:09.882 EAL: PCI device 0000:1a:01.5 on NUMA socket 0 00:05:09.882 EAL: probe driver: 8086:37c9 qat 00:05:09.882 EAL: PCI memory mapped at 0x20200100a000 00:05:09.882 EAL: PCI memory mapped at 0x20200100b000 00:05:09.882 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.5 (socket 0) 00:05:09.882 EAL: PCI device 0000:1a:01.6 on NUMA socket 0 00:05:09.882 EAL: probe driver: 8086:37c9 qat 00:05:09.882 EAL: PCI memory mapped at 0x20200100c000 00:05:09.882 EAL: PCI memory mapped at 0x20200100d000 00:05:09.882 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.6 (socket 0) 00:05:09.882 EAL: PCI device 0000:1a:01.7 on NUMA socket 0 00:05:09.882 EAL: probe driver: 8086:37c9 qat 00:05:09.882 EAL: PCI memory mapped at 0x20200100e000 00:05:09.882 EAL: PCI memory mapped at 0x20200100f000 00:05:09.882 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.7 (socket 0) 00:05:09.882 EAL: PCI device 0000:1a:02.0 on NUMA socket 0 00:05:09.882 EAL: probe driver: 8086:37c9 qat 00:05:09.882 EAL: PCI memory mapped at 0x202001010000 00:05:09.882 EAL: PCI memory mapped at 0x202001011000 00:05:09.882 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.0 (socket 0) 00:05:09.882 EAL: PCI device 0000:1a:02.1 on NUMA socket 0 00:05:09.882 EAL: probe driver: 8086:37c9 qat 00:05:09.882 EAL: PCI memory mapped at 0x202001012000 00:05:09.882 EAL: PCI memory mapped at 0x202001013000 00:05:09.882 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.1 (socket 0) 00:05:09.882 EAL: PCI device 0000:1a:02.2 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001014000 00:05:09.883 EAL: PCI memory mapped at 0x202001015000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.2 (socket 0) 00:05:09.883 EAL: PCI device 0000:1a:02.3 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 
0x202001016000 00:05:09.883 EAL: PCI memory mapped at 0x202001017000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.3 (socket 0) 00:05:09.883 EAL: PCI device 0000:1a:02.4 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001018000 00:05:09.883 EAL: PCI memory mapped at 0x202001019000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.4 (socket 0) 00:05:09.883 EAL: PCI device 0000:1a:02.5 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x20200101a000 00:05:09.883 EAL: PCI memory mapped at 0x20200101b000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.5 (socket 0) 00:05:09.883 EAL: PCI device 0000:1a:02.6 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x20200101c000 00:05:09.883 EAL: PCI memory mapped at 0x20200101d000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.6 (socket 0) 00:05:09.883 EAL: PCI device 0000:1a:02.7 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x20200101e000 00:05:09.883 EAL: PCI memory mapped at 0x20200101f000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.7 (socket 0) 00:05:09.883 EAL: PCI device 0000:1c:01.0 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001020000 00:05:09.883 EAL: PCI memory mapped at 0x202001021000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.0 (socket 0) 00:05:09.883 EAL: PCI device 0000:1c:01.1 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001022000 00:05:09.883 EAL: PCI memory mapped at 0x202001023000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.1 (socket 0) 00:05:09.883 EAL: PCI device 0000:1c:01.2 on NUMA socket 0 
00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001024000 00:05:09.883 EAL: PCI memory mapped at 0x202001025000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.2 (socket 0) 00:05:09.883 EAL: PCI device 0000:1c:01.3 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001026000 00:05:09.883 EAL: PCI memory mapped at 0x202001027000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.3 (socket 0) 00:05:09.883 EAL: PCI device 0000:1c:01.4 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001028000 00:05:09.883 EAL: PCI memory mapped at 0x202001029000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.4 (socket 0) 00:05:09.883 EAL: PCI device 0000:1c:01.5 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x20200102a000 00:05:09.883 EAL: PCI memory mapped at 0x20200102b000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.5 (socket 0) 00:05:09.883 EAL: PCI device 0000:1c:01.6 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x20200102c000 00:05:09.883 EAL: PCI memory mapped at 0x20200102d000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.6 (socket 0) 00:05:09.883 EAL: PCI device 0000:1c:01.7 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x20200102e000 00:05:09.883 EAL: PCI memory mapped at 0x20200102f000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.7 (socket 0) 00:05:09.883 EAL: PCI device 0000:1c:02.0 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001030000 00:05:09.883 EAL: PCI memory mapped at 0x202001031000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 
0000:1c:02.0 (socket 0) 00:05:09.883 EAL: PCI device 0000:1c:02.1 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001032000 00:05:09.883 EAL: PCI memory mapped at 0x202001033000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0) 00:05:09.883 EAL: PCI device 0000:1c:02.2 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001034000 00:05:09.883 EAL: PCI memory mapped at 0x202001035000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.2 (socket 0) 00:05:09.883 EAL: PCI device 0000:1c:02.3 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001036000 00:05:09.883 EAL: PCI memory mapped at 0x202001037000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.3 (socket 0) 00:05:09.883 EAL: PCI device 0000:1c:02.4 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001038000 00:05:09.883 EAL: PCI memory mapped at 0x202001039000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.4 (socket 0) 00:05:09.883 EAL: PCI device 0000:1c:02.5 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x20200103a000 00:05:09.883 EAL: PCI memory mapped at 0x20200103b000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.5 (socket 0) 00:05:09.883 EAL: PCI device 0000:1c:02.6 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x20200103c000 00:05:09.883 EAL: PCI memory mapped at 0x20200103d000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.6 (socket 0) 00:05:09.883 EAL: PCI device 0000:1c:02.7 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x20200103e000 00:05:09.883 EAL: PCI memory 
mapped at 0x20200103f000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.7 (socket 0) 00:05:09.883 EAL: PCI device 0000:1e:01.0 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001040000 00:05:09.883 EAL: PCI memory mapped at 0x202001041000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.0 (socket 0) 00:05:09.883 EAL: PCI device 0000:1e:01.1 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001042000 00:05:09.883 EAL: PCI memory mapped at 0x202001043000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.1 (socket 0) 00:05:09.883 EAL: PCI device 0000:1e:01.2 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001044000 00:05:09.883 EAL: PCI memory mapped at 0x202001045000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.2 (socket 0) 00:05:09.883 EAL: PCI device 0000:1e:01.3 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001046000 00:05:09.883 EAL: PCI memory mapped at 0x202001047000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.3 (socket 0) 00:05:09.883 EAL: PCI device 0000:1e:01.4 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001048000 00:05:09.883 EAL: PCI memory mapped at 0x202001049000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.4 (socket 0) 00:05:09.883 EAL: PCI device 0000:1e:01.5 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x20200104a000 00:05:09.883 EAL: PCI memory mapped at 0x20200104b000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.5 (socket 0) 00:05:09.883 EAL: PCI device 0000:1e:01.6 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 
00:05:09.883 EAL: PCI memory mapped at 0x20200104c000 00:05:09.883 EAL: PCI memory mapped at 0x20200104d000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.6 (socket 0) 00:05:09.883 EAL: PCI device 0000:1e:01.7 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x20200104e000 00:05:09.883 EAL: PCI memory mapped at 0x20200104f000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.7 (socket 0) 00:05:09.883 EAL: PCI device 0000:1e:02.0 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001050000 00:05:09.883 EAL: PCI memory mapped at 0x202001051000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.0 (socket 0) 00:05:09.883 EAL: PCI device 0000:1e:02.1 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001052000 00:05:09.883 EAL: PCI memory mapped at 0x202001053000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.1 (socket 0) 00:05:09.883 EAL: PCI device 0000:1e:02.2 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.883 EAL: PCI memory mapped at 0x202001054000 00:05:09.883 EAL: PCI memory mapped at 0x202001055000 00:05:09.883 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.2 (socket 0) 00:05:09.883 EAL: PCI device 0000:1e:02.3 on NUMA socket 0 00:05:09.883 EAL: probe driver: 8086:37c9 qat 00:05:09.884 EAL: PCI memory mapped at 0x202001056000 00:05:09.884 EAL: PCI memory mapped at 0x202001057000 00:05:09.884 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.3 (socket 0) 00:05:09.884 EAL: PCI device 0000:1e:02.4 on NUMA socket 0 00:05:09.884 EAL: probe driver: 8086:37c9 qat 00:05:09.884 EAL: PCI memory mapped at 0x202001058000 00:05:09.884 EAL: PCI memory mapped at 0x202001059000 00:05:09.884 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.4 (socket 0) 00:05:09.884 EAL: PCI 
device 0000:1e:02.5 on NUMA socket 0
00:05:09.884 EAL: probe driver: 8086:37c9 qat
00:05:09.884 EAL: PCI memory mapped at 0x20200105a000
00:05:09.884 EAL: PCI memory mapped at 0x20200105b000
00:05:09.884 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.5 (socket 0)
00:05:09.884 EAL: PCI device 0000:1e:02.6 on NUMA socket 0
00:05:09.884 EAL: probe driver: 8086:37c9 qat
00:05:09.884 EAL: PCI memory mapped at 0x20200105c000
00:05:09.884 EAL: PCI memory mapped at 0x20200105d000
00:05:09.884 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.6 (socket 0)
00:05:09.884 EAL: PCI device 0000:1e:02.7 on NUMA socket 0
00:05:09.884 EAL: probe driver: 8086:37c9 qat
00:05:09.884 EAL: PCI memory mapped at 0x20200105e000
00:05:09.884 EAL: PCI memory mapped at 0x20200105f000
00:05:09.884 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.7 (socket 0)
00:05:09.884 EAL: No shared files mode enabled, IPC is disabled
00:05:09.884 EAL: No shared files mode enabled, IPC is disabled
00:05:09.884 EAL: No PCI address specified using 'addr=' in: bus=pci
00:05:09.884 EAL: Mem event callback 'spdk:(nil)' registered
00:05:10.143
00:05:10.143
00:05:10.143 CUnit - A unit testing framework for C - Version 2.1-3
00:05:10.143 http://cunit.sourceforge.net/
00:05:10.143
00:05:10.143
00:05:10.143 Suite: components_suite
00:05:10.143 Test: vtophys_malloc_test ...passed
00:05:10.143 Test: vtophys_spdk_malloc_test ...EAL: Trying to obtain current memory policy.
00:05:10.143 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:10.143 EAL: Restoring previous memory policy: 4
00:05:10.143 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.143 EAL: request: mp_malloc_sync
00:05:10.143 EAL: No shared files mode enabled, IPC is disabled
00:05:10.143 EAL: Heap on socket 0 was expanded by 4MB
00:05:10.143 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.143 EAL: request: mp_malloc_sync
00:05:10.143 EAL: No shared files mode enabled, IPC is disabled
00:05:10.143 EAL: Heap on socket 0 was shrunk by 4MB
00:05:10.143 EAL: Trying to obtain current memory policy.
00:05:10.143 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:10.143 EAL: Restoring previous memory policy: 4
00:05:10.143 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.143 EAL: request: mp_malloc_sync
00:05:10.143 EAL: No shared files mode enabled, IPC is disabled
00:05:10.143 EAL: Heap on socket 0 was expanded by 6MB
00:05:10.143 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.143 EAL: request: mp_malloc_sync
00:05:10.143 EAL: No shared files mode enabled, IPC is disabled
00:05:10.143 EAL: Heap on socket 0 was shrunk by 6MB
00:05:10.143 EAL: Trying to obtain current memory policy.
00:05:10.143 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:10.143 EAL: Restoring previous memory policy: 4
00:05:10.143 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.143 EAL: request: mp_malloc_sync
00:05:10.143 EAL: No shared files mode enabled, IPC is disabled
00:05:10.143 EAL: Heap on socket 0 was expanded by 10MB
00:05:10.143 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.143 EAL: request: mp_malloc_sync
00:05:10.143 EAL: No shared files mode enabled, IPC is disabled
00:05:10.143 EAL: Heap on socket 0 was shrunk by 10MB
00:05:10.143 EAL: Trying to obtain current memory policy.
00:05:10.143 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:10.143 EAL: Restoring previous memory policy: 4
00:05:10.143 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.143 EAL: request: mp_malloc_sync
00:05:10.143 EAL: No shared files mode enabled, IPC is disabled
00:05:10.143 EAL: Heap on socket 0 was expanded by 18MB
00:05:10.143 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.143 EAL: request: mp_malloc_sync
00:05:10.143 EAL: No shared files mode enabled, IPC is disabled
00:05:10.143 EAL: Heap on socket 0 was shrunk by 18MB
00:05:10.143 EAL: Trying to obtain current memory policy.
00:05:10.143 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:10.143 EAL: Restoring previous memory policy: 4
00:05:10.143 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.143 EAL: request: mp_malloc_sync
00:05:10.143 EAL: No shared files mode enabled, IPC is disabled
00:05:10.143 EAL: Heap on socket 0 was expanded by 34MB
00:05:10.143 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.143 EAL: request: mp_malloc_sync
00:05:10.143 EAL: No shared files mode enabled, IPC is disabled
00:05:10.143 EAL: Heap on socket 0 was shrunk by 34MB
00:05:10.143 EAL: Trying to obtain current memory policy.
00:05:10.143 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:10.143 EAL: Restoring previous memory policy: 4
00:05:10.143 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.143 EAL: request: mp_malloc_sync
00:05:10.143 EAL: No shared files mode enabled, IPC is disabled
00:05:10.143 EAL: Heap on socket 0 was expanded by 66MB
00:05:10.143 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.143 EAL: request: mp_malloc_sync
00:05:10.143 EAL: No shared files mode enabled, IPC is disabled
00:05:10.144 EAL: Heap on socket 0 was shrunk by 66MB
00:05:10.144 EAL: Trying to obtain current memory policy.
00:05:10.144 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:10.144 EAL: Restoring previous memory policy: 4
00:05:10.144 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.144 EAL: request: mp_malloc_sync
00:05:10.144 EAL: No shared files mode enabled, IPC is disabled
00:05:10.144 EAL: Heap on socket 0 was expanded by 130MB
00:05:10.144 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.144 EAL: request: mp_malloc_sync
00:05:10.144 EAL: No shared files mode enabled, IPC is disabled
00:05:10.144 EAL: Heap on socket 0 was shrunk by 130MB
00:05:10.144 EAL: Trying to obtain current memory policy.
00:05:10.144 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:10.144 EAL: Restoring previous memory policy: 4
00:05:10.144 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.144 EAL: request: mp_malloc_sync
00:05:10.144 EAL: No shared files mode enabled, IPC is disabled
00:05:10.144 EAL: Heap on socket 0 was expanded by 258MB
00:05:10.144 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.144 EAL: request: mp_malloc_sync
00:05:10.144 EAL: No shared files mode enabled, IPC is disabled
00:05:10.144 EAL: Heap on socket 0 was shrunk by 258MB
00:05:10.144 EAL: Trying to obtain current memory policy.
00:05:10.144 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:10.403 EAL: Restoring previous memory policy: 4
00:05:10.403 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.403 EAL: request: mp_malloc_sync
00:05:10.403 EAL: No shared files mode enabled, IPC is disabled
00:05:10.403 EAL: Heap on socket 0 was expanded by 514MB
00:05:10.403 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.403 EAL: request: mp_malloc_sync
00:05:10.403 EAL: No shared files mode enabled, IPC is disabled
00:05:10.403 EAL: Heap on socket 0 was shrunk by 514MB
00:05:10.403 EAL: Trying to obtain current memory policy.
00:05:10.403 EAL: Setting policy MPOL_PREFERRED for socket 0
00:05:10.662 EAL: Restoring previous memory policy: 4
00:05:10.662 EAL: Calling mem event callback 'spdk:(nil)'
00:05:10.662 EAL: request: mp_malloc_sync
00:05:10.662 EAL: No shared files mode enabled, IPC is disabled
00:05:10.662 EAL: Heap on socket 0 was expanded by 1026MB
00:05:10.920 EAL: Calling mem event callback 'spdk:(nil)'
00:05:11.180 EAL: request: mp_malloc_sync
00:05:11.180 EAL: No shared files mode enabled, IPC is disabled
00:05:11.180 EAL: Heap on socket 0 was shrunk by 1026MB
00:05:11.180 passed
00:05:11.180
00:05:11.180 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:05:11.180               suites      1      1    n/a      0        0
00:05:11.180                tests      2      2      2      0        0
00:05:11.180              asserts   6569   6569   6569      0      n/a
00:05:11.180
00:05:11.180 Elapsed time = 1.032 seconds
00:05:11.180 EAL: No shared files mode enabled, IPC is disabled
00:05:11.180 EAL: No shared files mode enabled, IPC is disabled
00:05:11.180 EAL: No shared files mode enabled, IPC is disabled
00:05:11.180
00:05:11.180 real 0m1.198s
00:05:11.180 user 0m0.686s
00:05:11.180 sys 0m0.480s
00:05:11.180 15:44:16 env.env_vtophys -- common/autotest_common.sh@1125 -- # xtrace_disable
00:05:11.180 15:44:16 env.env_vtophys -- common/autotest_common.sh@10 -- # set +x
00:05:11.180 ************************************
00:05:11.180 END TEST env_vtophys
00:05:11.180 ************************************
00:05:11.180 15:44:16 env -- env/env.sh@12 -- # run_test env_pci /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:05:11.180 15:44:16 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:05:11.180 15:44:16 env -- common/autotest_common.sh@1106 -- # xtrace_disable
00:05:11.180 15:44:16 env -- common/autotest_common.sh@10 -- # set +x
00:05:11.180 ************************************
00:05:11.180 START TEST env_pci
00:05:11.180 ************************************
00:05:11.180 15:44:16 env.env_pci -- common/autotest_common.sh@1124 -- #
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/pci/pci_ut
00:05:11.180
00:05:11.180
00:05:11.180 CUnit - A unit testing framework for C - Version 2.1-3
00:05:11.180 http://cunit.sourceforge.net/
00:05:11.180
00:05:11.180
00:05:11.180 Suite: pci
00:05:11.180 Test: pci_hook ...[2024-06-10 15:44:16.536189] /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk/pci.c:1040:spdk_pci_device_claim: *ERROR*: Cannot create lock on device /var/tmp/spdk_pci_lock_10000:00:01.0, probably process 2582386 has claimed it
00:05:11.180 EAL: Cannot find device (10000:00:01.0)
00:05:11.180 EAL: Failed to attach device on primary process
00:05:11.180 passed
00:05:11.180
00:05:11.180 Run Summary:    Type  Total    Ran Passed Failed Inactive
00:05:11.180               suites      1      1    n/a      0        0
00:05:11.180                tests      1      1      1      0        0
00:05:11.180              asserts     25     25     25      0      n/a
00:05:11.180
00:05:11.180 Elapsed time = 0.019 seconds
00:05:11.180
00:05:11.180 real 0m0.032s
00:05:11.180 user 0m0.010s
00:05:11.180 sys 0m0.022s
00:05:11.180 15:44:16 env.env_pci -- common/autotest_common.sh@1125 -- # xtrace_disable
00:05:11.180 15:44:16 env.env_pci -- common/autotest_common.sh@10 -- # set +x
00:05:11.180 ************************************
00:05:11.180 END TEST env_pci
00:05:11.180 ************************************
00:05:11.180 15:44:16 env -- env/env.sh@14 -- # argv='-c 0x1 '
00:05:11.180 15:44:16 env -- env/env.sh@15 -- # uname
00:05:11.180 15:44:16 env -- env/env.sh@15 -- # '[' Linux = Linux ']'
00:05:11.180 15:44:16 env -- env/env.sh@22 -- # argv+=--base-virtaddr=0x200000000000
00:05:11.180 15:44:16 env -- env/env.sh@24 -- # run_test env_dpdk_post_init /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000
00:05:11.180 15:44:16 env -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']'
00:05:11.180 15:44:16 env -- common/autotest_common.sh@1106 -- # xtrace_disable
00:05:11.180 15:44:16 env -- common/autotest_common.sh@10 -- # set +x
00:05:11.180 ************************************ 00:05:11.180 START TEST env_dpdk_post_init 00:05:11.180 ************************************ 00:05:11.180 15:44:16 env.env_dpdk_post_init -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/env_dpdk_post_init/env_dpdk_post_init -c 0x1 --base-virtaddr=0x200000000000 00:05:11.180 EAL: Detected CPU lcores: 96 00:05:11.180 EAL: Detected NUMA nodes: 2 00:05:11.180 EAL: Detected shared linkage of DPDK 00:05:11.180 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:11.180 EAL: Selected IOVA mode 'PA' 00:05:11.180 EAL: VFIO support initialized 00:05:11.440 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.0 (socket 0) 00:05:11.440 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_asym 00:05:11.440 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.440 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_sym 00:05:11.440 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.440 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.1 (socket 0) 00:05:11.440 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_asym 00:05:11.440 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.2 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat 
(8086:37c9) device: 0000:1a:01.3 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.4 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.5 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.6 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.7 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: 
Creating cryptodev 0000:1a:01.7_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.0 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.1 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.2 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.3 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.4 (socket 0) 
00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.5 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.6 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.7 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.0 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_sym 
00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.1 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.2 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.3 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.4 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.5 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 
0000:1c:01.5_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.6 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.7 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.0 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_sym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.441 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0) 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_asym 00:05:11.441 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.441 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_sym 00:05:11.441 CRYPTODEV: Initialisation 
parameters - name: 0000:1c:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.2 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.3 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.4 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.5 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.6 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_asym 00:05:11.442 CRYPTODEV: 
Initialisation parameters - name: 0000:1c:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.7 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.0 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.1 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.2 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_sym,socket id: 0, 
max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.3 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.4 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.5 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.6 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.7 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 
0000:1e:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.0 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.1 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.2 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.3 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 
EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.4 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.5 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.6 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.7 (socket 0) 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_asym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:11.442 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_sym 00:05:11.442 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:11.442 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:11.442 EAL: Using IOMMU type 1 (Type 1) 00:05:11.442 EAL: Ignore mapping IO port bar(1) 00:05:11.442 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.0 (socket 0) 
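The enumeration above is highly repetitive: for every probed QAT virtual function, EAL logs one `Probe PCI driver` line and CRYPTODEV logs a `_qat_asym` and a `_qat_sym` creation pair. A quick hypothetical helper (not part of the test suite) for tallying such output when eyeballing a long log:

```python
# Hypothetical log helper: tally EAL probe lines and CRYPTODEV creation
# lines of the form seen above. Regexes assume the exact DPDK log format
# shown in this transcript.
import re
from collections import Counter

PROBE = re.compile(r"EAL: Probe PCI driver: (\S+) \((\S+)\) device: (\S+) \(socket (\d+)\)")
CREATE = re.compile(r"CRYPTODEV: Creating cryptodev (\S+)_(qat_asym|qat_sym)")

def tally(log_text):
    """Return (number of probed devices, count of cryptodevs per type)."""
    devices = PROBE.findall(log_text)
    created = Counter(kind for _name, kind in CREATE.findall(log_text))
    return len(devices), dict(created)

sample = (
    "EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0)\n"
    "CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_asym\n"
    "CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_sym\n"
)
print(tally(sample))  # (1, {'qat_asym': 1, 'qat_sym': 1})
```

Each probed QAT device should contribute exactly one `qat_asym` and one `qat_sym` entry, which is what the counts let you verify at a glance.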
00:05:11.442 EAL: Ignore mapping IO port bar(1) 00:05:11.442 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.1 (socket 0) 00:05:11.442 EAL: Ignore mapping IO port bar(1) 00:05:11.442 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.2 (socket 0) 00:05:11.442 EAL: Ignore mapping IO port bar(1) 00:05:11.442 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.3 (socket 0) 00:05:11.442 EAL: Ignore mapping IO port bar(1) 00:05:11.442 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.4 (socket 0) 00:05:11.442 EAL: Ignore mapping IO port bar(1) 00:05:11.442 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.5 (socket 0) 00:05:11.442 EAL: Ignore mapping IO port bar(1) 00:05:11.442 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.6 (socket 0) 00:05:11.442 EAL: Ignore mapping IO port bar(1) 00:05:11.442 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:00:04.7 (socket 0) 00:05:12.381 EAL: Probe PCI driver: spdk_nvme (8086:0a54) device: 0000:5e:00.0 (socket 0) 00:05:12.381 EAL: Ignore mapping IO port bar(1) 00:05:12.381 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.0 (socket 1) 00:05:12.381 EAL: Ignore mapping IO port bar(1) 00:05:12.381 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.1 (socket 1) 00:05:12.381 EAL: Ignore mapping IO port bar(1) 00:05:12.381 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.2 (socket 1) 00:05:12.381 EAL: Ignore mapping IO port bar(1) 00:05:12.381 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.3 (socket 1) 00:05:12.381 EAL: Ignore mapping IO port bar(1) 00:05:12.381 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.4 (socket 1) 00:05:12.381 EAL: Ignore mapping IO port bar(1) 00:05:12.381 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.5 (socket 1) 00:05:12.381 EAL: Ignore mapping IO port bar(1) 00:05:12.381 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 
0000:80:04.6 (socket 1) 00:05:12.381 EAL: Ignore mapping IO port bar(1) 00:05:12.381 EAL: Probe PCI driver: spdk_ioat (8086:2021) device: 0000:80:04.7 (socket 1) 00:05:15.669 EAL: Releasing PCI mapped resource for 0000:5e:00.0 00:05:15.669 EAL: Calling pci_unmap_resource for 0000:5e:00.0 at 0x202001080000 00:05:15.669 Starting DPDK initialization... 00:05:15.669 Starting SPDK post initialization... 00:05:15.669 SPDK NVMe probe 00:05:15.669 Attaching to 0000:5e:00.0 00:05:15.669 Attached to 0000:5e:00.0 00:05:15.669 Cleaning up... 00:05:15.669 00:05:15.669 real 0m4.370s 00:05:15.669 user 0m3.288s 00:05:15.669 sys 0m0.154s 00:05:15.669 15:44:21 env.env_dpdk_post_init -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:15.669 15:44:21 env.env_dpdk_post_init -- common/autotest_common.sh@10 -- # set +x 00:05:15.669 ************************************ 00:05:15.669 END TEST env_dpdk_post_init 00:05:15.669 ************************************ 00:05:15.669 15:44:21 env -- env/env.sh@26 -- # uname 00:05:15.669 15:44:21 env -- env/env.sh@26 -- # '[' Linux = Linux ']' 00:05:15.669 15:44:21 env -- env/env.sh@29 -- # run_test env_mem_callbacks /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:15.669 15:44:21 env -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:15.669 15:44:21 env -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:15.669 15:44:21 env -- common/autotest_common.sh@10 -- # set +x 00:05:15.669 ************************************ 00:05:15.669 START TEST env_mem_callbacks 00:05:15.669 ************************************ 00:05:15.669 15:44:21 env.env_mem_callbacks -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/env/mem_callbacks/mem_callbacks 00:05:15.669 EAL: Detected CPU lcores: 96 00:05:15.669 EAL: Detected NUMA nodes: 2 00:05:15.669 EAL: Detected shared linkage of DPDK 00:05:15.669 EAL: Multi-process socket /var/run/dpdk/rte/mp_socket 00:05:15.669 
EAL: Selected IOVA mode 'PA' 00:05:15.669 EAL: VFIO support initialized 00:05:15.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.0 (socket 0) 00:05:15.669 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_asym 00:05:15.669 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.669 CRYPTODEV: Creating cryptodev 0000:1a:01.0_qat_sym 00:05:15.669 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.1 (socket 0) 00:05:15.669 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_asym 00:05:15.669 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.669 CRYPTODEV: Creating cryptodev 0000:1a:01.1_qat_sym 00:05:15.669 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.2 (socket 0) 00:05:15.669 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_asym 00:05:15.669 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.669 CRYPTODEV: Creating cryptodev 0000:1a:01.2_qat_sym 00:05:15.669 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.669 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.3 (socket 0) 00:05:15.669 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:01.3_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.4 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_asym 00:05:15.670 CRYPTODEV: 
Initialisation parameters - name: 0000:1a:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:01.4_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.5 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:01.5_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.6 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:01.6_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:01.7 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:01.7_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.0 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:02.0_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.0_qat_sym,socket id: 0, 
max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.1 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:02.1_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.2 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:02.2_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.3 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:02.3_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.4 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:02.4_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.5 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 
0000:1a:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:02.5_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.6 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:02.6_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1a:02.7 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1a:02.7_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1a:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.0 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:01.0_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.1 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:01.1_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 
EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.2 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:01.2_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.3 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:01.3_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.4 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:01.4_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.5 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:01.5_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.6 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_asym,socket id: 0, max queue pairs: 
0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:01.6_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:01.7 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:01.7_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.0 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:02.0_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.1 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:02.1_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.2 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:02.2_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 
0000:1c:02.3 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:02.3_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.4 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:02.4_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.5 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:02.5_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.6 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_asym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:02.6_qat_sym 00:05:15.670 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.670 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1c:02.7 (socket 0) 00:05:15.670 CRYPTODEV: Creating cryptodev 0000:1c:02.7_qat_asym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.671 CRYPTODEV: Creating cryptodev 
0000:1c:02.7_qat_sym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1c:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.671 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.0 (socket 0) 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_asym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:01.0_qat_sym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.671 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.1 (socket 0) 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_asym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:01.1_qat_sym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.671 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.2 (socket 0) 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_asym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:01.2_qat_sym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.671 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.3 (socket 0) 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_asym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:01.3_qat_sym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.671 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.4 (socket 0) 00:05:15.671 CRYPTODEV: 
Creating cryptodev 0000:1e:01.4_qat_asym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:01.4_qat_sym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.671 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.5 (socket 0) 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_asym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:01.5_qat_sym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.671 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.6 (socket 0) 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_asym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:01.6_qat_sym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.671 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:01.7 (socket 0) 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_asym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:01.7_qat_sym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:01.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.671 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.0 (socket 0) 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_asym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.0_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:02.0_qat_sym 00:05:15.671 CRYPTODEV: 
Initialisation parameters - name: 0000:1e:02.0_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.671 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.1 (socket 0) 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_asym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:02.1_qat_sym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.1_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.671 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.2 (socket 0) 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_asym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:02.2_qat_sym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.2_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.671 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.3 (socket 0) 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_asym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:02.3_qat_sym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.3_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.671 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.4 (socket 0) 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_asym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:02.4_qat_sym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.4_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.671 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.5 (socket 0) 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_asym 
00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:02.5_qat_sym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.5_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.671 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.6 (socket 0) 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_asym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:02.6_qat_sym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.6_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.671 EAL: Probe PCI driver: qat (8086:37c9) device: 0000:1e:02.7 (socket 0) 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_asym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_asym,socket id: 0, max queue pairs: 0 00:05:15.671 CRYPTODEV: Creating cryptodev 0000:1e:02.7_qat_sym 00:05:15.671 CRYPTODEV: Initialisation parameters - name: 0000:1e:02.7_qat_sym,socket id: 0, max queue pairs: 0 00:05:15.671 TELEMETRY: No legacy callbacks, legacy socket not created 00:05:15.671 00:05:15.671 00:05:15.671 CUnit - A unit testing framework for C - Version 2.1-3 00:05:15.671 http://cunit.sourceforge.net/ 00:05:15.671 00:05:15.671 00:05:15.671 Suite: memory 00:05:15.671 Test: test ... 
00:05:15.671 register 0x200000200000 2097152 00:05:15.671 malloc 3145728 00:05:15.671 register 0x200000400000 4194304 00:05:15.671 buf 0x200000500000 len 3145728 PASSED 00:05:15.671 malloc 64 00:05:15.671 buf 0x2000004fff40 len 64 PASSED 00:05:15.671 malloc 4194304 00:05:15.671 register 0x200000800000 6291456 00:05:15.671 buf 0x200000a00000 len 4194304 PASSED 00:05:15.671 free 0x200000500000 3145728 00:05:15.671 free 0x2000004fff40 64 00:05:15.671 unregister 0x200000400000 4194304 PASSED 00:05:15.671 free 0x200000a00000 4194304 00:05:15.671 unregister 0x200000800000 6291456 PASSED 00:05:15.671 malloc 8388608 00:05:15.671 register 0x200000400000 10485760 00:05:15.671 buf 0x200000600000 len 8388608 PASSED 00:05:15.671 free 0x200000600000 8388608 00:05:15.671 unregister 0x200000400000 10485760 PASSED 00:05:15.671 passed 00:05:15.671 00:05:15.671 Run Summary: Type Total Ran Passed Failed Inactive 00:05:15.671 suites 1 1 n/a 0 0 00:05:15.671 tests 1 1 1 0 0 00:05:15.671 asserts 15 15 15 0 n/a 00:05:15.671 00:05:15.671 Elapsed time = 0.006 seconds 00:05:15.671 00:05:15.671 real 0m0.087s 00:05:15.671 user 0m0.032s 00:05:15.671 sys 0m0.055s 00:05:15.671 15:44:21 env.env_mem_callbacks -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:15.671 15:44:21 env.env_mem_callbacks -- common/autotest_common.sh@10 -- # set +x 00:05:15.671 ************************************ 00:05:15.671 END TEST env_mem_callbacks 00:05:15.671 ************************************ 00:05:15.931 00:05:15.931 real 0m6.349s 00:05:15.931 user 0m4.390s 00:05:15.931 sys 0m1.029s 00:05:15.931 15:44:21 env -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:15.931 15:44:21 env -- common/autotest_common.sh@10 -- # set +x 00:05:15.931 ************************************ 00:05:15.931 END TEST env 00:05:15.931 ************************************ 00:05:15.931 15:44:21 -- spdk/autotest.sh@169 -- # run_test rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:15.931 15:44:21 -- 
common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:15.931 15:44:21 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:15.931 15:44:21 -- common/autotest_common.sh@10 -- # set +x 00:05:15.931 ************************************ 00:05:15.931 START TEST rpc 00:05:15.931 ************************************ 00:05:15.931 15:44:21 rpc -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/rpc.sh 00:05:15.931 * Looking for test storage... 00:05:15.931 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:15.931 15:44:21 rpc -- rpc/rpc.sh@65 -- # spdk_pid=2583201 00:05:15.931 15:44:21 rpc -- rpc/rpc.sh@66 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:15.931 15:44:21 rpc -- rpc/rpc.sh@64 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -e bdev 00:05:15.931 15:44:21 rpc -- rpc/rpc.sh@67 -- # waitforlisten 2583201 00:05:15.931 15:44:21 rpc -- common/autotest_common.sh@830 -- # '[' -z 2583201 ']' 00:05:15.931 15:44:21 rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:15.931 15:44:21 rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:05:15.931 15:44:21 rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:15.931 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:15.931 15:44:21 rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:05:15.931 15:44:21 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:15.931 [2024-06-10 15:44:21.416774] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:05:15.931 [2024-06-10 15:44:21.416830] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2583201 ] 00:05:16.191 [2024-06-10 15:44:21.513418] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:16.191 [2024-06-10 15:44:21.611869] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask bdev specified. 00:05:16.191 [2024-06-10 15:44:21.611911] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s spdk_tgt -p 2583201' to capture a snapshot of events at runtime. 00:05:16.191 [2024-06-10 15:44:21.611923] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:05:16.191 [2024-06-10 15:44:21.611933] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:05:16.191 [2024-06-10 15:44:21.611940] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/spdk_tgt_trace.pid2583201 for offline analysis/debug. 
00:05:16.191 [2024-06-10 15:44:21.611968] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:05:17.130 15:44:22 rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:05:17.130 15:44:22 rpc -- common/autotest_common.sh@863 -- # return 0 00:05:17.130 15:44:22 rpc -- rpc/rpc.sh@69 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:17.130 15:44:22 rpc -- rpc/rpc.sh@69 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:17.130 15:44:22 rpc -- rpc/rpc.sh@72 -- # rpc=rpc_cmd 00:05:17.131 15:44:22 rpc -- rpc/rpc.sh@73 -- # run_test rpc_integrity rpc_integrity 00:05:17.131 15:44:22 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:17.131 15:44:22 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:17.131 15:44:22 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:17.131 ************************************ 00:05:17.131 START TEST rpc_integrity 00:05:17.131 ************************************ 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@1124 -- # rpc_integrity 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@12 -- # 
bdevs='[]' 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc0 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:17.131 { 00:05:17.131 "name": "Malloc0", 00:05:17.131 "aliases": [ 00:05:17.131 "d5040a5a-deb6-4aae-9e8d-5c5e5d8a2fc7" 00:05:17.131 ], 00:05:17.131 "product_name": "Malloc disk", 00:05:17.131 "block_size": 512, 00:05:17.131 "num_blocks": 16384, 00:05:17.131 "uuid": "d5040a5a-deb6-4aae-9e8d-5c5e5d8a2fc7", 00:05:17.131 "assigned_rate_limits": { 00:05:17.131 "rw_ios_per_sec": 0, 00:05:17.131 "rw_mbytes_per_sec": 0, 00:05:17.131 "r_mbytes_per_sec": 0, 00:05:17.131 "w_mbytes_per_sec": 0 00:05:17.131 }, 00:05:17.131 "claimed": false, 00:05:17.131 "zoned": false, 00:05:17.131 "supported_io_types": { 00:05:17.131 "read": true, 00:05:17.131 "write": true, 00:05:17.131 "unmap": true, 00:05:17.131 "write_zeroes": true, 00:05:17.131 "flush": true, 00:05:17.131 "reset": true, 00:05:17.131 "compare": false, 00:05:17.131 "compare_and_write": false, 00:05:17.131 "abort": true, 00:05:17.131 "nvme_admin": false, 00:05:17.131 "nvme_io": false 00:05:17.131 }, 00:05:17.131 
"memory_domains": [ 00:05:17.131 { 00:05:17.131 "dma_device_id": "system", 00:05:17.131 "dma_device_type": 1 00:05:17.131 }, 00:05:17.131 { 00:05:17.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:17.131 "dma_device_type": 2 00:05:17.131 } 00:05:17.131 ], 00:05:17.131 "driver_specific": {} 00:05:17.131 } 00:05:17.131 ]' 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc0 -p Passthru0 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:17.131 [2024-06-10 15:44:22.530817] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc0 00:05:17.131 [2024-06-10 15:44:22.530855] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:17.131 [2024-06-10 15:44:22.530871] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16cac50 00:05:17.131 [2024-06-10 15:44:22.530881] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:17.131 [2024-06-10 15:44:22.532483] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:17.131 [2024-06-10 15:44:22.532514] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:17.131 Passthru0 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:17.131 15:44:22 rpc.rpc_integrity 
-- rpc/rpc.sh@20 -- # bdevs='[ 00:05:17.131 { 00:05:17.131 "name": "Malloc0", 00:05:17.131 "aliases": [ 00:05:17.131 "d5040a5a-deb6-4aae-9e8d-5c5e5d8a2fc7" 00:05:17.131 ], 00:05:17.131 "product_name": "Malloc disk", 00:05:17.131 "block_size": 512, 00:05:17.131 "num_blocks": 16384, 00:05:17.131 "uuid": "d5040a5a-deb6-4aae-9e8d-5c5e5d8a2fc7", 00:05:17.131 "assigned_rate_limits": { 00:05:17.131 "rw_ios_per_sec": 0, 00:05:17.131 "rw_mbytes_per_sec": 0, 00:05:17.131 "r_mbytes_per_sec": 0, 00:05:17.131 "w_mbytes_per_sec": 0 00:05:17.131 }, 00:05:17.131 "claimed": true, 00:05:17.131 "claim_type": "exclusive_write", 00:05:17.131 "zoned": false, 00:05:17.131 "supported_io_types": { 00:05:17.131 "read": true, 00:05:17.131 "write": true, 00:05:17.131 "unmap": true, 00:05:17.131 "write_zeroes": true, 00:05:17.131 "flush": true, 00:05:17.131 "reset": true, 00:05:17.131 "compare": false, 00:05:17.131 "compare_and_write": false, 00:05:17.131 "abort": true, 00:05:17.131 "nvme_admin": false, 00:05:17.131 "nvme_io": false 00:05:17.131 }, 00:05:17.131 "memory_domains": [ 00:05:17.131 { 00:05:17.131 "dma_device_id": "system", 00:05:17.131 "dma_device_type": 1 00:05:17.131 }, 00:05:17.131 { 00:05:17.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:17.131 "dma_device_type": 2 00:05:17.131 } 00:05:17.131 ], 00:05:17.131 "driver_specific": {} 00:05:17.131 }, 00:05:17.131 { 00:05:17.131 "name": "Passthru0", 00:05:17.131 "aliases": [ 00:05:17.131 "6bcae02a-a7c2-5984-91c0-4b7fb0793256" 00:05:17.131 ], 00:05:17.131 "product_name": "passthru", 00:05:17.131 "block_size": 512, 00:05:17.131 "num_blocks": 16384, 00:05:17.131 "uuid": "6bcae02a-a7c2-5984-91c0-4b7fb0793256", 00:05:17.131 "assigned_rate_limits": { 00:05:17.131 "rw_ios_per_sec": 0, 00:05:17.131 "rw_mbytes_per_sec": 0, 00:05:17.131 "r_mbytes_per_sec": 0, 00:05:17.131 "w_mbytes_per_sec": 0 00:05:17.131 }, 00:05:17.131 "claimed": false, 00:05:17.131 "zoned": false, 00:05:17.131 "supported_io_types": { 00:05:17.131 "read": true, 
00:05:17.131 "write": true, 00:05:17.131 "unmap": true, 00:05:17.131 "write_zeroes": true, 00:05:17.131 "flush": true, 00:05:17.131 "reset": true, 00:05:17.131 "compare": false, 00:05:17.131 "compare_and_write": false, 00:05:17.131 "abort": true, 00:05:17.131 "nvme_admin": false, 00:05:17.131 "nvme_io": false 00:05:17.131 }, 00:05:17.131 "memory_domains": [ 00:05:17.131 { 00:05:17.131 "dma_device_id": "system", 00:05:17.131 "dma_device_type": 1 00:05:17.131 }, 00:05:17.131 { 00:05:17.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:17.131 "dma_device_type": 2 00:05:17.131 } 00:05:17.131 ], 00:05:17.131 "driver_specific": { 00:05:17.131 "passthru": { 00:05:17.131 "name": "Passthru0", 00:05:17.131 "base_bdev_name": "Malloc0" 00:05:17.131 } 00:05:17.131 } 00:05:17.131 } 00:05:17.131 ]' 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc0 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:17.131 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:17.131 15:44:22 rpc.rpc_integrity -- 
common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:17.131 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:17.391 15:44:22 rpc.rpc_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:17.391 00:05:17.391 real 0m0.288s 00:05:17.391 user 0m0.188s 00:05:17.391 sys 0m0.034s 00:05:17.391 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:17.391 15:44:22 rpc.rpc_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:17.391 ************************************ 00:05:17.391 END TEST rpc_integrity 00:05:17.391 ************************************ 00:05:17.391 15:44:22 rpc -- rpc/rpc.sh@74 -- # run_test rpc_plugins rpc_plugins 00:05:17.391 15:44:22 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:17.391 15:44:22 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:17.391 15:44:22 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:17.391 ************************************ 00:05:17.391 START TEST rpc_plugins 00:05:17.391 ************************************ 00:05:17.391 15:44:22 rpc.rpc_plugins -- common/autotest_common.sh@1124 -- # rpc_plugins 00:05:17.391 15:44:22 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # rpc_cmd --plugin rpc_plugin create_malloc 00:05:17.391 15:44:22 rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:17.391 15:44:22 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:17.391 15:44:22 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:17.391 15:44:22 rpc.rpc_plugins -- rpc/rpc.sh@30 -- # malloc=Malloc1 00:05:17.391 15:44:22 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # rpc_cmd bdev_get_bdevs 00:05:17.391 15:44:22 rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:17.391 15:44:22 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:17.391 15:44:22 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 
00:05:17.391 15:44:22 rpc.rpc_plugins -- rpc/rpc.sh@31 -- # bdevs='[ 00:05:17.391 { 00:05:17.391 "name": "Malloc1", 00:05:17.391 "aliases": [ 00:05:17.391 "460db0f6-c34f-475c-8b90-5ff1f0619563" 00:05:17.391 ], 00:05:17.391 "product_name": "Malloc disk", 00:05:17.391 "block_size": 4096, 00:05:17.391 "num_blocks": 256, 00:05:17.391 "uuid": "460db0f6-c34f-475c-8b90-5ff1f0619563", 00:05:17.391 "assigned_rate_limits": { 00:05:17.391 "rw_ios_per_sec": 0, 00:05:17.391 "rw_mbytes_per_sec": 0, 00:05:17.391 "r_mbytes_per_sec": 0, 00:05:17.391 "w_mbytes_per_sec": 0 00:05:17.391 }, 00:05:17.391 "claimed": false, 00:05:17.391 "zoned": false, 00:05:17.391 "supported_io_types": { 00:05:17.391 "read": true, 00:05:17.391 "write": true, 00:05:17.391 "unmap": true, 00:05:17.391 "write_zeroes": true, 00:05:17.391 "flush": true, 00:05:17.391 "reset": true, 00:05:17.391 "compare": false, 00:05:17.391 "compare_and_write": false, 00:05:17.391 "abort": true, 00:05:17.391 "nvme_admin": false, 00:05:17.391 "nvme_io": false 00:05:17.391 }, 00:05:17.391 "memory_domains": [ 00:05:17.391 { 00:05:17.391 "dma_device_id": "system", 00:05:17.391 "dma_device_type": 1 00:05:17.391 }, 00:05:17.391 { 00:05:17.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:17.391 "dma_device_type": 2 00:05:17.391 } 00:05:17.391 ], 00:05:17.391 "driver_specific": {} 00:05:17.391 } 00:05:17.391 ]' 00:05:17.391 15:44:22 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # jq length 00:05:17.391 15:44:22 rpc.rpc_plugins -- rpc/rpc.sh@32 -- # '[' 1 == 1 ']' 00:05:17.391 15:44:22 rpc.rpc_plugins -- rpc/rpc.sh@34 -- # rpc_cmd --plugin rpc_plugin delete_malloc Malloc1 00:05:17.391 15:44:22 rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:17.391 15:44:22 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:17.391 15:44:22 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:17.391 15:44:22 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # rpc_cmd bdev_get_bdevs 00:05:17.391 15:44:22 
rpc.rpc_plugins -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:17.391 15:44:22 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:17.391 15:44:22 rpc.rpc_plugins -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:17.391 15:44:22 rpc.rpc_plugins -- rpc/rpc.sh@35 -- # bdevs='[]' 00:05:17.391 15:44:22 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # jq length 00:05:17.391 15:44:22 rpc.rpc_plugins -- rpc/rpc.sh@36 -- # '[' 0 == 0 ']' 00:05:17.391 00:05:17.391 real 0m0.140s 00:05:17.391 user 0m0.097s 00:05:17.391 sys 0m0.011s 00:05:17.391 15:44:22 rpc.rpc_plugins -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:17.391 15:44:22 rpc.rpc_plugins -- common/autotest_common.sh@10 -- # set +x 00:05:17.391 ************************************ 00:05:17.391 END TEST rpc_plugins 00:05:17.391 ************************************ 00:05:17.650 15:44:22 rpc -- rpc/rpc.sh@75 -- # run_test rpc_trace_cmd_test rpc_trace_cmd_test 00:05:17.650 15:44:22 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:17.650 15:44:22 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:17.650 15:44:22 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:17.650 ************************************ 00:05:17.650 START TEST rpc_trace_cmd_test 00:05:17.650 ************************************ 00:05:17.650 15:44:22 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1124 -- # rpc_trace_cmd_test 00:05:17.650 15:44:22 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@40 -- # local info 00:05:17.650 15:44:22 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # rpc_cmd trace_get_info 00:05:17.650 15:44:22 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:17.650 15:44:22 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:17.650 15:44:22 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:17.650 15:44:22 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@42 -- # info='{ 00:05:17.650 "tpoint_shm_path": 
"/dev/shm/spdk_tgt_trace.pid2583201", 00:05:17.650 "tpoint_group_mask": "0x8", 00:05:17.650 "iscsi_conn": { 00:05:17.650 "mask": "0x2", 00:05:17.650 "tpoint_mask": "0x0" 00:05:17.650 }, 00:05:17.650 "scsi": { 00:05:17.650 "mask": "0x4", 00:05:17.650 "tpoint_mask": "0x0" 00:05:17.650 }, 00:05:17.650 "bdev": { 00:05:17.650 "mask": "0x8", 00:05:17.650 "tpoint_mask": "0xffffffffffffffff" 00:05:17.650 }, 00:05:17.650 "nvmf_rdma": { 00:05:17.650 "mask": "0x10", 00:05:17.650 "tpoint_mask": "0x0" 00:05:17.651 }, 00:05:17.651 "nvmf_tcp": { 00:05:17.651 "mask": "0x20", 00:05:17.651 "tpoint_mask": "0x0" 00:05:17.651 }, 00:05:17.651 "ftl": { 00:05:17.651 "mask": "0x40", 00:05:17.651 "tpoint_mask": "0x0" 00:05:17.651 }, 00:05:17.651 "blobfs": { 00:05:17.651 "mask": "0x80", 00:05:17.651 "tpoint_mask": "0x0" 00:05:17.651 }, 00:05:17.651 "dsa": { 00:05:17.651 "mask": "0x200", 00:05:17.651 "tpoint_mask": "0x0" 00:05:17.651 }, 00:05:17.651 "thread": { 00:05:17.651 "mask": "0x400", 00:05:17.651 "tpoint_mask": "0x0" 00:05:17.651 }, 00:05:17.651 "nvme_pcie": { 00:05:17.651 "mask": "0x800", 00:05:17.651 "tpoint_mask": "0x0" 00:05:17.651 }, 00:05:17.651 "iaa": { 00:05:17.651 "mask": "0x1000", 00:05:17.651 "tpoint_mask": "0x0" 00:05:17.651 }, 00:05:17.651 "nvme_tcp": { 00:05:17.651 "mask": "0x2000", 00:05:17.651 "tpoint_mask": "0x0" 00:05:17.651 }, 00:05:17.651 "bdev_nvme": { 00:05:17.651 "mask": "0x4000", 00:05:17.651 "tpoint_mask": "0x0" 00:05:17.651 }, 00:05:17.651 "sock": { 00:05:17.651 "mask": "0x8000", 00:05:17.651 "tpoint_mask": "0x0" 00:05:17.651 } 00:05:17.651 }' 00:05:17.651 15:44:22 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # jq length 00:05:17.651 15:44:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@43 -- # '[' 16 -gt 2 ']' 00:05:17.651 15:44:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # jq 'has("tpoint_group_mask")' 00:05:17.651 15:44:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@44 -- # '[' true = true ']' 00:05:17.651 15:44:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # jq 
'has("tpoint_shm_path")' 00:05:17.651 15:44:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@45 -- # '[' true = true ']' 00:05:17.651 15:44:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # jq 'has("bdev")' 00:05:17.651 15:44:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@46 -- # '[' true = true ']' 00:05:17.651 15:44:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # jq -r .bdev.tpoint_mask 00:05:17.910 15:44:23 rpc.rpc_trace_cmd_test -- rpc/rpc.sh@47 -- # '[' 0xffffffffffffffff '!=' 0x0 ']' 00:05:17.910 00:05:17.910 real 0m0.251s 00:05:17.910 user 0m0.210s 00:05:17.910 sys 0m0.031s 00:05:17.910 15:44:23 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:17.910 15:44:23 rpc.rpc_trace_cmd_test -- common/autotest_common.sh@10 -- # set +x 00:05:17.910 ************************************ 00:05:17.910 END TEST rpc_trace_cmd_test 00:05:17.910 ************************************ 00:05:17.910 15:44:23 rpc -- rpc/rpc.sh@76 -- # [[ 0 -eq 1 ]] 00:05:17.910 15:44:23 rpc -- rpc/rpc.sh@80 -- # rpc=rpc_cmd 00:05:17.910 15:44:23 rpc -- rpc/rpc.sh@81 -- # run_test rpc_daemon_integrity rpc_integrity 00:05:17.910 15:44:23 rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:17.910 15:44:23 rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:17.910 15:44:23 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:17.910 ************************************ 00:05:17.910 START TEST rpc_daemon_integrity 00:05:17.910 ************************************ 00:05:17.910 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1124 -- # rpc_integrity 00:05:17.910 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # rpc_cmd bdev_get_bdevs 00:05:17.910 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:17.910 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:17.910 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:17.910 15:44:23 
rpc.rpc_daemon_integrity -- rpc/rpc.sh@12 -- # bdevs='[]' 00:05:17.910 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # jq length 00:05:17.910 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@13 -- # '[' 0 == 0 ']' 00:05:17.910 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # rpc_cmd bdev_malloc_create 8 512 00:05:17.910 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:17.910 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:17.910 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:17.910 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@15 -- # malloc=Malloc2 00:05:17.910 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # rpc_cmd bdev_get_bdevs 00:05:17.910 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:17.910 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:17.911 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:17.911 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@16 -- # bdevs='[ 00:05:17.911 { 00:05:17.911 "name": "Malloc2", 00:05:17.911 "aliases": [ 00:05:17.911 "e4564a48-ce02-44b6-a912-b1a503c538eb" 00:05:17.911 ], 00:05:17.911 "product_name": "Malloc disk", 00:05:17.911 "block_size": 512, 00:05:17.911 "num_blocks": 16384, 00:05:17.911 "uuid": "e4564a48-ce02-44b6-a912-b1a503c538eb", 00:05:17.911 "assigned_rate_limits": { 00:05:17.911 "rw_ios_per_sec": 0, 00:05:17.911 "rw_mbytes_per_sec": 0, 00:05:17.911 "r_mbytes_per_sec": 0, 00:05:17.911 "w_mbytes_per_sec": 0 00:05:17.911 }, 00:05:17.911 "claimed": false, 00:05:17.911 "zoned": false, 00:05:17.911 "supported_io_types": { 00:05:17.911 "read": true, 00:05:17.911 "write": true, 00:05:17.911 "unmap": true, 00:05:17.911 "write_zeroes": true, 00:05:17.911 "flush": true, 00:05:17.911 "reset": true, 00:05:17.911 "compare": false, 00:05:17.911 "compare_and_write": 
false, 00:05:17.911 "abort": true, 00:05:17.911 "nvme_admin": false, 00:05:17.911 "nvme_io": false 00:05:17.911 }, 00:05:17.911 "memory_domains": [ 00:05:17.911 { 00:05:17.911 "dma_device_id": "system", 00:05:17.911 "dma_device_type": 1 00:05:17.911 }, 00:05:17.911 { 00:05:17.911 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:17.911 "dma_device_type": 2 00:05:17.911 } 00:05:17.911 ], 00:05:17.911 "driver_specific": {} 00:05:17.911 } 00:05:17.911 ]' 00:05:17.911 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # jq length 00:05:17.911 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@17 -- # '[' 1 == 1 ']' 00:05:17.911 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@19 -- # rpc_cmd bdev_passthru_create -b Malloc2 -p Passthru0 00:05:17.911 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:17.911 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:17.911 [2024-06-10 15:44:23.401335] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc2 00:05:17.911 [2024-06-10 15:44:23.401370] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:17.911 [2024-06-10 15:44:23.401385] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1876460 00:05:17.911 [2024-06-10 15:44:23.401395] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:17.911 [2024-06-10 15:44:23.402851] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:17.911 [2024-06-10 15:44:23.402876] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: Passthru0 00:05:17.911 Passthru0 00:05:17.911 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:17.911 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # rpc_cmd bdev_get_bdevs 00:05:17.911 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:17.911 15:44:23 
rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:18.170 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:18.170 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@20 -- # bdevs='[ 00:05:18.170 { 00:05:18.170 "name": "Malloc2", 00:05:18.170 "aliases": [ 00:05:18.170 "e4564a48-ce02-44b6-a912-b1a503c538eb" 00:05:18.170 ], 00:05:18.170 "product_name": "Malloc disk", 00:05:18.170 "block_size": 512, 00:05:18.170 "num_blocks": 16384, 00:05:18.170 "uuid": "e4564a48-ce02-44b6-a912-b1a503c538eb", 00:05:18.170 "assigned_rate_limits": { 00:05:18.170 "rw_ios_per_sec": 0, 00:05:18.170 "rw_mbytes_per_sec": 0, 00:05:18.170 "r_mbytes_per_sec": 0, 00:05:18.170 "w_mbytes_per_sec": 0 00:05:18.170 }, 00:05:18.170 "claimed": true, 00:05:18.170 "claim_type": "exclusive_write", 00:05:18.170 "zoned": false, 00:05:18.171 "supported_io_types": { 00:05:18.171 "read": true, 00:05:18.171 "write": true, 00:05:18.171 "unmap": true, 00:05:18.171 "write_zeroes": true, 00:05:18.171 "flush": true, 00:05:18.171 "reset": true, 00:05:18.171 "compare": false, 00:05:18.171 "compare_and_write": false, 00:05:18.171 "abort": true, 00:05:18.171 "nvme_admin": false, 00:05:18.171 "nvme_io": false 00:05:18.171 }, 00:05:18.171 "memory_domains": [ 00:05:18.171 { 00:05:18.171 "dma_device_id": "system", 00:05:18.171 "dma_device_type": 1 00:05:18.171 }, 00:05:18.171 { 00:05:18.171 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:18.171 "dma_device_type": 2 00:05:18.171 } 00:05:18.171 ], 00:05:18.171 "driver_specific": {} 00:05:18.171 }, 00:05:18.171 { 00:05:18.171 "name": "Passthru0", 00:05:18.171 "aliases": [ 00:05:18.171 "648e5976-2df0-5ab3-b124-be6387f64d43" 00:05:18.171 ], 00:05:18.171 "product_name": "passthru", 00:05:18.171 "block_size": 512, 00:05:18.171 "num_blocks": 16384, 00:05:18.171 "uuid": "648e5976-2df0-5ab3-b124-be6387f64d43", 00:05:18.171 "assigned_rate_limits": { 00:05:18.171 "rw_ios_per_sec": 0, 00:05:18.171 "rw_mbytes_per_sec": 0, 
00:05:18.171 "r_mbytes_per_sec": 0, 00:05:18.171 "w_mbytes_per_sec": 0 00:05:18.171 }, 00:05:18.171 "claimed": false, 00:05:18.171 "zoned": false, 00:05:18.171 "supported_io_types": { 00:05:18.171 "read": true, 00:05:18.171 "write": true, 00:05:18.171 "unmap": true, 00:05:18.171 "write_zeroes": true, 00:05:18.171 "flush": true, 00:05:18.171 "reset": true, 00:05:18.171 "compare": false, 00:05:18.171 "compare_and_write": false, 00:05:18.171 "abort": true, 00:05:18.171 "nvme_admin": false, 00:05:18.171 "nvme_io": false 00:05:18.171 }, 00:05:18.171 "memory_domains": [ 00:05:18.171 { 00:05:18.171 "dma_device_id": "system", 00:05:18.171 "dma_device_type": 1 00:05:18.171 }, 00:05:18.171 { 00:05:18.171 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:05:18.171 "dma_device_type": 2 00:05:18.171 } 00:05:18.171 ], 00:05:18.171 "driver_specific": { 00:05:18.171 "passthru": { 00:05:18.171 "name": "Passthru0", 00:05:18.171 "base_bdev_name": "Malloc2" 00:05:18.171 } 00:05:18.171 } 00:05:18.171 } 00:05:18.171 ]' 00:05:18.171 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # jq length 00:05:18.171 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@21 -- # '[' 2 == 2 ']' 00:05:18.171 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@23 -- # rpc_cmd bdev_passthru_delete Passthru0 00:05:18.171 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:18.171 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:18.171 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:18.171 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@24 -- # rpc_cmd bdev_malloc_delete Malloc2 00:05:18.171 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:18.171 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:18.171 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:18.171 15:44:23 
rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # rpc_cmd bdev_get_bdevs 00:05:18.171 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:18.171 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:18.171 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:18.171 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@25 -- # bdevs='[]' 00:05:18.171 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # jq length 00:05:18.171 15:44:23 rpc.rpc_daemon_integrity -- rpc/rpc.sh@26 -- # '[' 0 == 0 ']' 00:05:18.171 00:05:18.171 real 0m0.277s 00:05:18.171 user 0m0.189s 00:05:18.171 sys 0m0.026s 00:05:18.171 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:18.171 15:44:23 rpc.rpc_daemon_integrity -- common/autotest_common.sh@10 -- # set +x 00:05:18.171 ************************************ 00:05:18.171 END TEST rpc_daemon_integrity 00:05:18.171 ************************************ 00:05:18.171 15:44:23 rpc -- rpc/rpc.sh@83 -- # trap - SIGINT SIGTERM EXIT 00:05:18.171 15:44:23 rpc -- rpc/rpc.sh@84 -- # killprocess 2583201 00:05:18.171 15:44:23 rpc -- common/autotest_common.sh@949 -- # '[' -z 2583201 ']' 00:05:18.171 15:44:23 rpc -- common/autotest_common.sh@953 -- # kill -0 2583201 00:05:18.171 15:44:23 rpc -- common/autotest_common.sh@954 -- # uname 00:05:18.171 15:44:23 rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:05:18.171 15:44:23 rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2583201 00:05:18.171 15:44:23 rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:05:18.171 15:44:23 rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:05:18.171 15:44:23 rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2583201' 00:05:18.171 killing process with pid 2583201 00:05:18.171 15:44:23 rpc -- common/autotest_common.sh@968 -- # kill 2583201 
00:05:18.171 15:44:23 rpc -- common/autotest_common.sh@973 -- # wait 2583201 00:05:18.739 00:05:18.739 real 0m2.698s 00:05:18.739 user 0m3.539s 00:05:18.739 sys 0m0.729s 00:05:18.739 15:44:23 rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:18.739 15:44:23 rpc -- common/autotest_common.sh@10 -- # set +x 00:05:18.739 ************************************ 00:05:18.739 END TEST rpc 00:05:18.739 ************************************ 00:05:18.739 15:44:23 -- spdk/autotest.sh@170 -- # run_test skip_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:18.739 15:44:23 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:18.739 15:44:23 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:18.739 15:44:23 -- common/autotest_common.sh@10 -- # set +x 00:05:18.739 ************************************ 00:05:18.739 START TEST skip_rpc 00:05:18.739 ************************************ 00:05:18.739 15:44:24 skip_rpc -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/skip_rpc.sh 00:05:18.739 * Looking for test storage... 
00:05:18.739 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc 00:05:18.739 15:44:24 skip_rpc -- rpc/skip_rpc.sh@11 -- # CONFIG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:18.739 15:44:24 skip_rpc -- rpc/skip_rpc.sh@12 -- # LOG_PATH=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:18.739 15:44:24 skip_rpc -- rpc/skip_rpc.sh@73 -- # run_test skip_rpc test_skip_rpc 00:05:18.739 15:44:24 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:18.739 15:44:24 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:18.739 15:44:24 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:18.739 ************************************ 00:05:18.739 START TEST skip_rpc 00:05:18.739 ************************************ 00:05:18.739 15:44:24 skip_rpc.skip_rpc -- common/autotest_common.sh@1124 -- # test_skip_rpc 00:05:18.739 15:44:24 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@16 -- # local spdk_pid=2583825 00:05:18.739 15:44:24 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@18 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:18.740 15:44:24 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 00:05:18.740 15:44:24 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@19 -- # sleep 5 00:05:18.740 [2024-06-10 15:44:24.208758] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:05:18.740 [2024-06-10 15:44:24.208816] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2583825 ] 00:05:18.999 [2024-06-10 15:44:24.308021] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:18.999 [2024-06-10 15:44:24.400004] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@21 -- # NOT rpc_cmd spdk_get_version 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@649 -- # local es=0 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@651 -- # valid_exec_arg rpc_cmd spdk_get_version 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@637 -- # local arg=rpc_cmd 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@641 -- # type -t rpc_cmd 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # rpc_cmd spdk_get_version 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@588 -- # [[ 1 == 0 ]] 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@652 -- # es=1 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@22 -- # trap - 
SIGINT SIGTERM EXIT 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- rpc/skip_rpc.sh@23 -- # killprocess 2583825 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@949 -- # '[' -z 2583825 ']' 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@953 -- # kill -0 2583825 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # uname 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2583825 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2583825' 00:05:24.274 killing process with pid 2583825 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@968 -- # kill 2583825 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@973 -- # wait 2583825 00:05:24.274 00:05:24.274 real 0m5.401s 00:05:24.274 user 0m5.104s 00:05:24.274 sys 0m0.303s 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:24.274 15:44:29 skip_rpc.skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:24.274 ************************************ 00:05:24.274 END TEST skip_rpc 00:05:24.274 ************************************ 00:05:24.274 15:44:29 skip_rpc -- rpc/skip_rpc.sh@74 -- # run_test skip_rpc_with_json test_skip_rpc_with_json 00:05:24.274 15:44:29 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:24.274 15:44:29 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:24.274 15:44:29 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:24.274 ************************************ 00:05:24.274 START TEST skip_rpc_with_json 
00:05:24.274 ************************************ 00:05:24.274 15:44:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1124 -- # test_skip_rpc_with_json 00:05:24.274 15:44:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@44 -- # gen_json_config 00:05:24.274 15:44:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:24.274 15:44:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@28 -- # local spdk_pid=2584760 00:05:24.274 15:44:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@30 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:24.274 15:44:29 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@31 -- # waitforlisten 2584760 00:05:24.274 15:44:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@830 -- # '[' -z 2584760 ']' 00:05:24.274 15:44:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:24.274 15:44:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@835 -- # local max_retries=100 00:05:24.274 15:44:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:24.274 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:24.274 15:44:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@839 -- # xtrace_disable 00:05:24.274 15:44:29 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:24.274 [2024-06-10 15:44:29.637890] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:05:24.274 [2024-06-10 15:44:29.637926] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2584760 ] 00:05:24.274 [2024-06-10 15:44:29.722767] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:24.534 [2024-06-10 15:44:29.818674] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:05:24.793 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:05:24.793 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@863 -- # return 0 00:05:24.793 15:44:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_get_transports --trtype tcp 00:05:24.793 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:24.793 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:24.793 [2024-06-10 15:44:30.070160] nvmf_rpc.c:2558:rpc_nvmf_get_transports: *ERROR*: transport 'tcp' does not exist 00:05:24.793 request: 00:05:24.793 { 00:05:24.793 "trtype": "tcp", 00:05:24.793 "method": "nvmf_get_transports", 00:05:24.793 "req_id": 1 00:05:24.793 } 00:05:24.793 Got JSON-RPC error response 00:05:24.793 response: 00:05:24.793 { 00:05:24.793 "code": -19, 00:05:24.793 "message": "No such device" 00:05:24.793 } 00:05:24.793 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@588 -- # [[ 1 == 0 ]] 00:05:24.793 15:44:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@34 -- # rpc_cmd nvmf_create_transport -t tcp 00:05:24.793 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:24.793 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:24.793 [2024-06-10 15:44:30.082292] tcp.c: 724:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:05:24.793 15:44:30 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:24.793 15:44:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@36 -- # rpc_cmd save_config 00:05:24.793 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@560 -- # xtrace_disable 00:05:24.793 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:24.793 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:05:24.793 15:44:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@37 -- # cat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:24.793 { 00:05:24.793 "subsystems": [ 00:05:24.793 { 00:05:24.793 "subsystem": "keyring", 00:05:24.793 "config": [] 00:05:24.793 }, 00:05:24.793 { 00:05:24.793 "subsystem": "iobuf", 00:05:24.793 "config": [ 00:05:24.793 { 00:05:24.793 "method": "iobuf_set_options", 00:05:24.793 "params": { 00:05:24.793 "small_pool_count": 8192, 00:05:24.793 "large_pool_count": 1024, 00:05:24.793 "small_bufsize": 8192, 00:05:24.794 "large_bufsize": 135168 00:05:24.794 } 00:05:24.794 } 00:05:24.794 ] 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "subsystem": "sock", 00:05:24.794 "config": [ 00:05:24.794 { 00:05:24.794 "method": "sock_set_default_impl", 00:05:24.794 "params": { 00:05:24.794 "impl_name": "posix" 00:05:24.794 } 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "method": "sock_impl_set_options", 00:05:24.794 "params": { 00:05:24.794 "impl_name": "ssl", 00:05:24.794 "recv_buf_size": 4096, 00:05:24.794 "send_buf_size": 4096, 00:05:24.794 "enable_recv_pipe": true, 00:05:24.794 "enable_quickack": false, 00:05:24.794 "enable_placement_id": 0, 00:05:24.794 "enable_zerocopy_send_server": true, 00:05:24.794 "enable_zerocopy_send_client": false, 00:05:24.794 "zerocopy_threshold": 0, 00:05:24.794 "tls_version": 0, 00:05:24.794 "enable_ktls": false 00:05:24.794 } 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "method": "sock_impl_set_options", 00:05:24.794 "params": { 
00:05:24.794 "impl_name": "posix", 00:05:24.794 "recv_buf_size": 2097152, 00:05:24.794 "send_buf_size": 2097152, 00:05:24.794 "enable_recv_pipe": true, 00:05:24.794 "enable_quickack": false, 00:05:24.794 "enable_placement_id": 0, 00:05:24.794 "enable_zerocopy_send_server": true, 00:05:24.794 "enable_zerocopy_send_client": false, 00:05:24.794 "zerocopy_threshold": 0, 00:05:24.794 "tls_version": 0, 00:05:24.794 "enable_ktls": false 00:05:24.794 } 00:05:24.794 } 00:05:24.794 ] 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "subsystem": "vmd", 00:05:24.794 "config": [] 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "subsystem": "accel", 00:05:24.794 "config": [ 00:05:24.794 { 00:05:24.794 "method": "accel_set_options", 00:05:24.794 "params": { 00:05:24.794 "small_cache_size": 128, 00:05:24.794 "large_cache_size": 16, 00:05:24.794 "task_count": 2048, 00:05:24.794 "sequence_count": 2048, 00:05:24.794 "buf_count": 2048 00:05:24.794 } 00:05:24.794 } 00:05:24.794 ] 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "subsystem": "bdev", 00:05:24.794 "config": [ 00:05:24.794 { 00:05:24.794 "method": "bdev_set_options", 00:05:24.794 "params": { 00:05:24.794 "bdev_io_pool_size": 65535, 00:05:24.794 "bdev_io_cache_size": 256, 00:05:24.794 "bdev_auto_examine": true, 00:05:24.794 "iobuf_small_cache_size": 128, 00:05:24.794 "iobuf_large_cache_size": 16 00:05:24.794 } 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "method": "bdev_raid_set_options", 00:05:24.794 "params": { 00:05:24.794 "process_window_size_kb": 1024 00:05:24.794 } 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "method": "bdev_iscsi_set_options", 00:05:24.794 "params": { 00:05:24.794 "timeout_sec": 30 00:05:24.794 } 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "method": "bdev_nvme_set_options", 00:05:24.794 "params": { 00:05:24.794 "action_on_timeout": "none", 00:05:24.794 "timeout_us": 0, 00:05:24.794 "timeout_admin_us": 0, 00:05:24.794 "keep_alive_timeout_ms": 10000, 00:05:24.794 "arbitration_burst": 0, 00:05:24.794 
"low_priority_weight": 0, 00:05:24.794 "medium_priority_weight": 0, 00:05:24.794 "high_priority_weight": 0, 00:05:24.794 "nvme_adminq_poll_period_us": 10000, 00:05:24.794 "nvme_ioq_poll_period_us": 0, 00:05:24.794 "io_queue_requests": 0, 00:05:24.794 "delay_cmd_submit": true, 00:05:24.794 "transport_retry_count": 4, 00:05:24.794 "bdev_retry_count": 3, 00:05:24.794 "transport_ack_timeout": 0, 00:05:24.794 "ctrlr_loss_timeout_sec": 0, 00:05:24.794 "reconnect_delay_sec": 0, 00:05:24.794 "fast_io_fail_timeout_sec": 0, 00:05:24.794 "disable_auto_failback": false, 00:05:24.794 "generate_uuids": false, 00:05:24.794 "transport_tos": 0, 00:05:24.794 "nvme_error_stat": false, 00:05:24.794 "rdma_srq_size": 0, 00:05:24.794 "io_path_stat": false, 00:05:24.794 "allow_accel_sequence": false, 00:05:24.794 "rdma_max_cq_size": 0, 00:05:24.794 "rdma_cm_event_timeout_ms": 0, 00:05:24.794 "dhchap_digests": [ 00:05:24.794 "sha256", 00:05:24.794 "sha384", 00:05:24.794 "sha512" 00:05:24.794 ], 00:05:24.794 "dhchap_dhgroups": [ 00:05:24.794 "null", 00:05:24.794 "ffdhe2048", 00:05:24.794 "ffdhe3072", 00:05:24.794 "ffdhe4096", 00:05:24.794 "ffdhe6144", 00:05:24.794 "ffdhe8192" 00:05:24.794 ] 00:05:24.794 } 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "method": "bdev_nvme_set_hotplug", 00:05:24.794 "params": { 00:05:24.794 "period_us": 100000, 00:05:24.794 "enable": false 00:05:24.794 } 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "method": "bdev_wait_for_examine" 00:05:24.794 } 00:05:24.794 ] 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "subsystem": "scsi", 00:05:24.794 "config": null 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "subsystem": "scheduler", 00:05:24.794 "config": [ 00:05:24.794 { 00:05:24.794 "method": "framework_set_scheduler", 00:05:24.794 "params": { 00:05:24.794 "name": "static" 00:05:24.794 } 00:05:24.794 } 00:05:24.794 ] 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "subsystem": "vhost_scsi", 00:05:24.794 "config": [] 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "subsystem": 
"vhost_blk", 00:05:24.794 "config": [] 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "subsystem": "ublk", 00:05:24.794 "config": [] 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "subsystem": "nbd", 00:05:24.794 "config": [] 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "subsystem": "nvmf", 00:05:24.794 "config": [ 00:05:24.794 { 00:05:24.794 "method": "nvmf_set_config", 00:05:24.794 "params": { 00:05:24.794 "discovery_filter": "match_any", 00:05:24.794 "admin_cmd_passthru": { 00:05:24.794 "identify_ctrlr": false 00:05:24.794 } 00:05:24.794 } 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "method": "nvmf_set_max_subsystems", 00:05:24.794 "params": { 00:05:24.794 "max_subsystems": 1024 00:05:24.794 } 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "method": "nvmf_set_crdt", 00:05:24.794 "params": { 00:05:24.794 "crdt1": 0, 00:05:24.794 "crdt2": 0, 00:05:24.794 "crdt3": 0 00:05:24.794 } 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "method": "nvmf_create_transport", 00:05:24.794 "params": { 00:05:24.794 "trtype": "TCP", 00:05:24.794 "max_queue_depth": 128, 00:05:24.794 "max_io_qpairs_per_ctrlr": 127, 00:05:24.794 "in_capsule_data_size": 4096, 00:05:24.794 "max_io_size": 131072, 00:05:24.794 "io_unit_size": 131072, 00:05:24.794 "max_aq_depth": 128, 00:05:24.794 "num_shared_buffers": 511, 00:05:24.794 "buf_cache_size": 4294967295, 00:05:24.794 "dif_insert_or_strip": false, 00:05:24.794 "zcopy": false, 00:05:24.794 "c2h_success": true, 00:05:24.794 "sock_priority": 0, 00:05:24.794 "abort_timeout_sec": 1, 00:05:24.794 "ack_timeout": 0, 00:05:24.794 "data_wr_pool_size": 0 00:05:24.794 } 00:05:24.794 } 00:05:24.794 ] 00:05:24.794 }, 00:05:24.794 { 00:05:24.794 "subsystem": "iscsi", 00:05:24.794 "config": [ 00:05:24.794 { 00:05:24.794 "method": "iscsi_set_options", 00:05:24.794 "params": { 00:05:24.794 "node_base": "iqn.2016-06.io.spdk", 00:05:24.794 "max_sessions": 128, 00:05:24.794 "max_connections_per_session": 2, 00:05:24.794 "max_queue_depth": 64, 00:05:24.794 "default_time2wait": 2, 
00:05:24.794 "default_time2retain": 20, 00:05:24.794 "first_burst_length": 8192, 00:05:24.794 "immediate_data": true, 00:05:24.794 "allow_duplicated_isid": false, 00:05:24.794 "error_recovery_level": 0, 00:05:24.794 "nop_timeout": 60, 00:05:24.794 "nop_in_interval": 30, 00:05:24.794 "disable_chap": false, 00:05:24.794 "require_chap": false, 00:05:24.794 "mutual_chap": false, 00:05:24.794 "chap_group": 0, 00:05:24.794 "max_large_datain_per_connection": 64, 00:05:24.794 "max_r2t_per_connection": 4, 00:05:24.794 "pdu_pool_size": 36864, 00:05:24.794 "immediate_data_pool_size": 16384, 00:05:24.794 "data_out_pool_size": 2048 00:05:24.794 } 00:05:24.794 } 00:05:24.794 ] 00:05:24.794 } 00:05:24.794 ] 00:05:24.794 } 00:05:24.794 15:44:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@39 -- # trap - SIGINT SIGTERM EXIT 00:05:24.794 15:44:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@40 -- # killprocess 2584760 00:05:24.794 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@949 -- # '[' -z 2584760 ']' 00:05:24.795 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # kill -0 2584760 00:05:24.795 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # uname 00:05:24.795 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:05:24.795 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2584760 00:05:24.795 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:05:24.795 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:05:24.795 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2584760' 00:05:24.795 killing process with pid 2584760 00:05:24.795 15:44:30 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # kill 2584760 00:05:24.795 15:44:30 
skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # wait 2584760 00:05:25.363 15:44:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@47 -- # local spdk_pid=2584991 00:05:25.363 15:44:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@48 -- # sleep 5 00:05:25.363 15:44:30 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:30.640 15:44:35 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@50 -- # killprocess 2584991 00:05:30.640 15:44:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@949 -- # '[' -z 2584991 ']' 00:05:30.640 15:44:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@953 -- # kill -0 2584991 00:05:30.640 15:44:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # uname 00:05:30.640 15:44:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:05:30.640 15:44:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2584991 00:05:30.640 15:44:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:05:30.640 15:44:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:05:30.640 15:44:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2584991' 00:05:30.640 killing process with pid 2584991 00:05:30.640 15:44:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@968 -- # kill 2584991 00:05:30.640 15:44:35 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@973 -- # wait 2584991 00:05:30.640 15:44:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@51 -- # grep -q 'TCP Transport Init' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:30.640 15:44:36 skip_rpc.skip_rpc_with_json -- rpc/skip_rpc.sh@52 -- # rm 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/log.txt 00:05:30.640 00:05:30.640 real 0m6.421s 00:05:30.640 user 0m6.085s 00:05:30.640 sys 0m0.634s 00:05:30.640 15:44:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:30.640 15:44:36 skip_rpc.skip_rpc_with_json -- common/autotest_common.sh@10 -- # set +x 00:05:30.640 ************************************ 00:05:30.640 END TEST skip_rpc_with_json 00:05:30.640 ************************************ 00:05:30.640 15:44:36 skip_rpc -- rpc/skip_rpc.sh@75 -- # run_test skip_rpc_with_delay test_skip_rpc_with_delay 00:05:30.640 15:44:36 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:30.640 15:44:36 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:30.640 15:44:36 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:30.640 ************************************ 00:05:30.640 START TEST skip_rpc_with_delay 00:05:30.640 ************************************ 00:05:30.640 15:44:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1124 -- # test_skip_rpc_with_delay 00:05:30.640 15:44:36 skip_rpc.skip_rpc_with_delay -- rpc/skip_rpc.sh@57 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:30.640 15:44:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@649 -- # local es=0 00:05:30.640 15:44:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:30.640 15:44:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:30.640 15:44:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:05:30.640 15:44:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # type -t 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:30.640 15:44:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:05:30.640 15:44:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:30.640 15:44:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:05:30.640 15:44:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:30.640 15:44:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:30.640 15:44:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --no-rpc-server -m 0x1 --wait-for-rpc 00:05:30.899 [2024-06-10 15:44:36.155162] app.c: 832:spdk_app_start: *ERROR*: Cannot use '--wait-for-rpc' if no RPC server is going to be started. 
00:05:30.899 [2024-06-10 15:44:36.155239] app.c: 711:unclaim_cpu_cores: *ERROR*: Failed to unlink lock fd for core 0, errno: 2 00:05:30.899 15:44:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@652 -- # es=1 00:05:30.899 15:44:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:05:30.899 15:44:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:05:30.899 15:44:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:05:30.899 00:05:30.899 real 0m0.085s 00:05:30.899 user 0m0.061s 00:05:30.899 sys 0m0.023s 00:05:30.899 15:44:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:30.899 15:44:36 skip_rpc.skip_rpc_with_delay -- common/autotest_common.sh@10 -- # set +x 00:05:30.899 ************************************ 00:05:30.899 END TEST skip_rpc_with_delay 00:05:30.899 ************************************ 00:05:30.899 15:44:36 skip_rpc -- rpc/skip_rpc.sh@77 -- # uname 00:05:30.899 15:44:36 skip_rpc -- rpc/skip_rpc.sh@77 -- # '[' Linux '!=' FreeBSD ']' 00:05:30.899 15:44:36 skip_rpc -- rpc/skip_rpc.sh@78 -- # run_test exit_on_failed_rpc_init test_exit_on_failed_rpc_init 00:05:30.899 15:44:36 skip_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:30.899 15:44:36 skip_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:30.899 15:44:36 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:30.899 ************************************ 00:05:30.899 START TEST exit_on_failed_rpc_init 00:05:30.899 ************************************ 00:05:30.899 15:44:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1124 -- # test_exit_on_failed_rpc_init 00:05:30.899 15:44:36 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@62 -- # local spdk_pid=2585945 00:05:30.899 15:44:36 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@63 -- # waitforlisten 2585945 00:05:30.899 15:44:36 
skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 00:05:30.899 15:44:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@830 -- # '[' -z 2585945 ']' 00:05:30.899 15:44:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:05:30.899 15:44:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@835 -- # local max_retries=100 00:05:30.899 15:44:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:05:30.899 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:05:30.900 15:44:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@839 -- # xtrace_disable 00:05:30.900 15:44:36 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:05:30.900 [2024-06-10 15:44:36.307079] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:05:30.900 [2024-06-10 15:44:36.307133] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2585945 ] 00:05:30.900 [2024-06-10 15:44:36.395606] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:31.159 [2024-06-10 15:44:36.489971] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@863 -- # return 0 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@65 -- # trap 'killprocess $spdk_pid; exit 1' SIGINT SIGTERM EXIT 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@67 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@649 -- # local es=0 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@643 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt ]] 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x2 00:05:32.096 [2024-06-10 15:44:37.326353] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:05:32.096 [2024-06-10 15:44:37.326413] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2586175 ] 00:05:32.096 [2024-06-10 15:44:37.416431] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:32.096 [2024-06-10 15:44:37.506170] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:05:32.096 [2024-06-10 15:44:37.506251] rpc.c: 180:_spdk_rpc_listen: *ERROR*: RPC Unix domain socket path /var/tmp/spdk.sock in use. Specify another. 
00:05:32.096 [2024-06-10 15:44:37.506264] rpc.c: 166:spdk_rpc_initialize: *ERROR*: Unable to start RPC service at /var/tmp/spdk.sock 00:05:32.096 [2024-06-10 15:44:37.506273] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@652 -- # es=234 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@661 -- # es=106 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@662 -- # case "$es" in 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@669 -- # es=1 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@69 -- # trap - SIGINT SIGTERM EXIT 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- rpc/skip_rpc.sh@70 -- # killprocess 2585945 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@949 -- # '[' -z 2585945 ']' 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@953 -- # kill -0 2585945 00:05:32.096 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # uname 00:05:32.355 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:05:32.355 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2585945 00:05:32.355 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:05:32.355 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:05:32.355 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2585945' 
00:05:32.355 killing process with pid 2585945 00:05:32.355 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@968 -- # kill 2585945 00:05:32.355 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@973 -- # wait 2585945 00:05:32.614 00:05:32.614 real 0m1.740s 00:05:32.614 user 0m2.126s 00:05:32.614 sys 0m0.469s 00:05:32.614 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:32.614 15:44:37 skip_rpc.exit_on_failed_rpc_init -- common/autotest_common.sh@10 -- # set +x 00:05:32.614 ************************************ 00:05:32.614 END TEST exit_on_failed_rpc_init 00:05:32.614 ************************************ 00:05:32.614 15:44:38 skip_rpc -- rpc/skip_rpc.sh@81 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc/config.json 00:05:32.614 00:05:32.614 real 0m13.995s 00:05:32.614 user 0m13.501s 00:05:32.614 sys 0m1.680s 00:05:32.614 15:44:38 skip_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:32.614 15:44:38 skip_rpc -- common/autotest_common.sh@10 -- # set +x 00:05:32.614 ************************************ 00:05:32.614 END TEST skip_rpc 00:05:32.614 ************************************ 00:05:32.614 15:44:38 -- spdk/autotest.sh@171 -- # run_test rpc_client /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:32.614 15:44:38 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:32.614 15:44:38 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:32.614 15:44:38 -- common/autotest_common.sh@10 -- # set +x 00:05:32.614 ************************************ 00:05:32.614 START TEST rpc_client 00:05:32.614 ************************************ 00:05:32.614 15:44:38 rpc_client -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client.sh 00:05:32.874 * Looking for test storage... 
00:05:32.874 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client 00:05:32.874 15:44:38 rpc_client -- rpc_client/rpc_client.sh@10 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_client/rpc_client_test 00:05:32.874 OK 00:05:32.874 15:44:38 rpc_client -- rpc_client/rpc_client.sh@12 -- # trap - SIGINT SIGTERM EXIT 00:05:32.874 00:05:32.874 real 0m0.119s 00:05:32.874 user 0m0.060s 00:05:32.874 sys 0m0.067s 00:05:32.874 15:44:38 rpc_client -- common/autotest_common.sh@1125 -- # xtrace_disable 00:05:32.874 15:44:38 rpc_client -- common/autotest_common.sh@10 -- # set +x 00:05:32.874 ************************************ 00:05:32.874 END TEST rpc_client 00:05:32.874 ************************************ 00:05:32.874 15:44:38 -- spdk/autotest.sh@172 -- # run_test json_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:05:32.874 15:44:38 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:05:32.874 15:44:38 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:05:32.874 15:44:38 -- common/autotest_common.sh@10 -- # set +x 00:05:32.874 ************************************ 00:05:32.874 START TEST json_config 00:05:32.874 ************************************ 00:05:32.874 15:44:38 json_config -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config.sh 00:05:32.874 15:44:38 json_config -- json_config/json_config.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:05:32.874 15:44:38 json_config -- nvmf/common.sh@7 -- # uname -s 00:05:32.874 15:44:38 json_config -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:05:32.874 15:44:38 json_config -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:05:32.874 15:44:38 json_config -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:05:32.874 15:44:38 json_config -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:05:32.874 15:44:38 json_config -- nvmf/common.sh@12 
-- # NVMF_IP_PREFIX=192.168.100 00:05:32.874 15:44:38 json_config -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:05:32.874 15:44:38 json_config -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:05:32.874 15:44:38 json_config -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:05:32.874 15:44:38 json_config -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:05:32.874 15:44:38 json_config -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:05:32.874 15:44:38 json_config -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:05:32.874 15:44:38 json_config -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:05:32.874 15:44:38 json_config -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:05:32.874 15:44:38 json_config -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:05:32.874 15:44:38 json_config -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:05:32.874 15:44:38 json_config -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:05:32.874 15:44:38 json_config -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:05:32.874 15:44:38 json_config -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:05:32.874 15:44:38 json_config -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:05:32.874 15:44:38 json_config -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:05:32.874 15:44:38 json_config -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:32.875 15:44:38 
json_config -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:32.875 15:44:38 json_config -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:32.875 15:44:38 json_config -- paths/export.sh@5 -- # export PATH 00:05:32.875 15:44:38 json_config -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:05:32.875 15:44:38 json_config -- nvmf/common.sh@47 -- # : 0 00:05:32.875 15:44:38 json_config -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:05:32.875 15:44:38 json_config -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:05:32.875 15:44:38 json_config -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:05:32.875 15:44:38 json_config -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:05:32.875 15:44:38 json_config -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:05:32.875 15:44:38 json_config -- nvmf/common.sh@33 -- # 
'[' -n '' ']' 00:05:32.875 15:44:38 json_config -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:05:32.875 15:44:38 json_config -- nvmf/common.sh@51 -- # have_pci_nics=0 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@11 -- # [[ 0 -eq 1 ]] 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@15 -- # [[ 0 -ne 1 ]] 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@15 -- # [[ 0 -eq 1 ]] 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@26 -- # (( SPDK_TEST_BLOCKDEV + SPDK_TEST_ISCSI + SPDK_TEST_NVMF + SPDK_TEST_VHOST + SPDK_TEST_VHOST_INIT + SPDK_TEST_RBD == 0 )) 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@31 -- # app_pid=(['target']='' ['initiator']='') 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@31 -- # declare -A app_pid 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@32 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock' ['initiator']='/var/tmp/spdk_initiator.sock') 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@32 -- # declare -A app_socket 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@33 -- # app_params=(['target']='-m 0x1 -s 1024' ['initiator']='-m 0x2 -g -u -s 1024') 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@33 -- # declare -A app_params 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@34 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json' ['initiator']='/var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json') 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@34 -- # declare -A configs_path 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@40 -- # last_event_id=0 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@355 -- # 
trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@356 -- # echo 'INFO: JSON configuration test init' 00:05:32.875 INFO: JSON configuration test init 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@357 -- # json_config_test_init 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@262 -- # timing_enter json_config_test_init 00:05:32.875 15:44:38 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:05:32.875 15:44:38 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@263 -- # timing_enter json_config_setup_target 00:05:32.875 15:44:38 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:05:32.875 15:44:38 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:32.875 15:44:38 json_config -- json_config/json_config.sh@265 -- # json_config_test_start_app target --wait-for-rpc 00:05:32.875 15:44:38 json_config -- json_config/common.sh@9 -- # local app=target 00:05:32.875 15:44:38 json_config -- json_config/common.sh@10 -- # shift 00:05:32.875 15:44:38 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:32.875 15:44:38 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:32.875 15:44:38 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:05:32.875 15:44:38 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:32.875 15:44:38 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:32.875 15:44:38 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=2586510 00:05:32.875 15:44:38 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:32.875 Waiting for target to run... 
00:05:32.875 15:44:38 json_config -- json_config/common.sh@25 -- # waitforlisten 2586510 /var/tmp/spdk_tgt.sock 00:05:32.875 15:44:38 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --wait-for-rpc 00:05:32.875 15:44:38 json_config -- common/autotest_common.sh@830 -- # '[' -z 2586510 ']' 00:05:32.875 15:44:38 json_config -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:32.875 15:44:38 json_config -- common/autotest_common.sh@835 -- # local max_retries=100 00:05:32.875 15:44:38 json_config -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:05:32.875 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:32.875 15:44:38 json_config -- common/autotest_common.sh@839 -- # xtrace_disable 00:05:32.875 15:44:38 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:33.134 [2024-06-10 15:44:38.445226] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:05:33.134 [2024-06-10 15:44:38.445282] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2586510 ] 00:05:33.702 [2024-06-10 15:44:38.939619] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:33.702 [2024-06-10 15:44:39.047031] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:05:33.961 15:44:39 json_config -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:05:33.961 15:44:39 json_config -- common/autotest_common.sh@863 -- # return 0 00:05:33.961 15:44:39 json_config -- json_config/common.sh@26 -- # echo '' 00:05:33.961 00:05:33.961 15:44:39 json_config -- json_config/json_config.sh@269 -- # create_accel_config 00:05:33.961 15:44:39 json_config -- json_config/json_config.sh@93 -- # timing_enter create_accel_config 00:05:33.961 15:44:39 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:05:33.961 15:44:39 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:33.961 15:44:39 json_config -- json_config/json_config.sh@95 -- # [[ 1 -eq 1 ]] 00:05:33.961 15:44:39 json_config -- json_config/json_config.sh@96 -- # tgt_rpc dpdk_cryptodev_scan_accel_module 00:05:33.961 15:44:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock dpdk_cryptodev_scan_accel_module 00:05:34.220 15:44:39 json_config -- json_config/json_config.sh@97 -- # tgt_rpc accel_assign_opc -o encrypt -m dpdk_cryptodev 00:05:34.220 15:44:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o encrypt -m dpdk_cryptodev 00:05:34.479 [2024-06-10 15:44:39.805364] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:05:34.479 15:44:39 
json_config -- json_config/json_config.sh@98 -- # tgt_rpc accel_assign_opc -o decrypt -m dpdk_cryptodev 00:05:34.479 15:44:39 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock accel_assign_opc -o decrypt -m dpdk_cryptodev 00:05:34.738 [2024-06-10 15:44:40.066046] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:05:34.738 15:44:40 json_config -- json_config/json_config.sh@101 -- # timing_exit create_accel_config 00:05:34.738 15:44:40 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:05:34.738 15:44:40 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:34.738 15:44:40 json_config -- json_config/json_config.sh@273 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --json-with-subsystems 00:05:34.738 15:44:40 json_config -- json_config/json_config.sh@274 -- # tgt_rpc load_config 00:05:34.738 15:44:40 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock load_config 00:05:35.011 [2024-06-10 15:44:40.379897] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:05:40.278 15:44:45 json_config -- json_config/json_config.sh@276 -- # tgt_check_notification_types 00:05:40.278 15:44:45 json_config -- json_config/json_config.sh@43 -- # timing_enter tgt_check_notification_types 00:05:40.278 15:44:45 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:05:40.278 15:44:45 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:40.278 15:44:45 json_config -- json_config/json_config.sh@45 -- # local ret=0 00:05:40.278 15:44:45 json_config -- json_config/json_config.sh@46 -- # enabled_types=('bdev_register' 'bdev_unregister') 00:05:40.278 15:44:45 json_config -- json_config/json_config.sh@46 -- # local enabled_types 00:05:40.279 15:44:45 json_config -- 
json_config/json_config.sh@48 -- # tgt_rpc notify_get_types 00:05:40.279 15:44:45 json_config -- json_config/json_config.sh@48 -- # jq -r '.[]' 00:05:40.279 15:44:45 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_types 00:05:40.279 15:44:45 json_config -- json_config/json_config.sh@48 -- # get_types=('bdev_register' 'bdev_unregister') 00:05:40.279 15:44:45 json_config -- json_config/json_config.sh@48 -- # local get_types 00:05:40.279 15:44:45 json_config -- json_config/json_config.sh@49 -- # [[ bdev_register bdev_unregister != \b\d\e\v\_\r\e\g\i\s\t\e\r\ \b\d\e\v\_\u\n\r\e\g\i\s\t\e\r ]] 00:05:40.279 15:44:45 json_config -- json_config/json_config.sh@54 -- # timing_exit tgt_check_notification_types 00:05:40.279 15:44:45 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:05:40.279 15:44:45 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:40.279 15:44:45 json_config -- json_config/json_config.sh@55 -- # return 0 00:05:40.279 15:44:45 json_config -- json_config/json_config.sh@278 -- # [[ 1 -eq 1 ]] 00:05:40.279 15:44:45 json_config -- json_config/json_config.sh@279 -- # create_bdev_subsystem_config 00:05:40.279 15:44:45 json_config -- json_config/json_config.sh@105 -- # timing_enter create_bdev_subsystem_config 00:05:40.279 15:44:45 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:05:40.279 15:44:45 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:40.279 15:44:45 json_config -- json_config/json_config.sh@107 -- # expected_notifications=() 00:05:40.279 15:44:45 json_config -- json_config/json_config.sh@107 -- # local expected_notifications 00:05:40.279 15:44:45 json_config -- json_config/json_config.sh@111 -- # expected_notifications+=($(get_notifications)) 00:05:40.279 15:44:45 json_config -- json_config/json_config.sh@111 -- # get_notifications 00:05:40.279 15:44:45 json_config -- json_config/json_config.sh@59 
-- # local ev_type ev_ctx event_id 00:05:40.279 15:44:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:40.279 15:44:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:40.279 15:44:45 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:05:40.279 15:44:45 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:05:40.279 15:44:45 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:05:40.537 15:44:45 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:05:40.537 15:44:45 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:40.537 15:44:45 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:40.537 15:44:45 json_config -- json_config/json_config.sh@113 -- # [[ 1 -eq 1 ]] 00:05:40.537 15:44:45 json_config -- json_config/json_config.sh@114 -- # local lvol_store_base_bdev=Nvme0n1 00:05:40.537 15:44:45 json_config -- json_config/json_config.sh@116 -- # tgt_rpc bdev_split_create Nvme0n1 2 00:05:40.537 15:44:45 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Nvme0n1 2 00:05:40.796 Nvme0n1p0 Nvme0n1p1 00:05:40.796 15:44:46 json_config -- json_config/json_config.sh@117 -- # tgt_rpc bdev_split_create Malloc0 3 00:05:40.796 15:44:46 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_split_create Malloc0 3 00:05:41.055 [2024-06-10 15:44:46.402171] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:41.055 [2024-06-10 15:44:46.402226] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:41.055 
00:05:41.055 15:44:46 json_config -- json_config/json_config.sh@118 -- # tgt_rpc bdev_malloc_create 8 4096 --name Malloc3 00:05:41.055 15:44:46 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 4096 --name Malloc3 00:05:41.313 Malloc3 00:05:41.313 15:44:46 json_config -- json_config/json_config.sh@119 -- # tgt_rpc bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:05:41.313 15:44:46 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_passthru_create -b Malloc3 -p PTBdevFromMalloc3 00:05:41.572 [2024-06-10 15:44:46.903621] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:05:41.572 [2024-06-10 15:44:46.903668] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:41.572 [2024-06-10 15:44:46.903686] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bbcaf0 00:05:41.572 [2024-06-10 15:44:46.903696] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:41.572 [2024-06-10 15:44:46.905287] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:41.572 [2024-06-10 15:44:46.905315] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:05:41.572 PTBdevFromMalloc3 00:05:41.572 15:44:46 json_config -- json_config/json_config.sh@121 -- # tgt_rpc bdev_null_create Null0 32 512 00:05:41.572 15:44:46 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_null_create Null0 32 512 00:05:41.832 Null0 00:05:41.832 15:44:47 json_config -- json_config/json_config.sh@123 -- # tgt_rpc bdev_malloc_create 32 512 --name Malloc0 00:05:41.832 15:44:47 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 32 512 --name Malloc0 00:05:42.091 Malloc0 00:05:42.091 15:44:47 json_config -- json_config/json_config.sh@124 -- # tgt_rpc bdev_malloc_create 16 4096 --name Malloc1 00:05:42.091 15:44:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 16 4096 --name Malloc1 00:05:42.349 Malloc1 00:05:42.349 15:44:47 json_config -- json_config/json_config.sh@137 -- # expected_notifications+=(bdev_register:${lvol_store_base_bdev}p1 bdev_register:${lvol_store_base_bdev}p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1) 00:05:42.349 15:44:47 json_config -- json_config/json_config.sh@140 -- # dd if=/dev/zero of=/sample_aio bs=1024 count=102400 00:05:42.607 102400+0 records in 00:05:42.607 102400+0 records out 00:05:42.607 104857600 bytes (105 MB, 100 MiB) copied, 0.16011 s, 655 MB/s 00:05:42.607 15:44:47 json_config -- json_config/json_config.sh@141 -- # tgt_rpc bdev_aio_create /sample_aio aio_disk 1024 00:05:42.607 15:44:47 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_aio_create /sample_aio aio_disk 1024 00:05:42.607 aio_disk 00:05:42.864 15:44:48 json_config -- json_config/json_config.sh@142 -- # expected_notifications+=(bdev_register:aio_disk) 00:05:42.864 15:44:48 json_config -- json_config/json_config.sh@147 -- # tgt_rpc bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:05:42.864 15:44:48 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create_lvstore -c 1048576 Nvme0n1p0 lvs_test 00:05:45.453 dff6d7de-6e2e-42f5-8d69-1f1a823afb8c 
00:05:45.453 15:44:50 json_config -- json_config/json_config.sh@154 -- # expected_notifications+=("bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test lvol0 32)" "bdev_register:$(tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32)" "bdev_register:$(tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0)" "bdev_register:$(tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0)") 00:05:45.453 15:44:50 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test lvol0 32 00:05:45.453 15:44:50 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test lvol0 32 00:05:45.453 15:44:50 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_create -l lvs_test -t lvol1 32 00:05:45.453 15:44:50 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_create -l lvs_test -t lvol1 32 00:05:45.711 15:44:51 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:05:45.711 15:44:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_snapshot lvs_test/lvol0 snapshot0 00:05:45.970 15:44:51 json_config -- json_config/json_config.sh@154 -- # tgt_rpc bdev_lvol_clone lvs_test/snapshot0 clone0 00:05:45.970 15:44:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_clone lvs_test/snapshot0 clone0 00:05:46.228 15:44:51 json_config -- json_config/json_config.sh@157 -- # [[ 1 -eq 1 ]] 00:05:46.228 15:44:51 json_config -- json_config/json_config.sh@158 -- # tgt_rpc bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:05:46.228 15:44:51 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 1024 --name MallocForCryptoBdev 00:05:46.486 MallocForCryptoBdev 00:05:46.486 15:44:51 json_config -- json_config/json_config.sh@159 -- # lspci -d:37c8 00:05:46.486 15:44:51 json_config -- json_config/json_config.sh@159 -- # wc -l 00:05:46.486 15:44:51 json_config -- json_config/json_config.sh@159 -- # [[ 3 -eq 0 ]] 00:05:46.486 15:44:51 json_config -- json_config/json_config.sh@162 -- # local crypto_driver=crypto_qat 00:05:46.486 15:44:51 json_config -- json_config/json_config.sh@165 -- # tgt_rpc bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:05:46.486 15:44:51 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_crypto_create MallocForCryptoBdev CryptoMallocBdev -p crypto_qat -k 01234567891234560123456789123456 00:05:46.745 [2024-06-10 15:44:52.060792] vbdev_crypto_rpc.c: 136:rpc_bdev_crypto_create: *WARNING*: "crypto_pmd" parameters is obsolete and ignored 00:05:46.745 CryptoMallocBdev 00:05:46.745 15:44:52 json_config -- json_config/json_config.sh@169 -- # expected_notifications+=(bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev) 00:05:46.745 15:44:52 json_config -- json_config/json_config.sh@172 -- # [[ 0 -eq 1 ]] 00:05:46.746 15:44:52 json_config -- json_config/json_config.sh@178 -- # tgt_check_notifications bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:bad7b22e-8b41-43f7-be33-218a3236cae3 bdev_register:075e815c-a789-45ac-8665-844148696ea5 bdev_register:5e138435-3d50-44dd-8355-c000c3b3b3a6 bdev_register:7ff28834-8b33-47ae-80ef-2902b90ba215 
bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:05:46.746 15:44:52 json_config -- json_config/json_config.sh@67 -- # local events_to_check 00:05:46.746 15:44:52 json_config -- json_config/json_config.sh@68 -- # local recorded_events 00:05:46.746 15:44:52 json_config -- json_config/json_config.sh@71 -- # events_to_check=($(printf '%s\n' "$@" | sort)) 00:05:46.746 15:44:52 json_config -- json_config/json_config.sh@71 -- # printf '%s\n' bdev_register:Nvme0n1 bdev_register:Nvme0n1p1 bdev_register:Nvme0n1p0 bdev_register:Malloc3 bdev_register:PTBdevFromMalloc3 bdev_register:Null0 bdev_register:Malloc0 bdev_register:Malloc0p2 bdev_register:Malloc0p1 bdev_register:Malloc0p0 bdev_register:Malloc1 bdev_register:aio_disk bdev_register:bad7b22e-8b41-43f7-be33-218a3236cae3 bdev_register:075e815c-a789-45ac-8665-844148696ea5 bdev_register:5e138435-3d50-44dd-8355-c000c3b3b3a6 bdev_register:7ff28834-8b33-47ae-80ef-2902b90ba215 bdev_register:MallocForCryptoBdev bdev_register:CryptoMallocBdev 00:05:46.746 15:44:52 json_config -- json_config/json_config.sh@71 -- # sort 00:05:46.746 15:44:52 json_config -- json_config/json_config.sh@72 -- # recorded_events=($(get_notifications | sort)) 00:05:46.746 15:44:52 json_config -- json_config/json_config.sh@72 -- # get_notifications 00:05:46.746 15:44:52 json_config -- json_config/json_config.sh@72 -- # sort 00:05:46.746 15:44:52 json_config -- json_config/json_config.sh@59 -- # local ev_type ev_ctx event_id 00:05:46.746 15:44:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:46.746 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:46.746 15:44:52 json_config -- json_config/json_config.sh@58 -- # tgt_rpc notify_get_notifications -i 0 00:05:46.746 15:44:52 json_config -- json_config/json_config.sh@58 -- # jq -r '.[] | "\(.type):\(.ctx):\(.id)"' 00:05:46.746 15:44:52 json_config -- json_config/common.sh@57 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock notify_get_notifications -i 0 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p1 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Nvme0n1p0 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc3 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:PTBdevFromMalloc3 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Null0 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 
00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p2 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p1 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc0p0 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:Malloc1 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:aio_disk 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:bad7b22e-8b41-43f7-be33-218a3236cae3 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:075e815c-a789-45ac-8665-844148696ea5 00:05:47.005 15:44:52 json_config -- 
json_config/json_config.sh@61 -- # IFS=: 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:5e138435-3d50-44dd-8355-c000c3b3b3a6 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:7ff28834-8b33-47ae-80ef-2902b90ba215 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:MallocForCryptoBdev 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@62 -- # echo bdev_register:CryptoMallocBdev 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # IFS=: 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@61 -- # read -r ev_type ev_ctx event_id 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@74 -- # [[ bdev_register:075e815c-a789-45ac-8665-844148696ea5 bdev_register:5e138435-3d50-44dd-8355-c000c3b3b3a6 bdev_register:7ff28834-8b33-47ae-80ef-2902b90ba215 bdev_register:aio_disk bdev_register:bad7b22e-8b41-43f7-be33-218a3236cae3 bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 != 
\b\d\e\v\_\r\e\g\i\s\t\e\r\:\0\7\5\e\8\1\5\c\-\a\7\8\9\-\4\5\a\c\-\8\6\6\5\-\8\4\4\1\4\8\6\9\6\e\a\5\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\5\e\1\3\8\4\3\5\-\3\d\5\0\-\4\4\d\d\-\8\3\5\5\-\c\0\0\0\c\3\b\3\b\3\a\6\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\7\f\f\2\8\8\3\4\-\8\b\3\3\-\4\7\a\e\-\8\0\e\f\-\2\9\0\2\b\9\0\b\a\2\1\5\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\a\i\o\_\d\i\s\k\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\b\a\d\7\b\2\2\e\-\8\b\4\1\-\4\3\f\7\-\b\e\3\3\-\2\1\8\a\3\2\3\6\c\a\e\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\C\r\y\p\t\o\M\a\l\l\o\c\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\0\p\2\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\3\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\M\a\l\l\o\c\F\o\r\C\r\y\p\t\o\B\d\e\v\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\u\l\l\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\0\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\N\v\m\e\0\n\1\p\1\ \b\d\e\v\_\r\e\g\i\s\t\e\r\:\P\T\B\d\e\v\F\r\o\m\M\a\l\l\o\c\3 ]] 00:05:47.005 15:44:52 json_config -- json_config/json_config.sh@86 -- # cat 00:05:47.006 15:44:52 json_config -- json_config/json_config.sh@86 -- # printf ' %s\n' bdev_register:075e815c-a789-45ac-8665-844148696ea5 bdev_register:5e138435-3d50-44dd-8355-c000c3b3b3a6 bdev_register:7ff28834-8b33-47ae-80ef-2902b90ba215 bdev_register:aio_disk bdev_register:bad7b22e-8b41-43f7-be33-218a3236cae3 bdev_register:CryptoMallocBdev bdev_register:Malloc0 bdev_register:Malloc0p0 bdev_register:Malloc0p1 bdev_register:Malloc0p2 bdev_register:Malloc1 bdev_register:Malloc3 bdev_register:MallocForCryptoBdev bdev_register:Null0 bdev_register:Nvme0n1 bdev_register:Nvme0n1p0 bdev_register:Nvme0n1p1 bdev_register:PTBdevFromMalloc3 00:05:47.006 Expected events matched: 00:05:47.006 bdev_register:075e815c-a789-45ac-8665-844148696ea5 00:05:47.006 bdev_register:5e138435-3d50-44dd-8355-c000c3b3b3a6 00:05:47.006 
bdev_register:7ff28834-8b33-47ae-80ef-2902b90ba215 00:05:47.006 bdev_register:aio_disk 00:05:47.006 bdev_register:bad7b22e-8b41-43f7-be33-218a3236cae3 00:05:47.006 bdev_register:CryptoMallocBdev 00:05:47.006 bdev_register:Malloc0 00:05:47.006 bdev_register:Malloc0p0 00:05:47.006 bdev_register:Malloc0p1 00:05:47.006 bdev_register:Malloc0p2 00:05:47.006 bdev_register:Malloc1 00:05:47.006 bdev_register:Malloc3 00:05:47.006 bdev_register:MallocForCryptoBdev 00:05:47.006 bdev_register:Null0 00:05:47.006 bdev_register:Nvme0n1 00:05:47.006 bdev_register:Nvme0n1p0 00:05:47.006 bdev_register:Nvme0n1p1 00:05:47.006 bdev_register:PTBdevFromMalloc3 00:05:47.006 15:44:52 json_config -- json_config/json_config.sh@180 -- # timing_exit create_bdev_subsystem_config 00:05:47.006 15:44:52 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:05:47.006 15:44:52 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:47.006 15:44:52 json_config -- json_config/json_config.sh@282 -- # [[ 0 -eq 1 ]] 00:05:47.006 15:44:52 json_config -- json_config/json_config.sh@286 -- # [[ 0 -eq 1 ]] 00:05:47.006 15:44:52 json_config -- json_config/json_config.sh@290 -- # [[ 0 -eq 1 ]] 00:05:47.006 15:44:52 json_config -- json_config/json_config.sh@293 -- # timing_exit json_config_setup_target 00:05:47.006 15:44:52 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:05:47.006 15:44:52 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:47.006 15:44:52 json_config -- json_config/json_config.sh@295 -- # [[ 0 -eq 1 ]] 00:05:47.006 15:44:52 json_config -- json_config/json_config.sh@300 -- # tgt_rpc bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:05:47.006 15:44:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_create 8 512 --name MallocBdevForConfigChangeCheck 00:05:47.265 MallocBdevForConfigChangeCheck 00:05:47.265 15:44:52 json_config -- 
json_config/json_config.sh@302 -- # timing_exit json_config_test_init 00:05:47.265 15:44:52 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:05:47.265 15:44:52 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:47.265 15:44:52 json_config -- json_config/json_config.sh@359 -- # tgt_rpc save_config 00:05:47.265 15:44:52 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:05:47.832 15:44:53 json_config -- json_config/json_config.sh@361 -- # echo 'INFO: shutting down applications...' 00:05:47.832 INFO: shutting down applications... 00:05:47.832 15:44:53 json_config -- json_config/json_config.sh@362 -- # [[ 0 -eq 1 ]] 00:05:47.832 15:44:53 json_config -- json_config/json_config.sh@368 -- # json_config_clear target 00:05:47.832 15:44:53 json_config -- json_config/json_config.sh@332 -- # [[ -n 22 ]] 00:05:47.832 15:44:53 json_config -- json_config/json_config.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py -s /var/tmp/spdk_tgt.sock clear_config 00:05:47.832 [2024-06-10 15:44:53.308778] vbdev_lvol.c: 150:vbdev_lvs_hotremove_cb: *NOTICE*: bdev Nvme0n1p0 being removed: closing lvstore lvs_test 00:05:49.736 Calling clear_iscsi_subsystem 00:05:49.736 Calling clear_nvmf_subsystem 00:05:49.736 Calling clear_nbd_subsystem 00:05:49.736 Calling clear_ublk_subsystem 00:05:49.736 Calling clear_vhost_blk_subsystem 00:05:49.736 Calling clear_vhost_scsi_subsystem 00:05:49.736 Calling clear_bdev_subsystem 00:05:49.736 15:44:54 json_config -- json_config/json_config.sh@337 -- # local config_filter=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py 00:05:49.736 15:44:54 json_config -- json_config/json_config.sh@343 -- # count=100 00:05:49.736 15:44:54 json_config -- json_config/json_config.sh@344 -- # '[' 100 -gt 0 ']' 00:05:49.736 15:44:54 json_config -- json_config/json_config.sh@345 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:05:49.736 15:44:54 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method delete_global_parameters 00:05:49.736 15:44:54 json_config -- json_config/json_config.sh@345 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method check_empty 00:05:49.736 15:44:55 json_config -- json_config/json_config.sh@345 -- # break 00:05:49.736 15:44:55 json_config -- json_config/json_config.sh@350 -- # '[' 100 -eq 0 ']' 00:05:49.736 15:44:55 json_config -- json_config/json_config.sh@369 -- # json_config_test_shutdown_app target 00:05:49.736 15:44:55 json_config -- json_config/common.sh@31 -- # local app=target 00:05:49.736 15:44:55 json_config -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:05:49.736 15:44:55 json_config -- json_config/common.sh@35 -- # [[ -n 2586510 ]] 00:05:49.736 15:44:55 json_config -- json_config/common.sh@38 -- # kill -SIGINT 2586510 00:05:49.736 15:44:55 json_config -- json_config/common.sh@40 -- # (( i = 0 )) 00:05:49.736 15:44:55 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:49.736 15:44:55 json_config -- json_config/common.sh@41 -- # kill -0 2586510 00:05:49.736 15:44:55 json_config -- json_config/common.sh@45 -- # sleep 0.5 00:05:50.305 15:44:55 json_config -- json_config/common.sh@40 -- # (( i++ )) 00:05:50.305 15:44:55 json_config -- json_config/common.sh@40 -- # (( i < 30 )) 00:05:50.305 15:44:55 json_config -- json_config/common.sh@41 -- # kill -0 2586510 00:05:50.305 15:44:55 json_config -- json_config/common.sh@42 -- # app_pid["$app"]= 00:05:50.305 15:44:55 json_config -- json_config/common.sh@43 -- # break 00:05:50.305 15:44:55 json_config -- json_config/common.sh@48 -- # [[ -n '' ]] 00:05:50.305 15:44:55 json_config -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:05:50.305 SPDK target 
shutdown done 00:05:50.305 15:44:55 json_config -- json_config/json_config.sh@371 -- # echo 'INFO: relaunching applications...' 00:05:50.305 INFO: relaunching applications... 00:05:50.305 15:44:55 json_config -- json_config/json_config.sh@372 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:50.305 15:44:55 json_config -- json_config/common.sh@9 -- # local app=target 00:05:50.305 15:44:55 json_config -- json_config/common.sh@10 -- # shift 00:05:50.305 15:44:55 json_config -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:05:50.305 15:44:55 json_config -- json_config/common.sh@13 -- # [[ -z '' ]] 00:05:50.305 15:44:55 json_config -- json_config/common.sh@15 -- # local app_extra_params= 00:05:50.305 15:44:55 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:50.305 15:44:55 json_config -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:05:50.305 15:44:55 json_config -- json_config/common.sh@22 -- # app_pid["$app"]=2589399 00:05:50.305 15:44:55 json_config -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:05:50.305 Waiting for target to run... 00:05:50.305 15:44:55 json_config -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:50.305 15:44:55 json_config -- json_config/common.sh@25 -- # waitforlisten 2589399 /var/tmp/spdk_tgt.sock 00:05:50.305 15:44:55 json_config -- common/autotest_common.sh@830 -- # '[' -z 2589399 ']' 00:05:50.305 15:44:55 json_config -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:05:50.305 15:44:55 json_config -- common/autotest_common.sh@835 -- # local max_retries=100 00:05:50.305 15:44:55 json_config -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 
00:05:50.305 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:05:50.305 15:44:55 json_config -- common/autotest_common.sh@839 -- # xtrace_disable 00:05:50.305 15:44:55 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:50.305 [2024-06-10 15:44:55.768532] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:05:50.305 [2024-06-10 15:44:55.768596] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2589399 ] 00:05:50.873 [2024-06-10 15:44:56.105245] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:05:50.873 [2024-06-10 15:44:56.190048] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:05:50.873 [2024-06-10 15:44:56.244323] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:05:50.873 [2024-06-10 15:44:56.252365] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:05:50.873 [2024-06-10 15:44:56.260379] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:05:50.873 [2024-06-10 15:44:56.341915] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:05:53.409 [2024-06-10 15:44:58.499543] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:05:53.409 [2024-06-10 15:44:58.499601] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:05:53.409 [2024-06-10 15:44:58.499613] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:05:53.409 [2024-06-10 15:44:58.507561] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:05:53.409 
[2024-06-10 15:44:58.507587] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Nvme0n1 00:05:53.409 [2024-06-10 15:44:58.515578] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:53.409 [2024-06-10 15:44:58.515601] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:05:53.409 [2024-06-10 15:44:58.523612] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "CryptoMallocBdev_AES_CBC" 00:05:53.409 [2024-06-10 15:44:58.523636] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: MallocForCryptoBdev 00:05:53.409 [2024-06-10 15:44:58.523645] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:05:55.944 [2024-06-10 15:45:01.395575] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:05:55.944 [2024-06-10 15:45:01.395619] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:05:55.944 [2024-06-10 15:45:01.395633] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x224add0 00:05:55.944 [2024-06-10 15:45:01.395643] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:05:55.944 [2024-06-10 15:45:01.395942] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:05:55.944 [2024-06-10 15:45:01.395968] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: PTBdevFromMalloc3 00:05:56.512 15:45:01 json_config -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:05:56.512 15:45:01 json_config -- common/autotest_common.sh@863 -- # return 0 00:05:56.512 15:45:01 json_config -- json_config/common.sh@26 -- # echo '' 00:05:56.512 00:05:56.512 15:45:01 json_config -- json_config/json_config.sh@373 -- # [[ 0 -eq 1 ]] 00:05:56.512 15:45:01 json_config -- json_config/json_config.sh@377 -- # echo 'INFO: Checking if target configuration is the same...' 
00:05:56.512 INFO: Checking if target configuration is the same... 00:05:56.512 15:45:01 json_config -- json_config/json_config.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:56.512 15:45:01 json_config -- json_config/json_config.sh@378 -- # tgt_rpc save_config 00:05:56.512 15:45:01 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:05:56.512 + '[' 2 -ne 2 ']' 00:05:56.512 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:05:56.512 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 00:05:56.512 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:05:56.512 +++ basename /dev/fd/62 00:05:56.512 ++ mktemp /tmp/62.XXX 00:05:56.512 + tmp_file_1=/tmp/62.riw 00:05:56.512 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:56.769 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:05:56.769 + tmp_file_2=/tmp/spdk_tgt_config.json.Omn 00:05:56.769 + ret=0 00:05:56.769 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:05:57.027 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:05:57.027 + diff -u /tmp/62.riw /tmp/spdk_tgt_config.json.Omn 00:05:57.027 + echo 'INFO: JSON config files are the same' 00:05:57.027 INFO: JSON config files are the same 00:05:57.027 + rm /tmp/62.riw /tmp/spdk_tgt_config.json.Omn 00:05:57.027 + exit 0 00:05:57.027 15:45:02 json_config -- json_config/json_config.sh@379 -- # [[ 0 -eq 1 ]] 00:05:57.027 15:45:02 json_config -- json_config/json_config.sh@384 -- # echo 'INFO: changing configuration and checking if this can be detected...' 00:05:57.027 INFO: changing configuration and checking if this can be detected... 
00:05:57.027 15:45:02 json_config -- json_config/json_config.sh@386 -- # tgt_rpc bdev_malloc_delete MallocBdevForConfigChangeCheck 00:05:57.027 15:45:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_malloc_delete MallocBdevForConfigChangeCheck 00:05:57.286 15:45:02 json_config -- json_config/json_config.sh@387 -- # tgt_rpc save_config 00:05:57.286 15:45:02 json_config -- json_config/json_config.sh@387 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh /dev/fd/62 /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:57.286 15:45:02 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock save_config 00:05:57.286 + '[' 2 -ne 2 ']' 00:05:57.286 +++ dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_diff.sh 00:05:57.286 ++ readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/../.. 
00:05:57.286 + rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:05:57.286 +++ basename /dev/fd/62 00:05:57.286 ++ mktemp /tmp/62.XXX 00:05:57.286 + tmp_file_1=/tmp/62.flE 00:05:57.286 +++ basename /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:05:57.286 ++ mktemp /tmp/spdk_tgt_config.json.XXX 00:05:57.286 + tmp_file_2=/tmp/spdk_tgt_config.json.mPY 00:05:57.286 + ret=0 00:05:57.286 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:05:57.855 + /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/config_filter.py -method sort 00:05:57.855 + diff -u /tmp/62.flE /tmp/spdk_tgt_config.json.mPY 00:05:57.855 + ret=1 00:05:57.855 + echo '=== Start of file: /tmp/62.flE ===' 00:05:57.855 + cat /tmp/62.flE 00:05:57.855 + echo '=== End of file: /tmp/62.flE ===' 00:05:57.855 + echo '' 00:05:57.855 + echo '=== Start of file: /tmp/spdk_tgt_config.json.mPY ===' 00:05:57.855 + cat /tmp/spdk_tgt_config.json.mPY 00:05:57.855 + echo '=== End of file: /tmp/spdk_tgt_config.json.mPY ===' 00:05:57.855 + echo '' 00:05:57.855 + rm /tmp/62.flE /tmp/spdk_tgt_config.json.mPY 00:05:57.855 + exit 1 00:05:57.855 15:45:03 json_config -- json_config/json_config.sh@391 -- # echo 'INFO: configuration change detected.' 00:05:57.855 INFO: configuration change detected. 
00:05:57.855 15:45:03 json_config -- json_config/json_config.sh@394 -- # json_config_test_fini 00:05:57.855 15:45:03 json_config -- json_config/json_config.sh@306 -- # timing_enter json_config_test_fini 00:05:57.855 15:45:03 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:05:57.855 15:45:03 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:57.855 15:45:03 json_config -- json_config/json_config.sh@307 -- # local ret=0 00:05:57.855 15:45:03 json_config -- json_config/json_config.sh@309 -- # [[ -n '' ]] 00:05:57.855 15:45:03 json_config -- json_config/json_config.sh@317 -- # [[ -n 2589399 ]] 00:05:57.855 15:45:03 json_config -- json_config/json_config.sh@320 -- # cleanup_bdev_subsystem_config 00:05:57.855 15:45:03 json_config -- json_config/json_config.sh@184 -- # timing_enter cleanup_bdev_subsystem_config 00:05:57.855 15:45:03 json_config -- common/autotest_common.sh@723 -- # xtrace_disable 00:05:57.855 15:45:03 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:57.855 15:45:03 json_config -- json_config/json_config.sh@186 -- # [[ 1 -eq 1 ]] 00:05:57.855 15:45:03 json_config -- json_config/json_config.sh@187 -- # tgt_rpc bdev_lvol_delete lvs_test/clone0 00:05:57.855 15:45:03 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/clone0 00:05:58.114 15:45:03 json_config -- json_config/json_config.sh@188 -- # tgt_rpc bdev_lvol_delete lvs_test/lvol0 00:05:58.114 15:45:03 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete lvs_test/lvol0 00:05:58.373 15:45:03 json_config -- json_config/json_config.sh@189 -- # tgt_rpc bdev_lvol_delete lvs_test/snapshot0 00:05:58.373 15:45:03 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete 
lvs_test/snapshot0 00:05:58.632 15:45:03 json_config -- json_config/json_config.sh@190 -- # tgt_rpc bdev_lvol_delete_lvstore -l lvs_test 00:05:58.632 15:45:03 json_config -- json_config/common.sh@57 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk_tgt.sock bdev_lvol_delete_lvstore -l lvs_test 00:05:58.891 15:45:04 json_config -- json_config/json_config.sh@193 -- # uname -s 00:05:58.891 15:45:04 json_config -- json_config/json_config.sh@193 -- # [[ Linux = Linux ]] 00:05:58.891 15:45:04 json_config -- json_config/json_config.sh@194 -- # rm -f /sample_aio 00:05:58.891 15:45:04 json_config -- json_config/json_config.sh@197 -- # [[ 0 -eq 1 ]] 00:05:58.891 15:45:04 json_config -- json_config/json_config.sh@201 -- # timing_exit cleanup_bdev_subsystem_config 00:05:58.891 15:45:04 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:05:58.891 15:45:04 json_config -- common/autotest_common.sh@10 -- # set +x 00:05:58.891 15:45:04 json_config -- json_config/json_config.sh@323 -- # killprocess 2589399 00:05:58.891 15:45:04 json_config -- common/autotest_common.sh@949 -- # '[' -z 2589399 ']' 00:05:58.891 15:45:04 json_config -- common/autotest_common.sh@953 -- # kill -0 2589399 00:05:58.891 15:45:04 json_config -- common/autotest_common.sh@954 -- # uname 00:05:58.891 15:45:04 json_config -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:05:58.891 15:45:04 json_config -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2589399 00:05:58.891 15:45:04 json_config -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:05:58.891 15:45:04 json_config -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:05:58.891 15:45:04 json_config -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2589399' 00:05:58.891 killing process with pid 2589399 00:05:58.891 15:45:04 json_config -- common/autotest_common.sh@968 -- # kill 2589399 00:05:58.891 15:45:04 json_config -- 
common/autotest_common.sh@973 -- # wait 2589399 00:06:00.867 15:45:05 json_config -- json_config/json_config.sh@326 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_initiator_config.json /var/jenkins/workspace/crypto-phy-autotest/spdk/spdk_tgt_config.json 00:06:00.867 15:45:05 json_config -- json_config/json_config.sh@327 -- # timing_exit json_config_test_fini 00:06:00.867 15:45:05 json_config -- common/autotest_common.sh@729 -- # xtrace_disable 00:06:00.867 15:45:05 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:00.867 15:45:05 json_config -- json_config/json_config.sh@328 -- # return 0 00:06:00.867 15:45:05 json_config -- json_config/json_config.sh@396 -- # echo 'INFO: Success' 00:06:00.867 INFO: Success 00:06:00.867 00:06:00.867 real 0m27.687s 00:06:00.867 user 0m33.712s 00:06:00.867 sys 0m3.107s 00:06:00.867 15:45:05 json_config -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:00.868 15:45:05 json_config -- common/autotest_common.sh@10 -- # set +x 00:06:00.868 ************************************ 00:06:00.868 END TEST json_config 00:06:00.868 ************************************ 00:06:00.868 15:45:05 -- spdk/autotest.sh@173 -- # run_test json_config_extra_key /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:00.868 15:45:05 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:00.868 15:45:05 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:00.868 15:45:05 -- common/autotest_common.sh@10 -- # set +x 00:06:00.868 ************************************ 00:06:00.868 START TEST json_config_extra_key 00:06:00.868 ************************************ 00:06:00.868 15:45:06 json_config_extra_key -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/json_config_extra_key.sh 00:06:00.868 15:45:06 json_config_extra_key -- json_config/json_config_extra_key.sh@9 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@7 -- # uname -s 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:06:00.868 15:45:06 json_config_extra_key -- scripts/common.sh@508 -- # [[ -e 
/bin/wpdk_common.sh ]] 00:06:00.868 15:45:06 json_config_extra_key -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:06:00.868 15:45:06 json_config_extra_key -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:06:00.868 15:45:06 json_config_extra_key -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:00.868 15:45:06 json_config_extra_key -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:00.868 15:45:06 json_config_extra_key -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:00.868 15:45:06 json_config_extra_key -- paths/export.sh@5 -- # export PATH 00:06:00.868 15:45:06 json_config_extra_key -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@47 -- # : 0 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:06:00.868 15:45:06 json_config_extra_key -- nvmf/common.sh@51 -- # have_pci_nics=0 00:06:00.868 15:45:06 json_config_extra_key -- json_config/json_config_extra_key.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/common.sh 00:06:00.868 15:45:06 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # app_pid=(['target']='') 00:06:00.868 15:45:06 json_config_extra_key -- json_config/json_config_extra_key.sh@17 -- # declare -A app_pid 00:06:00.868 15:45:06 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # app_socket=(['target']='/var/tmp/spdk_tgt.sock') 00:06:00.868 15:45:06 json_config_extra_key -- json_config/json_config_extra_key.sh@18 -- # declare -A app_socket 00:06:00.868 15:45:06 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # app_params=(['target']='-m 0x1 -s 
1024') 00:06:00.868 15:45:06 json_config_extra_key -- json_config/json_config_extra_key.sh@19 -- # declare -A app_params 00:06:00.868 15:45:06 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # configs_path=(['target']='/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json') 00:06:00.868 15:45:06 json_config_extra_key -- json_config/json_config_extra_key.sh@20 -- # declare -A configs_path 00:06:00.868 15:45:06 json_config_extra_key -- json_config/json_config_extra_key.sh@22 -- # trap 'on_error_exit "${FUNCNAME}" "${LINENO}"' ERR 00:06:00.868 15:45:06 json_config_extra_key -- json_config/json_config_extra_key.sh@24 -- # echo 'INFO: launching applications...' 00:06:00.868 INFO: launching applications... 00:06:00.868 15:45:06 json_config_extra_key -- json_config/json_config_extra_key.sh@25 -- # json_config_test_start_app target --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:06:00.868 15:45:06 json_config_extra_key -- json_config/common.sh@9 -- # local app=target 00:06:00.868 15:45:06 json_config_extra_key -- json_config/common.sh@10 -- # shift 00:06:00.868 15:45:06 json_config_extra_key -- json_config/common.sh@12 -- # [[ -n 22 ]] 00:06:00.868 15:45:06 json_config_extra_key -- json_config/common.sh@13 -- # [[ -z '' ]] 00:06:00.868 15:45:06 json_config_extra_key -- json_config/common.sh@15 -- # local app_extra_params= 00:06:00.868 15:45:06 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:00.868 15:45:06 json_config_extra_key -- json_config/common.sh@16 -- # [[ 0 -eq 1 ]] 00:06:00.868 15:45:06 json_config_extra_key -- json_config/common.sh@22 -- # app_pid["$app"]=2591457 00:06:00.868 15:45:06 json_config_extra_key -- json_config/common.sh@24 -- # echo 'Waiting for target to run...' 00:06:00.868 Waiting for target to run... 
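The trace above launches `spdk_tgt` and then calls `waitforlisten`, which blocks until the target is up and accepting RPCs on /var/tmp/spdk_tgt.sock. A minimal sketch of that polling idea follows; the function name, retry count, and the socket-existence check are illustrative assumptions — SPDK's real helper also probes the RPC server itself rather than just the socket file:

```shell
# Poll until a UNIX domain socket appears (or retries run out), roughly
# what waitforlisten does for /var/tmp/spdk_tgt.sock in the trace above.
# Illustrative sketch only, not SPDK's actual implementation.
wait_for_socket() {
    local sock=$1
    local max_retries=${2:-100}
    local i=0
    while (( i < max_retries )); do
        [[ -S "$sock" ]] && return 0   # socket file exists: target is listening
        sleep 0.1
        (( ++i ))
    done
    return 1                           # gave up waiting
}
```

Usage would look like `wait_for_socket /var/tmp/spdk_tgt.sock 50 || echo 'target never came up'`.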
00:06:00.868 15:45:06 json_config_extra_key -- json_config/common.sh@25 -- # waitforlisten 2591457 /var/tmp/spdk_tgt.sock 00:06:00.868 15:45:06 json_config_extra_key -- common/autotest_common.sh@830 -- # '[' -z 2591457 ']' 00:06:00.868 15:45:06 json_config_extra_key -- json_config/common.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x1 -s 1024 -r /var/tmp/spdk_tgt.sock --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/extra_key.json 00:06:00.868 15:45:06 json_config_extra_key -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk_tgt.sock 00:06:00.868 15:45:06 json_config_extra_key -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:00.869 15:45:06 json_config_extra_key -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock...' 00:06:00.869 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk_tgt.sock... 00:06:00.869 15:45:06 json_config_extra_key -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:00.869 15:45:06 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:00.869 [2024-06-10 15:45:06.186467] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:06:00.869 [2024-06-10 15:45:06.186527] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 -m 1024 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2591457 ] 00:06:01.437 [2024-06-10 15:45:06.683383] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:01.437 [2024-06-10 15:45:06.791553] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:01.696 15:45:07 json_config_extra_key -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:06:01.696 15:45:07 json_config_extra_key -- common/autotest_common.sh@863 -- # return 0 00:06:01.696 15:45:07 json_config_extra_key -- json_config/common.sh@26 -- # echo '' 00:06:01.696 00:06:01.696 15:45:07 json_config_extra_key -- json_config/json_config_extra_key.sh@27 -- # echo 'INFO: shutting down applications...' 00:06:01.696 INFO: shutting down applications... 00:06:01.696 15:45:07 json_config_extra_key -- json_config/json_config_extra_key.sh@28 -- # json_config_test_shutdown_app target 00:06:01.696 15:45:07 json_config_extra_key -- json_config/common.sh@31 -- # local app=target 00:06:01.696 15:45:07 json_config_extra_key -- json_config/common.sh@34 -- # [[ -n 22 ]] 00:06:01.696 15:45:07 json_config_extra_key -- json_config/common.sh@35 -- # [[ -n 2591457 ]] 00:06:01.696 15:45:07 json_config_extra_key -- json_config/common.sh@38 -- # kill -SIGINT 2591457 00:06:01.696 15:45:07 json_config_extra_key -- json_config/common.sh@40 -- # (( i = 0 )) 00:06:01.696 15:45:07 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 )) 00:06:01.696 15:45:07 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 2591457 00:06:01.696 15:45:07 json_config_extra_key -- json_config/common.sh@45 -- # sleep 0.5 00:06:02.264 15:45:07 json_config_extra_key -- json_config/common.sh@40 -- # (( i++ )) 00:06:02.264 15:45:07 json_config_extra_key -- json_config/common.sh@40 -- # (( i < 30 
)) 00:06:02.264 15:45:07 json_config_extra_key -- json_config/common.sh@41 -- # kill -0 2591457 00:06:02.264 15:45:07 json_config_extra_key -- json_config/common.sh@42 -- # app_pid["$app"]= 00:06:02.264 15:45:07 json_config_extra_key -- json_config/common.sh@43 -- # break 00:06:02.264 15:45:07 json_config_extra_key -- json_config/common.sh@48 -- # [[ -n '' ]] 00:06:02.264 15:45:07 json_config_extra_key -- json_config/common.sh@53 -- # echo 'SPDK target shutdown done' 00:06:02.264 SPDK target shutdown done 00:06:02.264 15:45:07 json_config_extra_key -- json_config/json_config_extra_key.sh@30 -- # echo Success 00:06:02.264 Success 00:06:02.264 00:06:02.264 real 0m1.614s 00:06:02.264 user 0m1.208s 00:06:02.264 sys 0m0.598s 00:06:02.264 15:45:07 json_config_extra_key -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:02.264 15:45:07 json_config_extra_key -- common/autotest_common.sh@10 -- # set +x 00:06:02.264 ************************************ 00:06:02.264 END TEST json_config_extra_key 00:06:02.264 ************************************ 00:06:02.264 15:45:07 -- spdk/autotest.sh@174 -- # run_test alias_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:02.264 15:45:07 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:02.264 15:45:07 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:02.264 15:45:07 -- common/autotest_common.sh@10 -- # set +x 00:06:02.264 ************************************ 00:06:02.264 START TEST alias_rpc 00:06:02.264 ************************************ 00:06:02.264 15:45:07 alias_rpc -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc/alias_rpc.sh 00:06:02.523 * Looking for test storage... 
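The json_config_extra_key shutdown sequence traced above sends SIGINT and then probes the pid with `kill -0` up to 30 times, sleeping 0.5 s between probes. That loop can be sketched as a standalone helper; the function name and the optional signal parameter are assumptions, while the retry bound and sleep interval match the trace:

```shell
# Signal a process and poll until it exits, mirroring the
# kill -SIGINT / kill -0 / sleep 0.5 loop from json_config/common.sh
# in the trace above. Sketch only; signal defaults to SIGINT as traced.
shutdown_app() {
    local pid=$1
    local sig=${2:-SIGINT}
    local i
    kill -s "$sig" "$pid" 2>/dev/null
    for (( i = 0; i < 30; i++ )); do
        kill -0 "$pid" 2>/dev/null || return 0   # process gone: shutdown done
        sleep 0.5
    done
    return 1                                     # still alive after ~15 s
}
```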
00:06:02.523 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/alias_rpc 00:06:02.523 15:45:07 alias_rpc -- alias_rpc/alias_rpc.sh@10 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:02.523 15:45:07 alias_rpc -- alias_rpc/alias_rpc.sh@13 -- # spdk_tgt_pid=2591731 00:06:02.523 15:45:07 alias_rpc -- alias_rpc/alias_rpc.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:02.523 15:45:07 alias_rpc -- alias_rpc/alias_rpc.sh@14 -- # waitforlisten 2591731 00:06:02.523 15:45:07 alias_rpc -- common/autotest_common.sh@830 -- # '[' -z 2591731 ']' 00:06:02.523 15:45:07 alias_rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:02.523 15:45:07 alias_rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:02.523 15:45:07 alias_rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:02.523 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:02.523 15:45:07 alias_rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:02.523 15:45:07 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:02.523 [2024-06-10 15:45:07.854303] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:06:02.523 [2024-06-10 15:45:07.854345] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2591731 ] 00:06:02.523 [2024-06-10 15:45:07.941775] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:02.782 [2024-06-10 15:45:08.034868] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:03.349 15:45:08 alias_rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:06:03.349 15:45:08 alias_rpc -- common/autotest_common.sh@863 -- # return 0 00:06:03.349 15:45:08 alias_rpc -- alias_rpc/alias_rpc.sh@17 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_config -i 00:06:03.608 15:45:09 alias_rpc -- alias_rpc/alias_rpc.sh@19 -- # killprocess 2591731 00:06:03.608 15:45:09 alias_rpc -- common/autotest_common.sh@949 -- # '[' -z 2591731 ']' 00:06:03.608 15:45:09 alias_rpc -- common/autotest_common.sh@953 -- # kill -0 2591731 00:06:03.608 15:45:09 alias_rpc -- common/autotest_common.sh@954 -- # uname 00:06:03.608 15:45:09 alias_rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:06:03.608 15:45:09 alias_rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2591731 00:06:03.867 15:45:09 alias_rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:06:03.867 15:45:09 alias_rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:06:03.867 15:45:09 alias_rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2591731' 00:06:03.867 killing process with pid 2591731 00:06:03.867 15:45:09 alias_rpc -- common/autotest_common.sh@968 -- # kill 2591731 00:06:03.867 15:45:09 alias_rpc -- common/autotest_common.sh@973 -- # wait 2591731 00:06:04.125 00:06:04.125 real 0m1.780s 00:06:04.125 user 0m2.091s 00:06:04.125 sys 0m0.459s 00:06:04.125 15:45:09 alias_rpc -- common/autotest_common.sh@1125 -- # 
xtrace_disable 00:06:04.125 15:45:09 alias_rpc -- common/autotest_common.sh@10 -- # set +x 00:06:04.125 ************************************ 00:06:04.125 END TEST alias_rpc 00:06:04.125 ************************************ 00:06:04.125 15:45:09 -- spdk/autotest.sh@176 -- # [[ 0 -eq 0 ]] 00:06:04.125 15:45:09 -- spdk/autotest.sh@177 -- # run_test spdkcli_tcp /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:04.125 15:45:09 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:04.125 15:45:09 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:04.125 15:45:09 -- common/autotest_common.sh@10 -- # set +x 00:06:04.125 ************************************ 00:06:04.125 START TEST spdkcli_tcp 00:06:04.125 ************************************ 00:06:04.125 15:45:09 spdkcli_tcp -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/tcp.sh 00:06:04.125 * Looking for test storage... 00:06:04.385 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli 00:06:04.385 15:45:09 spdkcli_tcp -- spdkcli/tcp.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/common.sh 00:06:04.385 15:45:09 spdkcli_tcp -- spdkcli/common.sh@6 -- # spdkcli_job=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/spdkcli/spdkcli_job.py 00:06:04.385 15:45:09 spdkcli_tcp -- spdkcli/common.sh@7 -- # spdk_clear_config_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/json_config/clear_config.py 00:06:04.385 15:45:09 spdkcli_tcp -- spdkcli/tcp.sh@18 -- # IP_ADDRESS=127.0.0.1 00:06:04.385 15:45:09 spdkcli_tcp -- spdkcli/tcp.sh@19 -- # PORT=9998 00:06:04.385 15:45:09 spdkcli_tcp -- spdkcli/tcp.sh@21 -- # trap 'err_cleanup; exit 1' SIGINT SIGTERM EXIT 00:06:04.385 15:45:09 spdkcli_tcp -- spdkcli/tcp.sh@23 -- # timing_enter run_spdk_tgt_tcp 00:06:04.385 15:45:09 spdkcli_tcp -- common/autotest_common.sh@723 -- # xtrace_disable 00:06:04.385 15:45:09 spdkcli_tcp -- 
common/autotest_common.sh@10 -- # set +x 00:06:04.385 15:45:09 spdkcli_tcp -- spdkcli/tcp.sh@25 -- # spdk_tgt_pid=2592418 00:06:04.385 15:45:09 spdkcli_tcp -- spdkcli/tcp.sh@27 -- # waitforlisten 2592418 00:06:04.385 15:45:09 spdkcli_tcp -- spdkcli/tcp.sh@24 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -m 0x3 -p 0 00:06:04.385 15:45:09 spdkcli_tcp -- common/autotest_common.sh@830 -- # '[' -z 2592418 ']' 00:06:04.385 15:45:09 spdkcli_tcp -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:04.385 15:45:09 spdkcli_tcp -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:04.385 15:45:09 spdkcli_tcp -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:04.385 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:04.385 15:45:09 spdkcli_tcp -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:04.385 15:45:09 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:04.385 [2024-06-10 15:45:09.713822] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:06:04.385 [2024-06-10 15:45:09.713882] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2592418 ] 00:06:04.385 [2024-06-10 15:45:09.814608] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:04.644 [2024-06-10 15:45:09.913453] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:06:04.644 [2024-06-10 15:45:09.913459] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:05.211 15:45:10 spdkcli_tcp -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:06:05.211 15:45:10 spdkcli_tcp -- common/autotest_common.sh@863 -- # return 0 00:06:05.211 15:45:10 spdkcli_tcp -- spdkcli/tcp.sh@31 -- # socat_pid=2592629 00:06:05.211 15:45:10 spdkcli_tcp -- spdkcli/tcp.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -r 100 -t 2 -s 127.0.0.1 -p 9998 rpc_get_methods 00:06:05.211 15:45:10 spdkcli_tcp -- spdkcli/tcp.sh@30 -- # socat TCP-LISTEN:9998 UNIX-CONNECT:/var/tmp/spdk.sock 00:06:05.471 [ 00:06:05.471 "bdev_malloc_delete", 00:06:05.471 "bdev_malloc_create", 00:06:05.471 "bdev_null_resize", 00:06:05.471 "bdev_null_delete", 00:06:05.471 "bdev_null_create", 00:06:05.471 "bdev_nvme_cuse_unregister", 00:06:05.471 "bdev_nvme_cuse_register", 00:06:05.471 "bdev_opal_new_user", 00:06:05.471 "bdev_opal_set_lock_state", 00:06:05.471 "bdev_opal_delete", 00:06:05.471 "bdev_opal_get_info", 00:06:05.471 "bdev_opal_create", 00:06:05.471 "bdev_nvme_opal_revert", 00:06:05.471 "bdev_nvme_opal_init", 00:06:05.471 "bdev_nvme_send_cmd", 00:06:05.471 "bdev_nvme_get_path_iostat", 00:06:05.471 "bdev_nvme_get_mdns_discovery_info", 00:06:05.471 "bdev_nvme_stop_mdns_discovery", 00:06:05.471 "bdev_nvme_start_mdns_discovery", 00:06:05.471 "bdev_nvme_set_multipath_policy", 00:06:05.471 "bdev_nvme_set_preferred_path", 00:06:05.471 "bdev_nvme_get_io_paths", 
00:06:05.471 "bdev_nvme_remove_error_injection", 00:06:05.471 "bdev_nvme_add_error_injection", 00:06:05.471 "bdev_nvme_get_discovery_info", 00:06:05.471 "bdev_nvme_stop_discovery", 00:06:05.471 "bdev_nvme_start_discovery", 00:06:05.471 "bdev_nvme_get_controller_health_info", 00:06:05.471 "bdev_nvme_disable_controller", 00:06:05.471 "bdev_nvme_enable_controller", 00:06:05.471 "bdev_nvme_reset_controller", 00:06:05.471 "bdev_nvme_get_transport_statistics", 00:06:05.471 "bdev_nvme_apply_firmware", 00:06:05.471 "bdev_nvme_detach_controller", 00:06:05.471 "bdev_nvme_get_controllers", 00:06:05.471 "bdev_nvme_attach_controller", 00:06:05.471 "bdev_nvme_set_hotplug", 00:06:05.471 "bdev_nvme_set_options", 00:06:05.471 "bdev_passthru_delete", 00:06:05.471 "bdev_passthru_create", 00:06:05.471 "bdev_lvol_set_parent_bdev", 00:06:05.471 "bdev_lvol_set_parent", 00:06:05.471 "bdev_lvol_check_shallow_copy", 00:06:05.471 "bdev_lvol_start_shallow_copy", 00:06:05.471 "bdev_lvol_grow_lvstore", 00:06:05.471 "bdev_lvol_get_lvols", 00:06:05.471 "bdev_lvol_get_lvstores", 00:06:05.471 "bdev_lvol_delete", 00:06:05.471 "bdev_lvol_set_read_only", 00:06:05.471 "bdev_lvol_resize", 00:06:05.471 "bdev_lvol_decouple_parent", 00:06:05.471 "bdev_lvol_inflate", 00:06:05.471 "bdev_lvol_rename", 00:06:05.471 "bdev_lvol_clone_bdev", 00:06:05.471 "bdev_lvol_clone", 00:06:05.471 "bdev_lvol_snapshot", 00:06:05.471 "bdev_lvol_create", 00:06:05.471 "bdev_lvol_delete_lvstore", 00:06:05.471 "bdev_lvol_rename_lvstore", 00:06:05.471 "bdev_lvol_create_lvstore", 00:06:05.471 "bdev_raid_set_options", 00:06:05.471 "bdev_raid_remove_base_bdev", 00:06:05.471 "bdev_raid_add_base_bdev", 00:06:05.471 "bdev_raid_delete", 00:06:05.471 "bdev_raid_create", 00:06:05.471 "bdev_raid_get_bdevs", 00:06:05.471 "bdev_error_inject_error", 00:06:05.471 "bdev_error_delete", 00:06:05.471 "bdev_error_create", 00:06:05.471 "bdev_split_delete", 00:06:05.471 "bdev_split_create", 00:06:05.471 "bdev_delay_delete", 00:06:05.471 
"bdev_delay_create", 00:06:05.471 "bdev_delay_update_latency", 00:06:05.471 "bdev_zone_block_delete", 00:06:05.471 "bdev_zone_block_create", 00:06:05.471 "blobfs_create", 00:06:05.471 "blobfs_detect", 00:06:05.471 "blobfs_set_cache_size", 00:06:05.471 "bdev_crypto_delete", 00:06:05.471 "bdev_crypto_create", 00:06:05.471 "bdev_compress_delete", 00:06:05.471 "bdev_compress_create", 00:06:05.471 "bdev_compress_get_orphans", 00:06:05.471 "bdev_aio_delete", 00:06:05.471 "bdev_aio_rescan", 00:06:05.471 "bdev_aio_create", 00:06:05.471 "bdev_ftl_set_property", 00:06:05.471 "bdev_ftl_get_properties", 00:06:05.471 "bdev_ftl_get_stats", 00:06:05.471 "bdev_ftl_unmap", 00:06:05.471 "bdev_ftl_unload", 00:06:05.471 "bdev_ftl_delete", 00:06:05.471 "bdev_ftl_load", 00:06:05.471 "bdev_ftl_create", 00:06:05.471 "bdev_virtio_attach_controller", 00:06:05.471 "bdev_virtio_scsi_get_devices", 00:06:05.471 "bdev_virtio_detach_controller", 00:06:05.471 "bdev_virtio_blk_set_hotplug", 00:06:05.471 "bdev_iscsi_delete", 00:06:05.471 "bdev_iscsi_create", 00:06:05.471 "bdev_iscsi_set_options", 00:06:05.471 "accel_error_inject_error", 00:06:05.471 "ioat_scan_accel_module", 00:06:05.471 "dsa_scan_accel_module", 00:06:05.471 "iaa_scan_accel_module", 00:06:05.471 "dpdk_cryptodev_get_driver", 00:06:05.471 "dpdk_cryptodev_set_driver", 00:06:05.471 "dpdk_cryptodev_scan_accel_module", 00:06:05.471 "compressdev_scan_accel_module", 00:06:05.471 "keyring_file_remove_key", 00:06:05.471 "keyring_file_add_key", 00:06:05.471 "keyring_linux_set_options", 00:06:05.471 "iscsi_get_histogram", 00:06:05.471 "iscsi_enable_histogram", 00:06:05.471 "iscsi_set_options", 00:06:05.471 "iscsi_get_auth_groups", 00:06:05.471 "iscsi_auth_group_remove_secret", 00:06:05.471 "iscsi_auth_group_add_secret", 00:06:05.471 "iscsi_delete_auth_group", 00:06:05.471 "iscsi_create_auth_group", 00:06:05.471 "iscsi_set_discovery_auth", 00:06:05.471 "iscsi_get_options", 00:06:05.471 "iscsi_target_node_request_logout", 00:06:05.471 
"iscsi_target_node_set_redirect", 00:06:05.471 "iscsi_target_node_set_auth", 00:06:05.471 "iscsi_target_node_add_lun", 00:06:05.471 "iscsi_get_stats", 00:06:05.471 "iscsi_get_connections", 00:06:05.471 "iscsi_portal_group_set_auth", 00:06:05.471 "iscsi_start_portal_group", 00:06:05.471 "iscsi_delete_portal_group", 00:06:05.471 "iscsi_create_portal_group", 00:06:05.471 "iscsi_get_portal_groups", 00:06:05.471 "iscsi_delete_target_node", 00:06:05.471 "iscsi_target_node_remove_pg_ig_maps", 00:06:05.471 "iscsi_target_node_add_pg_ig_maps", 00:06:05.471 "iscsi_create_target_node", 00:06:05.471 "iscsi_get_target_nodes", 00:06:05.471 "iscsi_delete_initiator_group", 00:06:05.471 "iscsi_initiator_group_remove_initiators", 00:06:05.471 "iscsi_initiator_group_add_initiators", 00:06:05.471 "iscsi_create_initiator_group", 00:06:05.471 "iscsi_get_initiator_groups", 00:06:05.471 "nvmf_set_crdt", 00:06:05.471 "nvmf_set_config", 00:06:05.471 "nvmf_set_max_subsystems", 00:06:05.471 "nvmf_stop_mdns_prr", 00:06:05.471 "nvmf_publish_mdns_prr", 00:06:05.471 "nvmf_subsystem_get_listeners", 00:06:05.471 "nvmf_subsystem_get_qpairs", 00:06:05.471 "nvmf_subsystem_get_controllers", 00:06:05.471 "nvmf_get_stats", 00:06:05.471 "nvmf_get_transports", 00:06:05.471 "nvmf_create_transport", 00:06:05.471 "nvmf_get_targets", 00:06:05.471 "nvmf_delete_target", 00:06:05.471 "nvmf_create_target", 00:06:05.471 "nvmf_subsystem_allow_any_host", 00:06:05.471 "nvmf_subsystem_remove_host", 00:06:05.471 "nvmf_subsystem_add_host", 00:06:05.471 "nvmf_ns_remove_host", 00:06:05.471 "nvmf_ns_add_host", 00:06:05.471 "nvmf_subsystem_remove_ns", 00:06:05.471 "nvmf_subsystem_add_ns", 00:06:05.471 "nvmf_subsystem_listener_set_ana_state", 00:06:05.471 "nvmf_discovery_get_referrals", 00:06:05.471 "nvmf_discovery_remove_referral", 00:06:05.471 "nvmf_discovery_add_referral", 00:06:05.471 "nvmf_subsystem_remove_listener", 00:06:05.471 "nvmf_subsystem_add_listener", 00:06:05.471 "nvmf_delete_subsystem", 00:06:05.471 
"nvmf_create_subsystem", 00:06:05.471 "nvmf_get_subsystems", 00:06:05.471 "env_dpdk_get_mem_stats", 00:06:05.471 "nbd_get_disks", 00:06:05.471 "nbd_stop_disk", 00:06:05.471 "nbd_start_disk", 00:06:05.471 "ublk_recover_disk", 00:06:05.471 "ublk_get_disks", 00:06:05.471 "ublk_stop_disk", 00:06:05.471 "ublk_start_disk", 00:06:05.471 "ublk_destroy_target", 00:06:05.471 "ublk_create_target", 00:06:05.471 "virtio_blk_create_transport", 00:06:05.471 "virtio_blk_get_transports", 00:06:05.471 "vhost_controller_set_coalescing", 00:06:05.471 "vhost_get_controllers", 00:06:05.471 "vhost_delete_controller", 00:06:05.471 "vhost_create_blk_controller", 00:06:05.471 "vhost_scsi_controller_remove_target", 00:06:05.471 "vhost_scsi_controller_add_target", 00:06:05.471 "vhost_start_scsi_controller", 00:06:05.471 "vhost_create_scsi_controller", 00:06:05.471 "thread_set_cpumask", 00:06:05.471 "framework_get_scheduler", 00:06:05.471 "framework_set_scheduler", 00:06:05.471 "framework_get_reactors", 00:06:05.471 "thread_get_io_channels", 00:06:05.471 "thread_get_pollers", 00:06:05.471 "thread_get_stats", 00:06:05.471 "framework_monitor_context_switch", 00:06:05.471 "spdk_kill_instance", 00:06:05.471 "log_enable_timestamps", 00:06:05.471 "log_get_flags", 00:06:05.471 "log_clear_flag", 00:06:05.471 "log_set_flag", 00:06:05.471 "log_get_level", 00:06:05.471 "log_set_level", 00:06:05.471 "log_get_print_level", 00:06:05.471 "log_set_print_level", 00:06:05.471 "framework_enable_cpumask_locks", 00:06:05.471 "framework_disable_cpumask_locks", 00:06:05.471 "framework_wait_init", 00:06:05.472 "framework_start_init", 00:06:05.472 "scsi_get_devices", 00:06:05.472 "bdev_get_histogram", 00:06:05.472 "bdev_enable_histogram", 00:06:05.472 "bdev_set_qos_limit", 00:06:05.472 "bdev_set_qd_sampling_period", 00:06:05.472 "bdev_get_bdevs", 00:06:05.472 "bdev_reset_iostat", 00:06:05.472 "bdev_get_iostat", 00:06:05.472 "bdev_examine", 00:06:05.472 "bdev_wait_for_examine", 00:06:05.472 "bdev_set_options", 
00:06:05.472 "notify_get_notifications", 00:06:05.472 "notify_get_types", 00:06:05.472 "accel_get_stats", 00:06:05.472 "accel_set_options", 00:06:05.472 "accel_set_driver", 00:06:05.472 "accel_crypto_key_destroy", 00:06:05.472 "accel_crypto_keys_get", 00:06:05.472 "accel_crypto_key_create", 00:06:05.472 "accel_assign_opc", 00:06:05.472 "accel_get_module_info", 00:06:05.472 "accel_get_opc_assignments", 00:06:05.472 "vmd_rescan", 00:06:05.472 "vmd_remove_device", 00:06:05.472 "vmd_enable", 00:06:05.472 "sock_get_default_impl", 00:06:05.472 "sock_set_default_impl", 00:06:05.472 "sock_impl_set_options", 00:06:05.472 "sock_impl_get_options", 00:06:05.472 "iobuf_get_stats", 00:06:05.472 "iobuf_set_options", 00:06:05.472 "framework_get_pci_devices", 00:06:05.472 "framework_get_config", 00:06:05.472 "framework_get_subsystems", 00:06:05.472 "trace_get_info", 00:06:05.472 "trace_get_tpoint_group_mask", 00:06:05.472 "trace_disable_tpoint_group", 00:06:05.472 "trace_enable_tpoint_group", 00:06:05.472 "trace_clear_tpoint_mask", 00:06:05.472 "trace_set_tpoint_mask", 00:06:05.472 "keyring_get_keys", 00:06:05.472 "spdk_get_version", 00:06:05.472 "rpc_get_methods" 00:06:05.472 ] 00:06:05.472 15:45:10 spdkcli_tcp -- spdkcli/tcp.sh@35 -- # timing_exit run_spdk_tgt_tcp 00:06:05.472 15:45:10 spdkcli_tcp -- common/autotest_common.sh@729 -- # xtrace_disable 00:06:05.472 15:45:10 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:05.472 15:45:10 spdkcli_tcp -- spdkcli/tcp.sh@37 -- # trap - SIGINT SIGTERM EXIT 00:06:05.472 15:45:10 spdkcli_tcp -- spdkcli/tcp.sh@38 -- # killprocess 2592418 00:06:05.472 15:45:10 spdkcli_tcp -- common/autotest_common.sh@949 -- # '[' -z 2592418 ']' 00:06:05.472 15:45:10 spdkcli_tcp -- common/autotest_common.sh@953 -- # kill -0 2592418 00:06:05.472 15:45:10 spdkcli_tcp -- common/autotest_common.sh@954 -- # uname 00:06:05.472 15:45:10 spdkcli_tcp -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:06:05.472 15:45:10 spdkcli_tcp -- 
common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2592418 00:06:05.731 15:45:11 spdkcli_tcp -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:06:05.731 15:45:11 spdkcli_tcp -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:06:05.731 15:45:11 spdkcli_tcp -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2592418' 00:06:05.731 killing process with pid 2592418 00:06:05.731 15:45:11 spdkcli_tcp -- common/autotest_common.sh@968 -- # kill 2592418 00:06:05.731 15:45:11 spdkcli_tcp -- common/autotest_common.sh@973 -- # wait 2592418 00:06:05.990 00:06:05.990 real 0m1.798s 00:06:05.990 user 0m3.440s 00:06:05.990 sys 0m0.506s 00:06:05.990 15:45:11 spdkcli_tcp -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:05.990 15:45:11 spdkcli_tcp -- common/autotest_common.sh@10 -- # set +x 00:06:05.990 ************************************ 00:06:05.990 END TEST spdkcli_tcp 00:06:05.990 ************************************ 00:06:05.990 15:45:11 -- spdk/autotest.sh@180 -- # run_test dpdk_mem_utility /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:05.990 15:45:11 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:05.990 15:45:11 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:05.990 15:45:11 -- common/autotest_common.sh@10 -- # set +x 00:06:05.990 ************************************ 00:06:05.990 START TEST dpdk_mem_utility 00:06:05.990 ************************************ 00:06:05.990 15:45:11 dpdk_mem_utility -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility/test_dpdk_mem_info.sh 00:06:05.990 * Looking for test storage... 
00:06:06.250 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/dpdk_memory_utility 00:06:06.250 15:45:11 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@10 -- # MEM_SCRIPT=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:06.250 15:45:11 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@13 -- # spdkpid=2592923 00:06:06.250 15:45:11 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@15 -- # waitforlisten 2592923 00:06:06.250 15:45:11 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt 00:06:06.250 15:45:11 dpdk_mem_utility -- common/autotest_common.sh@830 -- # '[' -z 2592923 ']' 00:06:06.250 15:45:11 dpdk_mem_utility -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:06.250 15:45:11 dpdk_mem_utility -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:06.250 15:45:11 dpdk_mem_utility -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:06.250 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:06.250 15:45:11 dpdk_mem_utility -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:06.250 15:45:11 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:06.250 [2024-06-10 15:45:11.560027] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:06:06.250 [2024-06-10 15:45:11.560070] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2592923 ] 00:06:06.250 [2024-06-10 15:45:11.645898] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:06.250 [2024-06-10 15:45:11.738246] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:06.509 15:45:11 dpdk_mem_utility -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:06:06.509 15:45:11 dpdk_mem_utility -- common/autotest_common.sh@863 -- # return 0 00:06:06.509 15:45:11 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@17 -- # trap 'killprocess $spdkpid' SIGINT SIGTERM EXIT 00:06:06.509 15:45:11 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@19 -- # rpc_cmd env_dpdk_get_mem_stats 00:06:06.509 15:45:11 dpdk_mem_utility -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:06.509 15:45:11 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x 00:06:06.509 { 00:06:06.509 "filename": "/tmp/spdk_mem_dump.txt" 00:06:06.509 } 00:06:06.509 15:45:11 dpdk_mem_utility -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:06.509 15:45:11 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@21 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py 00:06:06.774 DPDK memory size 814.000000 MiB in 1 heap(s) 00:06:06.774 1 heaps totaling size 814.000000 MiB 00:06:06.774 size: 814.000000 MiB heap id: 0 00:06:06.774 end heaps---------- 00:06:06.774 8 mempools totaling size 598.116089 MiB 00:06:06.774 size: 212.674988 MiB name: PDU_immediate_data_Pool 00:06:06.774 size: 158.602051 MiB name: PDU_data_out_Pool 00:06:06.774 size: 84.521057 MiB name: bdev_io_2592923 00:06:06.775 size: 51.011292 MiB name: evtpool_2592923 00:06:06.775 size: 50.003479 MiB name: msgpool_2592923 00:06:06.775 size: 21.763794 MiB name: 
PDU_Pool 00:06:06.775 size: 19.513306 MiB name: SCSI_TASK_Pool 00:06:06.775 size: 0.026123 MiB name: Session_Pool 00:06:06.775 end mempools------- 00:06:06.775 201 memzones totaling size 4.176453 MiB 00:06:06.775 size: 1.000366 MiB name: RG_ring_0_2592923 00:06:06.775 size: 1.000366 MiB name: RG_ring_1_2592923 00:06:06.775 size: 1.000366 MiB name: RG_ring_4_2592923 00:06:06.775 size: 1.000366 MiB name: RG_ring_5_2592923 00:06:06.775 size: 0.125366 MiB name: RG_ring_2_2592923 00:06:06.775 size: 0.015991 MiB name: RG_ring_3_2592923 00:06:06.775 size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1 00:06:06.775 size: 0.000305 MiB name: 0000:1a:01.0_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1a:01.1_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1a:01.2_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1a:01.3_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1a:01.4_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1a:01.5_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1a:01.6_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1a:01.7_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1a:02.0_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1a:02.1_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1a:02.2_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1a:02.3_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1a:02.4_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1a:02.5_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1a:02.6_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1a:02.7_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1c:01.0_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1c:01.1_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1c:01.2_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1c:01.3_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1c:01.4_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1c:01.5_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1c:01.6_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1c:01.7_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1c:02.0_qat 00:06:06.775 size: 0.000305 MiB 
name: 0000:1c:02.1_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1c:02.2_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1c:02.3_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1c:02.4_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1c:02.5_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1c:02.6_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1c:02.7_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1e:01.0_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1e:01.1_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1e:01.2_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1e:01.3_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1e:01.4_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1e:01.5_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1e:01.6_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1e:01.7_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1e:02.0_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1e:02.1_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1e:02.2_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1e:02.3_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1e:02.4_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1e:02.5_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1e:02.6_qat 00:06:06.775 size: 0.000305 MiB name: 0000:1e:02.7_qat 00:06:06.775 size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_0 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_2 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_3 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_1 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_4 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_2 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_6 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_7 00:06:06.775 size: 0.000122 MiB name: 
rte_compressdev_data_3 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_8 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_9 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_4 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_10 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_11 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_5 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_12 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_13 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_6 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_14 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_15 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_7 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_16 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_17 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_8 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_18 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_19 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_9 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_20 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_21 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_10 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_22 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_23 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_11 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_24 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_25 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_12 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_26 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_27 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_13 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_28 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_29 00:06:06.775 size: 0.000122 MiB name: 
rte_compressdev_data_14 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_30 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_31 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_15 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_32 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_33 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_16 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_34 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_35 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_17 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_36 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_37 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_18 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_38 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_39 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_19 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_40 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_41 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_20 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_42 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_43 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_21 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_44 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_45 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_22 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_46 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_47 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_23 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_48 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_49 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_24 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_50 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_51 00:06:06.775 size: 0.000122 MiB 
name: rte_compressdev_data_25 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_52 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_53 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_26 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_54 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_55 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_27 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_56 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_57 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_28 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_58 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_59 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_29 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_60 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_61 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_30 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_62 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_63 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_31 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_64 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_65 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_32 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_66 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_67 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_33 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_68 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_69 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_34 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_70 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_71 00:06:06.775 size: 0.000122 MiB name: rte_compressdev_data_35 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_72 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_73 00:06:06.775 size: 0.000122 
MiB name: rte_compressdev_data_36 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_74 00:06:06.775 size: 0.000122 MiB name: rte_cryptodev_data_75 00:06:06.776 size: 0.000122 MiB name: rte_compressdev_data_37 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_76 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_77 00:06:06.776 size: 0.000122 MiB name: rte_compressdev_data_38 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_78 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_79 00:06:06.776 size: 0.000122 MiB name: rte_compressdev_data_39 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_80 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_81 00:06:06.776 size: 0.000122 MiB name: rte_compressdev_data_40 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_82 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_83 00:06:06.776 size: 0.000122 MiB name: rte_compressdev_data_41 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_84 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_85 00:06:06.776 size: 0.000122 MiB name: rte_compressdev_data_42 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_86 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_87 00:06:06.776 size: 0.000122 MiB name: rte_compressdev_data_43 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_88 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_89 00:06:06.776 size: 0.000122 MiB name: rte_compressdev_data_44 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_90 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_91 00:06:06.776 size: 0.000122 MiB name: rte_compressdev_data_45 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_92 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_93 00:06:06.776 size: 0.000122 MiB name: rte_compressdev_data_46 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_94 00:06:06.776 size: 0.000122 MiB name: rte_cryptodev_data_95 00:06:06.776 size: 
0.000122 MiB name: rte_compressdev_data_47 00:06:06.776 size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:06:06.776 end memzones------- 00:06:06.776 15:45:12 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/dpdk_mem_info.py -m 0 00:06:06.776 heap id: 0 total size: 814.000000 MiB number of busy elements: 644 number of free elements: 14 00:06:06.776 list of free elements. size: 11.780090 MiB 00:06:06.776 element at address: 0x200000400000 with size: 1.999512 MiB 00:06:06.776 element at address: 0x200018e00000 with size: 0.999878 MiB 00:06:06.776 element at address: 0x200019000000 with size: 0.999878 MiB 00:06:06.776 element at address: 0x200003e00000 with size: 0.996460 MiB 00:06:06.776 element at address: 0x200031c00000 with size: 0.994446 MiB 00:06:06.776 element at address: 0x200013800000 with size: 0.978699 MiB 00:06:06.776 element at address: 0x200007000000 with size: 0.959839 MiB 00:06:06.776 element at address: 0x200019200000 with size: 0.936584 MiB 00:06:06.776 element at address: 0x20001aa00000 with size: 0.564026 MiB 00:06:06.776 element at address: 0x200003a00000 with size: 0.494141 MiB 00:06:06.776 element at address: 0x20000b200000 with size: 0.489075 MiB 00:06:06.776 element at address: 0x200000800000 with size: 0.486145 MiB 00:06:06.776 element at address: 0x200019400000 with size: 0.485657 MiB 00:06:06.776 element at address: 0x200027e00000 with size: 0.395752 MiB 00:06:06.776 list of standard malloc elements. 
size: 199.899902 MiB 00:06:06.776 element at address: 0x20000b3fff80 with size: 132.000122 MiB 00:06:06.776 element at address: 0x2000071fff80 with size: 64.000122 MiB 00:06:06.776 element at address: 0x200018efff80 with size: 1.000122 MiB 00:06:06.776 element at address: 0x2000190fff80 with size: 1.000122 MiB 00:06:06.776 element at address: 0x2000192fff80 with size: 1.000122 MiB 00:06:06.776 element at address: 0x2000003d9f00 with size: 0.140747 MiB 00:06:06.776 element at address: 0x2000192eff00 with size: 0.062622 MiB 00:06:06.776 element at address: 0x2000003fdf80 with size: 0.007935 MiB 00:06:06.776 element at address: 0x20000032bc80 with size: 0.004395 MiB 00:06:06.776 element at address: 0x20000032f740 with size: 0.004395 MiB 00:06:06.776 element at address: 0x200000333200 with size: 0.004395 MiB 00:06:06.776 element at address: 0x200000336cc0 with size: 0.004395 MiB 00:06:06.776 element at address: 0x20000033a780 with size: 0.004395 MiB 00:06:06.776 element at address: 0x20000033e240 with size: 0.004395 MiB 00:06:06.776 element at address: 0x200000341d00 with size: 0.004395 MiB 00:06:06.776 element at address: 0x2000003457c0 with size: 0.004395 MiB 00:06:06.776 element at address: 0x200000349280 with size: 0.004395 MiB 00:06:06.776 element at address: 0x20000034cd40 with size: 0.004395 MiB 00:06:06.776 element at address: 0x200000350800 with size: 0.004395 MiB 00:06:06.776 element at address: 0x2000003542c0 with size: 0.004395 MiB 00:06:06.776 element at address: 0x200000357d80 with size: 0.004395 MiB 00:06:06.776 element at address: 0x20000035b840 with size: 0.004395 MiB 00:06:06.776 element at address: 0x20000035f300 with size: 0.004395 MiB 00:06:06.776 element at address: 0x200000362dc0 with size: 0.004395 MiB 00:06:06.776 element at address: 0x200000366880 with size: 0.004395 MiB 00:06:06.776 element at address: 0x20000036a340 with size: 0.004395 MiB 00:06:06.776 element at address: 0x20000036de00 with size: 0.004395 MiB 00:06:06.776 element at 
address: 0x2000003718c0 with size: 0.004395 MiB 00:06:06.776 element at address: 0x200000375380 with size: 0.004395 MiB 00:06:06.776 element at address: 0x200000378e40 with size: 0.004395 MiB 00:06:06.776 element at address: 0x20000037c900 with size: 0.004395 MiB 00:06:06.776 element at address: 0x2000003803c0 with size: 0.004395 MiB 00:06:06.776 element at address: 0x200000383e80 with size: 0.004395 MiB 00:06:06.776 element at address: 0x200000387940 with size: 0.004395 MiB 00:06:06.776 element at address: 0x20000038b400 with size: 0.004395 MiB 00:06:06.776 element at address: 0x20000038eec0 with size: 0.004395 MiB 00:06:06.776 element at address: 0x200000392980 with size: 0.004395 MiB 00:06:06.776 element at address: 0x200000396440 with size: 0.004395 MiB 00:06:06.776 element at address: 0x200000399f00 with size: 0.004395 MiB 00:06:06.776 element at address: 0x20000039d9c0 with size: 0.004395 MiB 00:06:06.776 element at address: 0x2000003a1480 with size: 0.004395 MiB 00:06:06.776 element at address: 0x2000003a4f40 with size: 0.004395 MiB 00:06:06.776 element at address: 0x2000003a8a00 with size: 0.004395 MiB 00:06:06.776 element at address: 0x2000003ac4c0 with size: 0.004395 MiB 00:06:06.776 element at address: 0x2000003aff80 with size: 0.004395 MiB 00:06:06.776 element at address: 0x2000003b3a40 with size: 0.004395 MiB 00:06:06.776 element at address: 0x2000003b7500 with size: 0.004395 MiB 00:06:06.776 element at address: 0x2000003bafc0 with size: 0.004395 MiB 00:06:06.776 element at address: 0x2000003bea80 with size: 0.004395 MiB 00:06:06.776 element at address: 0x2000003c2540 with size: 0.004395 MiB 00:06:06.776 element at address: 0x2000003c6000 with size: 0.004395 MiB 00:06:06.776 element at address: 0x2000003c9ac0 with size: 0.004395 MiB 00:06:06.776 element at address: 0x2000003cd580 with size: 0.004395 MiB 00:06:06.776 element at address: 0x2000003d1040 with size: 0.004395 MiB 00:06:06.776 element at address: 0x2000003d4b00 with size: 0.004395 MiB 
00:06:06.776 element at address: 0x2000003d8d00 with size: 0.004395 MiB 00:06:06.776 element at address: 0x200000329b80 with size: 0.004028 MiB 00:06:06.776 element at address: 0x20000032ac00 with size: 0.004028 MiB 00:06:06.776 element at address: 0x20000032d640 with size: 0.004028 MiB 00:06:06.776 element at address: 0x20000032e6c0 with size: 0.004028 MiB 00:06:06.776 element at address: 0x200000331100 with size: 0.004028 MiB 00:06:06.776 element at address: 0x200000332180 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000334bc0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000335c40 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000338680 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000339700 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000033c140 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000033d1c0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000033fc00 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000340c80 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003436c0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000344740 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000347180 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000348200 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000034ac40 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000034bcc0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000034e700 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000034f780 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003521c0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000353240 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000355c80 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000356d00 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000359740 with 
size: 0.004028 MiB 00:06:06.777 element at address: 0x20000035a7c0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000035d200 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000035e280 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000360cc0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000361d40 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000364780 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000365800 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000368240 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003692c0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000036bd00 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000036cd80 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000036f7c0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000370840 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000373280 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000374300 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000376d40 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000377dc0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000037a800 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000037b880 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000037e2c0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000037f340 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000381d80 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000382e00 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000385840 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003868c0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000389300 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000038a380 with size: 0.004028 MiB 00:06:06.777 element at address: 
0x20000038cdc0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000038de40 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000390880 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000391900 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000394340 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003953c0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000397e00 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000398e80 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000039b8c0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000039c940 with size: 0.004028 MiB 00:06:06.777 element at address: 0x20000039f380 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003a0400 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003a2e40 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003a3ec0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003a6900 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003a7980 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003aa3c0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003ab440 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003ade80 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003aef00 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003b1940 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003b29c0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003b5400 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003b6480 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003b8ec0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003b9f40 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003bc980 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003bda00 with size: 0.004028 MiB 00:06:06.777 
element at address: 0x2000003c0440 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003c14c0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003c3f00 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003c4f80 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003c79c0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003c8a40 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003cb480 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003cc500 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003cef40 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003cffc0 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003d2a00 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003d3a80 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003d6c00 with size: 0.004028 MiB 00:06:06.777 element at address: 0x2000003d7c80 with size: 0.004028 MiB 00:06:06.777 element at address: 0x200000200000 with size: 0.000305 MiB 00:06:06.777 element at address: 0x2000192efdc0 with size: 0.000305 MiB 00:06:06.777 element at address: 0x200000200140 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000200200 with size: 0.000183 MiB 00:06:06.777 element at address: 0x2000002002c0 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000200380 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000200440 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000200500 with size: 0.000183 MiB 00:06:06.777 element at address: 0x2000002005c0 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000200680 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000200740 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000200800 with size: 0.000183 MiB 00:06:06.777 element at address: 0x2000002008c0 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000200980 with size: 0.000183 
MiB 00:06:06.777 element at address: 0x200000200a40 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000200b00 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000200bc0 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000200c80 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000200d40 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000200e00 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000201000 with size: 0.000183 MiB 00:06:06.777 element at address: 0x2000002052c0 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000225580 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000225640 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000225700 with size: 0.000183 MiB 00:06:06.777 element at address: 0x2000002257c0 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000225880 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000225940 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000225a00 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000225ac0 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000225b80 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000225c40 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000225d00 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000225dc0 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000225e80 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000225f40 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000226000 with size: 0.000183 MiB 00:06:06.777 element at address: 0x2000002260c0 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000226180 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000226240 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000226300 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000226500 
with size: 0.000183 MiB 00:06:06.777 element at address: 0x2000002265c0 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000226680 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000226740 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000226800 with size: 0.000183 MiB 00:06:06.777 element at address: 0x2000002268c0 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000226980 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000226a40 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000226b00 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000226bc0 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000226c80 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000226d40 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000226e00 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000226ec0 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000226f80 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000227040 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000227100 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000329300 with size: 0.000183 MiB 00:06:06.777 element at address: 0x2000003293c0 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000329580 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000329640 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000329800 with size: 0.000183 MiB 00:06:06.777 element at address: 0x20000032ce80 with size: 0.000183 MiB 00:06:06.777 element at address: 0x20000032d040 with size: 0.000183 MiB 00:06:06.777 element at address: 0x20000032d100 with size: 0.000183 MiB 00:06:06.777 element at address: 0x20000032d2c0 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000330940 with size: 0.000183 MiB 00:06:06.777 element at address: 0x200000330b00 with size: 0.000183 MiB 00:06:06.777 element at 
00:06:06.777 [heap element dump condensed: several hundred repeated "element at address: 0x… with size: 0.000183 MiB" entries, addresses from 0x200000330bc0 upward, logged between 00:06:06.777 and 00:06:06.780]
00:06:06.780 element at address: 0x200027e6fa80 with size: 0.000183 MiB
00:06:06.780 element at address: 0x200027e6fb40 with size: 0.000183 MiB
00:06:06.780 element at address: 0x200027e6fc00 with size: 0.000183 MiB
00:06:06.780 element at address: 0x200027e6fcc0 with size: 0.000183 MiB
00:06:06.780 element at address: 0x200027e6fd80 with size: 0.000183 MiB
00:06:06.780 element at address: 0x200027e6fe40 with size: 0.000183 MiB
00:06:06.780 element at address: 0x200027e6ff00 with size: 0.000183 MiB
00:06:06.780 list of memzone associated elements. size: 602.320007 MiB
00:06:06.780 element at address: 0x20001aa95500 with size: 211.416748 MiB
00:06:06.780 associated memzone info: size: 211.416626 MiB name: MP_PDU_immediate_data_Pool_0
00:06:06.780 element at address: 0x200027e6ffc0 with size: 157.562561 MiB
00:06:06.780 associated memzone info: size: 157.562439 MiB name: MP_PDU_data_out_Pool_0
00:06:06.780 element at address: 0x2000139fab80 with size: 84.020630 MiB
00:06:06.780 associated memzone info: size: 84.020508 MiB name: MP_bdev_io_2592923_0
00:06:06.780 element at address: 0x2000009ff380 with size: 48.003052 MiB
00:06:06.780 associated memzone info: size: 48.002930 MiB name: MP_evtpool_2592923_0
00:06:06.780 element at address: 0x200003fff380 with size: 48.003052 MiB
00:06:06.780 associated memzone info: size: 48.002930 MiB name: MP_msgpool_2592923_0
00:06:06.780 element at address: 0x2000195be940 with size: 20.255554 MiB
00:06:06.780 associated memzone info: size: 20.255432 MiB name: MP_PDU_Pool_0
00:06:06.780 element at address: 0x200031dfeb40 with size: 18.005066 MiB
00:06:06.780 associated memzone info: size: 18.004944 MiB name: MP_SCSI_TASK_Pool_0
00:06:06.780 element at address: 0x2000005ffe00 with size: 2.000488 MiB
00:06:06.780 associated memzone info: size: 2.000366 MiB name: RG_MP_evtpool_2592923
00:06:06.780 element at address: 0x200003bffe00 with size: 2.000488 MiB
00:06:06.780 associated memzone info: size: 2.000366 MiB name: RG_MP_msgpool_2592923
00:06:06.781 element at address: 0x2000002271c0 with size: 1.008118 MiB
00:06:06.781 associated memzone info: size: 1.007996 MiB name: MP_evtpool_2592923
00:06:06.781 element at address: 0x20000b2fde40 with size: 1.008118 MiB
00:06:06.781 associated memzone info: size: 1.007996 MiB name: MP_PDU_Pool
00:06:06.781 element at address: 0x2000194bc800 with size: 1.008118 MiB
00:06:06.781 associated memzone info: size: 1.007996 MiB name: MP_PDU_immediate_data_Pool
00:06:06.781 element at address: 0x2000070fde40 with size: 1.008118 MiB
00:06:06.781 associated memzone info: size: 1.007996 MiB name: MP_PDU_data_out_Pool
00:06:06.781 element at address: 0x2000008fd240 with size: 1.008118 MiB
00:06:06.781 associated memzone info: size: 1.007996 MiB name: MP_SCSI_TASK_Pool
00:06:06.781 element at address: 0x200003eff180 with size: 1.000488 MiB
00:06:06.781 associated memzone info: size: 1.000366 MiB name: RG_ring_0_2592923
00:06:06.781 element at address: 0x200003affc00 with size: 1.000488 MiB
00:06:06.781 associated memzone info: size: 1.000366 MiB name: RG_ring_1_2592923
00:06:06.781 element at address: 0x2000138fa980 with size: 1.000488 MiB
00:06:06.781 associated memzone info: size: 1.000366 MiB name: RG_ring_4_2592923
00:06:06.781 element at address: 0x200031cfe940 with size: 1.000488 MiB
00:06:06.781 associated memzone info: size: 1.000366 MiB name: RG_ring_5_2592923
00:06:06.781 element at address: 0x200003a7fa00 with size: 0.500488 MiB
00:06:06.781 associated memzone info: size: 0.500366 MiB name: RG_MP_bdev_io_2592923
00:06:06.781 element at address: 0x20000b27db80 with size: 0.500488 MiB
00:06:06.781 associated memzone info: size: 0.500366 MiB name: RG_MP_PDU_Pool
00:06:06.781 element at address: 0x20000087cf80 with size: 0.500488 MiB
00:06:06.781 associated memzone info: size: 0.500366 MiB name: RG_MP_SCSI_TASK_Pool
00:06:06.781 element at address: 0x20001947c540 with size: 0.250488 MiB
00:06:06.781 associated memzone info: size: 0.250366 MiB name: RG_MP_PDU_immediate_data_Pool
00:06:06.781 element at address: 0x200000205380 with size: 0.125488 MiB
00:06:06.781 associated memzone info: size: 0.125366 MiB name: RG_ring_2_2592923
00:06:06.781 element at address: 0x2000070f5b80 with size: 0.031738 MiB
00:06:06.781 associated memzone info: size: 0.031616 MiB name: RG_MP_PDU_data_out_Pool
00:06:06.781 element at address: 0x200027e65680 with size: 0.023743 MiB
00:06:06.781 associated memzone info: size: 0.023621 MiB name: MP_Session_Pool_0
00:06:06.781 element at address: 0x2000002010c0 with size: 0.016113 MiB
00:06:06.781 associated memzone info: size: 0.015991 MiB name: RG_ring_3_2592923
00:06:06.781 element at address: 0x200027e6b7c0 with size: 0.002441 MiB
00:06:06.781 associated memzone info: size: 0.002319 MiB name: RG_MP_Session_Pool
00:06:06.781 element at address: 0x2000003d5f80 with size: 0.001282 MiB
00:06:06.781 associated memzone info: size: 0.001160 MiB name: QAT_SYM_CAPA_GEN_1
00:06:06.781 element at address: 0x2000003d6a40 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.0_qat
00:06:06.781 element at address: 0x2000003d2840 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.1_qat
00:06:06.781 element at address: 0x2000003ced80 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.2_qat
00:06:06.781 element at address: 0x2000003cb2c0 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.3_qat
00:06:06.781 element at address: 0x2000003c7800 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.4_qat
00:06:06.781 element at address: 0x2000003c3d40 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.5_qat
00:06:06.781 element at address: 0x2000003c0280 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.6_qat
00:06:06.781 element at address: 0x2000003bc7c0 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1a:01.7_qat
00:06:06.781 element at address: 0x2000003b8d00 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.0_qat
00:06:06.781 element at address: 0x2000003b5240 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.1_qat
00:06:06.781 element at address: 0x2000003b1780 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.2_qat
00:06:06.781 element at address: 0x2000003adcc0 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.3_qat
00:06:06.781 element at address: 0x2000003aa200 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.4_qat
00:06:06.781 element at address: 0x2000003a6740 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.5_qat
00:06:06.781 element at address: 0x2000003a2c80 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.6_qat
00:06:06.781 element at address: 0x20000039f1c0 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1a:02.7_qat
00:06:06.781 element at address: 0x20000039b700 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.0_qat
00:06:06.781 element at address: 0x200000397c40 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.1_qat
00:06:06.781 element at address: 0x200000394180 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.2_qat
00:06:06.781 element at address: 0x2000003906c0 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.3_qat
00:06:06.781 element at address: 0x20000038cc00 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.4_qat
00:06:06.781 element at address: 0x200000389140 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.5_qat
00:06:06.781 element at address: 0x200000385680 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.6_qat
00:06:06.781 element at address: 0x200000381bc0 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1c:01.7_qat
00:06:06.781 element at address: 0x20000037e100 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.0_qat
00:06:06.781 element at address: 0x20000037a640 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.1_qat
00:06:06.781 element at address: 0x200000376b80 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.2_qat
00:06:06.781 element at address: 0x2000003730c0 with size: 0.000427 MiB
00:06:06.781 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.3_qat
00:06:06.782 element at address: 0x20000036f600 with size: 0.000427 MiB
00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.4_qat
00:06:06.782 element at address: 0x20000036bb40 with size: 0.000427 MiB
00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.5_qat
00:06:06.782 element at address: 0x200000368080 with size: 0.000427 MiB
00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.6_qat
00:06:06.782 element at address: 0x2000003645c0 with size: 0.000427 MiB
00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1c:02.7_qat
00:06:06.782 element at address: 0x200000360b00 with size: 0.000427 MiB
00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.0_qat
00:06:06.782 element at address: 0x20000035d040 with size: 0.000427 MiB
00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.1_qat
00:06:06.782 element at address: 0x200000359580 with size: 0.000427 MiB
00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.2_qat
00:06:06.782 element at address: 0x200000355ac0 with size: 0.000427 MiB
00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.3_qat
00:06:06.782 element at address: 0x200000352000 with size: 0.000427 MiB
00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.4_qat
00:06:06.782 element at address: 0x20000034e540 with size: 0.000427 MiB
00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.5_qat
00:06:06.782 element at address: 0x20000034aa80 with size: 0.000427 MiB
00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.6_qat
00:06:06.782 element at address: 0x200000346fc0 with size: 0.000427 MiB
00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1e:01.7_qat
00:06:06.782 element at address: 0x200000343500 with size: 0.000427 MiB
00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.0_qat
00:06:06.782 element at address: 0x20000033fa40 with size: 0.000427 MiB
00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.1_qat
00:06:06.782 element at address: 0x20000033bf80 with size: 0.000427 MiB
00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.2_qat
00:06:06.782 element at address: 0x2000003384c0 with size: 0.000427 MiB
00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.3_qat
00:06:06.782 element at address: 0x200000334a00 with size: 0.000427 MiB
00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.4_qat
00:06:06.782 element at address: 0x200000330f40 with size: 0.000427 MiB
00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.5_qat
00:06:06.782 element at
address: 0x20000032d480 with size: 0.000427 MiB 00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.6_qat 00:06:06.782 element at address: 0x2000003299c0 with size: 0.000427 MiB 00:06:06.782 associated memzone info: size: 0.000305 MiB name: 0000:1e:02.7_qat 00:06:06.782 element at address: 0x2000003d6740 with size: 0.000305 MiB 00:06:06.782 associated memzone info: size: 0.000183 MiB name: QAT_ASYM_CAPA_GEN_1 00:06:06.782 element at address: 0x2000002263c0 with size: 0.000305 MiB 00:06:06.782 associated memzone info: size: 0.000183 MiB name: MP_msgpool_2592923 00:06:06.782 element at address: 0x200000200ec0 with size: 0.000305 MiB 00:06:06.782 associated memzone info: size: 0.000183 MiB name: MP_bdev_io_2592923 00:06:06.782 element at address: 0x200027e6c280 with size: 0.000305 MiB 00:06:06.782 associated memzone info: size: 0.000183 MiB name: MP_Session_Pool 00:06:06.782 element at address: 0x2000003d6940 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_0 00:06:06.782 element at address: 0x2000003d6640 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_1 00:06:06.782 element at address: 0x2000003d5e80 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_0 00:06:06.782 element at address: 0x2000003d2740 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_2 00:06:06.782 element at address: 0x2000003d2580 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_3 00:06:06.782 element at address: 0x2000003d2300 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_1 00:06:06.782 element at address: 0x2000003cec80 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_4 
00:06:06.782 element at address: 0x2000003ceac0 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_5 00:06:06.782 element at address: 0x2000003ce840 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_2 00:06:06.782 element at address: 0x2000003cb1c0 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_6 00:06:06.782 element at address: 0x2000003cb000 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_7 00:06:06.782 element at address: 0x2000003cad80 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_3 00:06:06.782 element at address: 0x2000003c7700 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_8 00:06:06.782 element at address: 0x2000003c7540 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_9 00:06:06.782 element at address: 0x2000003c72c0 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_4 00:06:06.782 element at address: 0x2000003c3c40 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_10 00:06:06.782 element at address: 0x2000003c3a80 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_11 00:06:06.782 element at address: 0x2000003c3800 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_5 00:06:06.782 element at address: 0x2000003c0180 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_12 00:06:06.782 element at address: 0x2000003bffc0 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 
0.000122 MiB name: rte_cryptodev_data_13 00:06:06.782 element at address: 0x2000003bfd40 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_6 00:06:06.782 element at address: 0x2000003bc6c0 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_14 00:06:06.782 element at address: 0x2000003bc500 with size: 0.000244 MiB 00:06:06.782 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_15 00:06:06.783 element at address: 0x2000003bc280 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_7 00:06:06.783 element at address: 0x2000003b8c00 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_16 00:06:06.783 element at address: 0x2000003b8a40 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_17 00:06:06.783 element at address: 0x2000003b87c0 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_8 00:06:06.783 element at address: 0x2000003b5140 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_18 00:06:06.783 element at address: 0x2000003b4f80 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_19 00:06:06.783 element at address: 0x2000003b4d00 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_9 00:06:06.783 element at address: 0x2000003b1680 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_20 00:06:06.783 element at address: 0x2000003b14c0 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_21 00:06:06.783 element at address: 0x2000003b1240 with size: 0.000244 
MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_10 00:06:06.783 element at address: 0x2000003adbc0 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_22 00:06:06.783 element at address: 0x2000003ada00 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_23 00:06:06.783 element at address: 0x2000003ad780 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_11 00:06:06.783 element at address: 0x2000003aa100 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_24 00:06:06.783 element at address: 0x2000003a9f40 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_25 00:06:06.783 element at address: 0x2000003a9cc0 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_12 00:06:06.783 element at address: 0x2000003a6640 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_26 00:06:06.783 element at address: 0x2000003a6480 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_27 00:06:06.783 element at address: 0x2000003a6200 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_13 00:06:06.783 element at address: 0x2000003a2b80 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_28 00:06:06.783 element at address: 0x2000003a29c0 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_29 00:06:06.783 element at address: 0x2000003a2740 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_14 00:06:06.783 
element at address: 0x20000039f0c0 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_30 00:06:06.783 element at address: 0x20000039ef00 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_31 00:06:06.783 element at address: 0x20000039ec80 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_15 00:06:06.783 element at address: 0x20000039b600 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_32 00:06:06.783 element at address: 0x20000039b440 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_33 00:06:06.783 element at address: 0x20000039b1c0 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_16 00:06:06.783 element at address: 0x200000397b40 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_34 00:06:06.783 element at address: 0x200000397980 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_35 00:06:06.783 element at address: 0x200000397700 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_17 00:06:06.783 element at address: 0x200000394080 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_36 00:06:06.783 element at address: 0x200000393ec0 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_37 00:06:06.783 element at address: 0x200000393c40 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_18 00:06:06.783 element at address: 0x2000003905c0 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 
0.000122 MiB name: rte_cryptodev_data_38 00:06:06.783 element at address: 0x200000390400 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_39 00:06:06.783 element at address: 0x200000390180 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_19 00:06:06.783 element at address: 0x20000038cb00 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_40 00:06:06.783 element at address: 0x20000038c940 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_41 00:06:06.783 element at address: 0x20000038c6c0 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_20 00:06:06.783 element at address: 0x200000389040 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_42 00:06:06.783 element at address: 0x200000388e80 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_43 00:06:06.783 element at address: 0x200000388c00 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_21 00:06:06.783 element at address: 0x200000385580 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_44 00:06:06.783 element at address: 0x2000003853c0 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_45 00:06:06.783 element at address: 0x200000385140 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_22 00:06:06.783 element at address: 0x200000381ac0 with size: 0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_46 00:06:06.783 element at address: 0x200000381900 with size: 
0.000244 MiB 00:06:06.783 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_47 00:06:06.783 element at address: 0x200000381680 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_23 00:06:06.784 element at address: 0x20000037e000 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_48 00:06:06.784 element at address: 0x20000037de40 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_49 00:06:06.784 element at address: 0x20000037dbc0 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_24 00:06:06.784 element at address: 0x20000037a540 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_50 00:06:06.784 element at address: 0x20000037a380 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_51 00:06:06.784 element at address: 0x20000037a100 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_25 00:06:06.784 element at address: 0x200000376a80 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_52 00:06:06.784 element at address: 0x2000003768c0 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_53 00:06:06.784 element at address: 0x200000376640 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_26 00:06:06.784 element at address: 0x200000372fc0 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_54 00:06:06.784 element at address: 0x200000372e00 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_55 
00:06:06.784 element at address: 0x200000372b80 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_27 00:06:06.784 element at address: 0x20000036f500 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_56 00:06:06.784 element at address: 0x20000036f340 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_57 00:06:06.784 element at address: 0x20000036f0c0 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_28 00:06:06.784 element at address: 0x20000036ba40 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_58 00:06:06.784 element at address: 0x20000036b880 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_59 00:06:06.784 element at address: 0x20000036b600 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_29 00:06:06.784 element at address: 0x200000367f80 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_60 00:06:06.784 element at address: 0x200000367dc0 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_61 00:06:06.784 element at address: 0x200000367b40 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_30 00:06:06.784 element at address: 0x2000003644c0 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_62 00:06:06.784 element at address: 0x200000364300 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_63 00:06:06.784 element at address: 0x200000364080 with size: 0.000244 MiB 00:06:06.784 associated memzone 
info: size: 0.000122 MiB name: rte_compressdev_data_31 00:06:06.784 element at address: 0x200000360a00 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_64 00:06:06.784 element at address: 0x200000360840 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_65 00:06:06.784 element at address: 0x2000003605c0 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_32 00:06:06.784 element at address: 0x20000035cf40 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_66 00:06:06.784 element at address: 0x20000035cd80 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_67 00:06:06.784 element at address: 0x20000035cb00 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_33 00:06:06.784 element at address: 0x200000359480 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_68 00:06:06.784 element at address: 0x2000003592c0 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_69 00:06:06.784 element at address: 0x200000359040 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_34 00:06:06.784 element at address: 0x2000003559c0 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_70 00:06:06.784 element at address: 0x200000355800 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_71 00:06:06.784 element at address: 0x200000355580 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_35 00:06:06.784 element at address: 0x200000351f00 with 
size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_72 00:06:06.784 element at address: 0x200000351d40 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_73 00:06:06.784 element at address: 0x200000351ac0 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_36 00:06:06.784 element at address: 0x20000034e440 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_74 00:06:06.784 element at address: 0x20000034e280 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_75 00:06:06.784 element at address: 0x20000034e000 with size: 0.000244 MiB 00:06:06.784 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_37 00:06:06.785 element at address: 0x20000034a980 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_76 00:06:06.785 element at address: 0x20000034a7c0 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_77 00:06:06.785 element at address: 0x20000034a540 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_38 00:06:06.785 element at address: 0x200000346ec0 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_78 00:06:06.785 element at address: 0x200000346d00 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_79 00:06:06.785 element at address: 0x200000346a80 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_39 00:06:06.785 element at address: 0x200000343400 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_80 
00:06:06.785 element at address: 0x200000343240 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_81 00:06:06.785 element at address: 0x200000342fc0 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_40 00:06:06.785 element at address: 0x20000033f940 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_82 00:06:06.785 element at address: 0x20000033f780 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_83 00:06:06.785 element at address: 0x20000033f500 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_41 00:06:06.785 element at address: 0x20000033be80 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_84 00:06:06.785 element at address: 0x20000033bcc0 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_85 00:06:06.785 element at address: 0x20000033ba40 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_42 00:06:06.785 element at address: 0x2000003383c0 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_86 00:06:06.785 element at address: 0x200000338200 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_87 00:06:06.785 element at address: 0x200000337f80 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_43 00:06:06.785 element at address: 0x200000334900 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_88 00:06:06.785 element at address: 0x200000334740 with size: 0.000244 MiB 00:06:06.785 associated memzone 
info: size: 0.000122 MiB name: rte_cryptodev_data_89 00:06:06.785 element at address: 0x2000003344c0 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_44 00:06:06.785 element at address: 0x200000330e40 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_90 00:06:06.785 element at address: 0x200000330c80 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_91 00:06:06.785 element at address: 0x200000330a00 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_45 00:06:06.785 element at address: 0x20000032d380 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_92 00:06:06.785 element at address: 0x20000032d1c0 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_93 00:06:06.785 element at address: 0x20000032cf40 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_46 00:06:06.785 element at address: 0x2000003298c0 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_94 00:06:06.785 element at address: 0x200000329700 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_cryptodev_data_95 00:06:06.785 element at address: 0x200000329480 with size: 0.000244 MiB 00:06:06.785 associated memzone info: size: 0.000122 MiB name: rte_compressdev_data_47 00:06:06.785 element at address: 0x2000003d5d00 with size: 0.000183 MiB 00:06:06.785 associated memzone info: size: 0.000061 MiB name: QAT_COMP_CAPA_GEN_1 00:06:06.785 15:45:12 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@25 -- # trap - SIGINT SIGTERM EXIT 00:06:06.785 15:45:12 dpdk_mem_utility -- dpdk_memory_utility/test_dpdk_mem_info.sh@26 -- # 
killprocess 2592923
00:06:06.785 15:45:12 dpdk_mem_utility -- common/autotest_common.sh@949 -- # '[' -z 2592923 ']'
00:06:06.785 15:45:12 dpdk_mem_utility -- common/autotest_common.sh@953 -- # kill -0 2592923
00:06:06.785 15:45:12 dpdk_mem_utility -- common/autotest_common.sh@954 -- # uname
00:06:06.785 15:45:12 dpdk_mem_utility -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']'
00:06:06.785 15:45:12 dpdk_mem_utility -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2592923
00:06:06.785 15:45:12 dpdk_mem_utility -- common/autotest_common.sh@955 -- # process_name=reactor_0
00:06:06.785 15:45:12 dpdk_mem_utility -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']'
00:06:06.785 15:45:12 dpdk_mem_utility -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2592923'
00:06:06.785 killing process with pid 2592923
00:06:06.785 15:45:12 dpdk_mem_utility -- common/autotest_common.sh@968 -- # kill 2592923
00:06:06.785 15:45:12 dpdk_mem_utility -- common/autotest_common.sh@973 -- # wait 2592923
00:06:07.357
00:06:07.357 real 0m1.146s
00:06:07.357 user 0m1.202s
00:06:07.357 sys 0m0.426s
00:06:07.357 15:45:12 dpdk_mem_utility -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:07.357 15:45:12 dpdk_mem_utility -- common/autotest_common.sh@10 -- # set +x
00:06:07.357 ************************************
00:06:07.357 END TEST dpdk_mem_utility
00:06:07.357 ************************************
00:06:07.357 15:45:12 -- spdk/autotest.sh@181 -- # run_test event /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh
00:06:07.357 15:45:12 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']'
00:06:07.357 15:45:12 -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:07.357 15:45:12 -- common/autotest_common.sh@10 -- # set +x
00:06:07.357 ************************************
00:06:07.357 START TEST event
00:06:07.357 ************************************
00:06:07.357 15:45:12 event -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event.sh
00:06:07.357 * Looking for test storage...
00:06:07.357 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event
00:06:07.357 15:45:12 event -- event/event.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh
00:06:07.357 15:45:12 event -- bdev/nbd_common.sh@6 -- # set -e
00:06:07.357 15:45:12 event -- event/event.sh@45 -- # run_test event_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:06:07.357 15:45:12 event -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']'
00:06:07.357 15:45:12 event -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:07.357 15:45:12 event -- common/autotest_common.sh@10 -- # set +x
00:06:07.357 ************************************
00:06:07.357 START TEST event_perf
00:06:07.357 ************************************
00:06:07.357 15:45:12 event.event_perf -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/event_perf/event_perf -m 0xF -t 1
00:06:07.357 Running I/O for 1 seconds...[2024-06-10 15:45:12.772166] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization...
00:06:07.357 [2024-06-10 15:45:12.772234] [ DPDK EAL parameters: event_perf --no-shconf -c 0xF --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2593202 ]
00:06:07.616 [2024-06-10 15:45:12.870219] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4
00:06:07.616 [2024-06-10 15:45:12.966633] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1
00:06:07.616 [2024-06-10 15:45:12.966729] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2
00:06:07.616 [2024-06-10 15:45:12.966836] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3
00:06:07.616 [2024-06-10 15:45:12.966837] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:06:08.553 Running I/O for 1 seconds...
00:06:08.553 lcore 0: 164208
00:06:08.553 lcore 1: 164206
00:06:08.553 lcore 2: 164206
00:06:08.553 lcore 3: 164209
00:06:08.553 done.
00:06:08.553
00:06:08.553 real 0m1.304s
00:06:08.553 user 0m4.188s
00:06:08.553 sys 0m0.110s
00:06:08.553 15:45:14 event.event_perf -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:08.553 15:45:14 event.event_perf -- common/autotest_common.sh@10 -- # set +x
00:06:08.553 ************************************
00:06:08.553 END TEST event_perf
00:06:08.553 ************************************
00:06:08.812 15:45:14 event -- event/event.sh@46 -- # run_test event_reactor /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:06:08.812 15:45:14 event -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']'
00:06:08.812 15:45:14 event -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:08.812 15:45:14 event -- common/autotest_common.sh@10 -- # set +x
00:06:08.812 ************************************
00:06:08.812 START TEST event_reactor
00:06:08.812 ************************************
00:06:08.812 15:45:14 event.event_reactor -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor/reactor -t 1
00:06:08.812 [2024-06-10 15:45:14.138472] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization...
00:06:08.812 [2024-06-10 15:45:14.138523] [ DPDK EAL parameters: reactor --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2593455 ]
00:06:08.812 [2024-06-10 15:45:14.224942] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:06:08.812 [2024-06-10 15:45:14.312705] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:06:10.190 test_start
00:06:10.190 oneshot
00:06:10.190 tick 100
00:06:10.190 tick 100
00:06:10.190 tick 250
00:06:10.190 tick 100
00:06:10.190 tick 100
00:06:10.190 tick 250
00:06:10.190 tick 100
00:06:10.190 tick 500
00:06:10.190 tick 100
00:06:10.190 tick 100
00:06:10.190 tick 250
00:06:10.190 tick 100
00:06:10.190 tick 100
00:06:10.190 test_end
00:06:10.190
00:06:10.190 real 0m1.280s
00:06:10.190 user 0m1.180s
00:06:10.190 sys 0m0.094s
00:06:10.190 15:45:15 event.event_reactor -- common/autotest_common.sh@1125 -- # xtrace_disable
00:06:10.190 15:45:15 event.event_reactor -- common/autotest_common.sh@10 -- # set +x
00:06:10.190 ************************************
00:06:10.190 END TEST event_reactor
00:06:10.190 ************************************
00:06:10.190 15:45:15 event -- event/event.sh@47 -- # run_test event_reactor_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1
00:06:10.190 15:45:15 event -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']'
00:06:10.190 15:45:15 event -- common/autotest_common.sh@1106 -- # xtrace_disable
00:06:10.190 15:45:15 event -- common/autotest_common.sh@10 -- # set +x
00:06:10.190 ************************************
00:06:10.190 START TEST event_reactor_perf
00:06:10.190 ************************************
00:06:10.190 15:45:15 event.event_reactor_perf -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/reactor_perf/reactor_perf -t 1 00:06:10.190 [2024-06-10 15:45:15.491009] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:06:10.190 [2024-06-10 15:45:15.491077] [ DPDK EAL parameters: reactor_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2593700 ] 00:06:10.190 [2024-06-10 15:45:15.590902] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:10.190 [2024-06-10 15:45:15.678378] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.569 test_start 00:06:11.569 test_end 00:06:11.569 Performance: 298988 events per second 00:06:11.569 00:06:11.569 real 0m1.296s 00:06:11.569 user 0m1.183s 00:06:11.569 sys 0m0.106s 00:06:11.569 15:45:16 event.event_reactor_perf -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:11.569 15:45:16 event.event_reactor_perf -- common/autotest_common.sh@10 -- # set +x 00:06:11.569 ************************************ 00:06:11.569 END TEST event_reactor_perf 00:06:11.569 ************************************ 00:06:11.569 15:45:16 event -- event/event.sh@49 -- # uname -s 00:06:11.569 15:45:16 event -- event/event.sh@49 -- # '[' Linux = Linux ']' 00:06:11.569 15:45:16 event -- event/event.sh@50 -- # run_test event_scheduler /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:11.569 15:45:16 event -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:11.569 15:45:16 event -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:11.569 15:45:16 event -- common/autotest_common.sh@10 -- # set +x 00:06:11.569 ************************************ 00:06:11.569 START TEST event_scheduler 00:06:11.569 ************************************ 00:06:11.569 
15:45:16 event.event_scheduler -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler.sh 00:06:11.569 * Looking for test storage... 00:06:11.569 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler 00:06:11.569 15:45:16 event.event_scheduler -- scheduler/scheduler.sh@29 -- # rpc=rpc_cmd 00:06:11.569 15:45:16 event.event_scheduler -- scheduler/scheduler.sh@35 -- # scheduler_pid=2593972 00:06:11.569 15:45:16 event.event_scheduler -- scheduler/scheduler.sh@36 -- # trap 'killprocess $scheduler_pid; exit 1' SIGINT SIGTERM EXIT 00:06:11.569 15:45:16 event.event_scheduler -- scheduler/scheduler.sh@37 -- # waitforlisten 2593972 00:06:11.569 15:45:16 event.event_scheduler -- common/autotest_common.sh@830 -- # '[' -z 2593972 ']' 00:06:11.569 15:45:16 event.event_scheduler -- scheduler/scheduler.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/scheduler/scheduler -m 0xF -p 0x2 --wait-for-rpc -f 00:06:11.569 15:45:16 event.event_scheduler -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:11.569 15:45:16 event.event_scheduler -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:11.569 15:45:16 event.event_scheduler -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:11.569 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:06:11.569 15:45:16 event.event_scheduler -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:11.569 15:45:16 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:11.569 [2024-06-10 15:45:16.978695] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:06:11.569 [2024-06-10 15:45:16.978756] [ DPDK EAL parameters: scheduler --no-shconf -c 0xF --main-lcore=2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2593972 ] 00:06:11.569 [2024-06-10 15:45:17.051889] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:06:11.829 [2024-06-10 15:45:17.130436] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:11.829 [2024-06-10 15:45:17.130525] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:06:11.829 [2024-06-10 15:45:17.130634] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:06:11.829 [2024-06-10 15:45:17.130635] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:06:11.829 15:45:17 event.event_scheduler -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:06:11.829 15:45:17 event.event_scheduler -- common/autotest_common.sh@863 -- # return 0 00:06:11.829 15:45:17 event.event_scheduler -- scheduler/scheduler.sh@39 -- # rpc_cmd framework_set_scheduler dynamic 00:06:11.829 15:45:17 event.event_scheduler -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:11.829 15:45:17 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:11.829 POWER: Env isn't set yet! 00:06:11.829 POWER: Attempting to initialise ACPI cpufreq power management... 00:06:11.829 POWER: Failed to write /sys/devices/system/cpu/cpu%u/cpufreq/scaling_governor 00:06:11.829 POWER: Cannot set governor of lcore 0 to userspace 00:06:11.829 POWER: Attempting to initialise PSTAT power management... 
00:06:11.829 POWER: Power management governor of lcore 0 has been set to 'performance' successfully 00:06:11.829 POWER: Initialized successfully for lcore 0 power management 00:06:11.829 POWER: Power management governor of lcore 1 has been set to 'performance' successfully 00:06:11.829 POWER: Initialized successfully for lcore 1 power management 00:06:11.829 POWER: Power management governor of lcore 2 has been set to 'performance' successfully 00:06:11.829 POWER: Initialized successfully for lcore 2 power management 00:06:11.829 POWER: Power management governor of lcore 3 has been set to 'performance' successfully 00:06:11.829 POWER: Initialized successfully for lcore 3 power management 00:06:11.829 [2024-06-10 15:45:17.222094] scheduler_dynamic.c: 382:set_opts: *NOTICE*: Setting scheduler load limit to 20 00:06:11.829 [2024-06-10 15:45:17.222108] scheduler_dynamic.c: 384:set_opts: *NOTICE*: Setting scheduler core limit to 80 00:06:11.829 [2024-06-10 15:45:17.222117] scheduler_dynamic.c: 386:set_opts: *NOTICE*: Setting scheduler core busy to 95 00:06:11.829 15:45:17 event.event_scheduler -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:11.829 15:45:17 event.event_scheduler -- scheduler/scheduler.sh@40 -- # rpc_cmd framework_start_init 00:06:11.829 15:45:17 event.event_scheduler -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:11.829 15:45:17 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:11.829 [2024-06-10 15:45:17.308433] scheduler.c: 382:test_start: *NOTICE*: Scheduler test application started. 
00:06:11.829 15:45:17 event.event_scheduler -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:11.829 15:45:17 event.event_scheduler -- scheduler/scheduler.sh@43 -- # run_test scheduler_create_thread scheduler_create_thread 00:06:11.829 15:45:17 event.event_scheduler -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:11.829 15:45:17 event.event_scheduler -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:11.829 15:45:17 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:12.088 ************************************ 00:06:12.088 START TEST scheduler_create_thread 00:06:12.088 ************************************ 00:06:12.088 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1124 -- # scheduler_create_thread 00:06:12.088 15:45:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@12 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x1 -a 100 00:06:12.088 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:12.088 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:12.088 2 00:06:12.088 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:12.088 15:45:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@13 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x2 -a 100 00:06:12.088 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:12.088 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:12.088 3 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- 
scheduler/scheduler.sh@14 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x4 -a 100 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:12.089 4 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@15 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n active_pinned -m 0x8 -a 100 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:12.089 5 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@16 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x1 -a 0 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:12.089 6 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@17 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x2 -a 0 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- 
common/autotest_common.sh@10 -- # set +x 00:06:12.089 7 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@18 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x4 -a 0 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:12.089 8 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@19 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n idle_pinned -m 0x8 -a 0 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:12.089 9 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@21 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n one_third_active -a 30 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:12.089 10 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n 
half_active -a 0 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@22 -- # thread_id=11 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@23 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_set_active 11 50 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_create -n deleted -a 100 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:12.089 15:45:17 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:13.465 15:45:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:13.465 15:45:18 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@25 -- # thread_id=12 00:06:13.465 15:45:18 event.event_scheduler.scheduler_create_thread -- scheduler/scheduler.sh@26 -- # rpc_cmd --plugin scheduler_plugin scheduler_thread_delete 12 00:06:13.465 15:45:18 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:13.465 15:45:18 
event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:14.843 15:45:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:14.843 00:06:14.843 real 0m2.619s 00:06:14.843 user 0m0.022s 00:06:14.843 sys 0m0.006s 00:06:14.843 15:45:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:14.843 15:45:19 event.event_scheduler.scheduler_create_thread -- common/autotest_common.sh@10 -- # set +x 00:06:14.843 ************************************ 00:06:14.843 END TEST scheduler_create_thread 00:06:14.843 ************************************ 00:06:14.843 15:45:19 event.event_scheduler -- scheduler/scheduler.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:06:14.843 15:45:19 event.event_scheduler -- scheduler/scheduler.sh@46 -- # killprocess 2593972 00:06:14.843 15:45:19 event.event_scheduler -- common/autotest_common.sh@949 -- # '[' -z 2593972 ']' 00:06:14.843 15:45:19 event.event_scheduler -- common/autotest_common.sh@953 -- # kill -0 2593972 00:06:14.843 15:45:20 event.event_scheduler -- common/autotest_common.sh@954 -- # uname 00:06:14.843 15:45:20 event.event_scheduler -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:06:14.843 15:45:20 event.event_scheduler -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2593972 00:06:14.843 15:45:20 event.event_scheduler -- common/autotest_common.sh@955 -- # process_name=reactor_2 00:06:14.843 15:45:20 event.event_scheduler -- common/autotest_common.sh@959 -- # '[' reactor_2 = sudo ']' 00:06:14.843 15:45:20 event.event_scheduler -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2593972' 00:06:14.843 killing process with pid 2593972 00:06:14.843 15:45:20 event.event_scheduler -- common/autotest_common.sh@968 -- # kill 2593972 00:06:14.843 15:45:20 event.event_scheduler -- common/autotest_common.sh@973 -- # wait 2593972 00:06:15.103 [2024-06-10 
15:45:20.446825] scheduler.c: 360:test_shutdown: *NOTICE*: Scheduler test application stopped. 00:06:15.103 POWER: Power management governor of lcore 0 has been set to 'powersave' successfully 00:06:15.103 POWER: Power management of lcore 0 has exited from 'performance' mode and been set back to the original 00:06:15.103 POWER: Power management governor of lcore 1 has been set to 'powersave' successfully 00:06:15.103 POWER: Power management of lcore 1 has exited from 'performance' mode and been set back to the original 00:06:15.103 POWER: Power management governor of lcore 2 has been set to 'powersave' successfully 00:06:15.103 POWER: Power management of lcore 2 has exited from 'performance' mode and been set back to the original 00:06:15.103 POWER: Power management governor of lcore 3 has been set to 'powersave' successfully 00:06:15.103 POWER: Power management of lcore 3 has exited from 'performance' mode and been set back to the original 00:06:15.363 00:06:15.363 real 0m3.809s 00:06:15.363 user 0m5.778s 00:06:15.363 sys 0m0.381s 00:06:15.363 15:45:20 event.event_scheduler -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:15.363 15:45:20 event.event_scheduler -- common/autotest_common.sh@10 -- # set +x 00:06:15.363 ************************************ 00:06:15.363 END TEST event_scheduler 00:06:15.363 ************************************ 00:06:15.363 15:45:20 event -- event/event.sh@51 -- # modprobe -n nbd 00:06:15.363 15:45:20 event -- event/event.sh@52 -- # run_test app_repeat app_repeat_test 00:06:15.363 15:45:20 event -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:15.363 15:45:20 event -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:15.363 15:45:20 event -- common/autotest_common.sh@10 -- # set +x 00:06:15.363 ************************************ 00:06:15.363 START TEST app_repeat 00:06:15.363 ************************************ 00:06:15.363 15:45:20 event.app_repeat -- common/autotest_common.sh@1124 -- # app_repeat_test 
00:06:15.363 15:45:20 event.app_repeat -- event/event.sh@12 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:15.363 15:45:20 event.app_repeat -- event/event.sh@13 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:15.363 15:45:20 event.app_repeat -- event/event.sh@13 -- # local nbd_list 00:06:15.363 15:45:20 event.app_repeat -- event/event.sh@14 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:15.363 15:45:20 event.app_repeat -- event/event.sh@14 -- # local bdev_list 00:06:15.363 15:45:20 event.app_repeat -- event/event.sh@15 -- # local repeat_times=4 00:06:15.363 15:45:20 event.app_repeat -- event/event.sh@17 -- # modprobe nbd 00:06:15.363 15:45:20 event.app_repeat -- event/event.sh@19 -- # repeat_pid=2594653 00:06:15.363 15:45:20 event.app_repeat -- event/event.sh@20 -- # trap 'killprocess $repeat_pid; exit 1' SIGINT SIGTERM EXIT 00:06:15.363 15:45:20 event.app_repeat -- event/event.sh@18 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/app_repeat/app_repeat -r /var/tmp/spdk-nbd.sock -m 0x3 -t 4 00:06:15.363 15:45:20 event.app_repeat -- event/event.sh@21 -- # echo 'Process app_repeat pid: 2594653' 00:06:15.363 Process app_repeat pid: 2594653 00:06:15.363 15:45:20 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:15.363 15:45:20 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 0' 00:06:15.363 spdk_app_start Round 0 00:06:15.363 15:45:20 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2594653 /var/tmp/spdk-nbd.sock 00:06:15.363 15:45:20 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 2594653 ']' 00:06:15.363 15:45:20 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:15.363 15:45:20 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:15.363 15:45:20 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:06:15.363 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:15.363 15:45:20 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:15.363 15:45:20 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:15.363 [2024-06-10 15:45:20.756444] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:06:15.363 [2024-06-10 15:45:20.756497] [ DPDK EAL parameters: app_repeat --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2594653 ] 00:06:15.363 [2024-06-10 15:45:20.853855] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:15.622 [2024-06-10 15:45:20.951063] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:06:15.622 [2024-06-10 15:45:20.951070] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:15.622 15:45:21 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:06:15.622 15:45:21 event.app_repeat -- common/autotest_common.sh@863 -- # return 0 00:06:15.622 15:45:21 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:15.881 Malloc0 00:06:15.881 15:45:21 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:16.141 Malloc1 00:06:16.141 15:45:21 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:16.141 15:45:21 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:16.141 15:45:21 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:16.141 15:45:21 event.app_repeat -- bdev/nbd_common.sh@91 -- # 
local bdev_list 00:06:16.141 15:45:21 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:16.141 15:45:21 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:16.141 15:45:21 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:16.141 15:45:21 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:16.141 15:45:21 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:16.141 15:45:21 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:16.141 15:45:21 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:16.141 15:45:21 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:16.141 15:45:21 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:16.141 15:45:21 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:16.141 15:45:21 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:16.141 15:45:21 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:16.401 /dev/nbd0 00:06:16.401 15:45:21 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:16.401 15:45:21 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:16.401 15:45:21 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:06:16.401 15:45:21 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:06:16.401 15:45:21 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:16.401 15:45:21 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:16.401 15:45:21 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:06:16.401 15:45:21 event.app_repeat -- 
common/autotest_common.sh@872 -- # break 00:06:16.401 15:45:21 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:06:16.401 15:45:21 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:06:16.401 15:45:21 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:16.401 1+0 records in 00:06:16.401 1+0 records out 00:06:16.401 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242652 s, 16.9 MB/s 00:06:16.401 15:45:21 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:16.401 15:45:21 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:06:16.401 15:45:21 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:16.401 15:45:21 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:06:16.401 15:45:21 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:06:16.401 15:45:21 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:16.401 15:45:21 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:16.401 15:45:21 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:16.661 /dev/nbd1 00:06:16.661 15:45:22 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:16.661 15:45:22 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:16.661 15:45:22 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:06:16.661 15:45:22 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:06:16.661 15:45:22 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:16.661 15:45:22 event.app_repeat -- 
common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:16.661 15:45:22 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:06:16.661 15:45:22 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:06:16.661 15:45:22 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:06:16.661 15:45:22 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:06:16.661 15:45:22 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:16.661 1+0 records in 00:06:16.661 1+0 records out 00:06:16.661 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000288928 s, 14.2 MB/s 00:06:16.661 15:45:22 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:16.661 15:45:22 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:06:16.661 15:45:22 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:16.661 15:45:22 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:06:16.661 15:45:22 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:06:16.661 15:45:22 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:16.661 15:45:22 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:16.661 15:45:22 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:16.661 15:45:22 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:16.661 15:45:22 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:16.921 { 00:06:16.921 "nbd_device": 
"/dev/nbd0", 00:06:16.921 "bdev_name": "Malloc0" 00:06:16.921 }, 00:06:16.921 { 00:06:16.921 "nbd_device": "/dev/nbd1", 00:06:16.921 "bdev_name": "Malloc1" 00:06:16.921 } 00:06:16.921 ]' 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:16.921 { 00:06:16.921 "nbd_device": "/dev/nbd0", 00:06:16.921 "bdev_name": "Malloc0" 00:06:16.921 }, 00:06:16.921 { 00:06:16.921 "nbd_device": "/dev/nbd1", 00:06:16.921 "bdev_name": "Malloc1" 00:06:16.921 } 00:06:16.921 ]' 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:16.921 /dev/nbd1' 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:16.921 /dev/nbd1' 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:16.921 256+0 records in 00:06:16.921 256+0 records out 00:06:16.921 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00996473 s, 105 MB/s 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:16.921 15:45:22 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:17.180 256+0 records in 00:06:17.180 256+0 records out 00:06:17.180 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0206151 s, 50.9 MB/s 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:17.180 256+0 records in 00:06:17.180 256+0 records out 00:06:17.180 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0220171 s, 47.6 MB/s 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:17.180 15:45:22 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:17.439 15:45:22 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:17.439 15:45:22 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:17.439 15:45:22 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:17.439 15:45:22 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:17.439 15:45:22 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:17.439 15:45:22 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:17.439 15:45:22 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:17.439 15:45:22 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:17.439 15:45:22 event.app_repeat -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:17.439 15:45:22 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:17.698 15:45:23 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:17.698 15:45:23 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:17.698 15:45:23 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:17.698 15:45:23 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:17.698 15:45:23 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:17.698 15:45:23 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:17.698 15:45:23 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:17.698 15:45:23 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:17.698 15:45:23 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:17.698 15:45:23 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:17.698 15:45:23 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:17.957 15:45:23 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:17.957 15:45:23 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:17.957 15:45:23 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:17.957 15:45:23 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:17.957 15:45:23 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:17.957 15:45:23 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:17.957 15:45:23 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:17.957 15:45:23 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:17.957 15:45:23 
event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:17.957 15:45:23 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:17.957 15:45:23 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:17.957 15:45:23 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:17.957 15:45:23 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:18.216 15:45:23 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:18.502 [2024-06-10 15:45:23.837216] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:18.502 [2024-06-10 15:45:23.923007] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:06:18.502 [2024-06-10 15:45:23.923013] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:18.502 [2024-06-10 15:45:23.968335] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:18.502 [2024-06-10 15:45:23.968380] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:21.819 15:45:26 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:21.819 15:45:26 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 1' 00:06:21.819 spdk_app_start Round 1 00:06:21.819 15:45:26 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2594653 /var/tmp/spdk-nbd.sock 00:06:21.819 15:45:26 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 2594653 ']' 00:06:21.819 15:45:26 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:21.819 15:45:26 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:21.819 15:45:26 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 
00:06:21.819 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:06:21.819 15:45:26 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:21.819 15:45:26 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:21.819 15:45:26 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:06:21.819 15:45:26 event.app_repeat -- common/autotest_common.sh@863 -- # return 0 00:06:21.819 15:45:26 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:21.819 Malloc0 00:06:21.819 15:45:26 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:21.819 Malloc1 00:06:21.819 15:45:27 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@11 -- # local nbd_list 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:21.819 /dev/nbd0 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:21.819 15:45:27 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:06:21.819 15:45:27 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:06:21.819 15:45:27 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:21.819 15:45:27 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:21.819 15:45:27 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:06:21.819 15:45:27 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:06:21.819 15:45:27 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:06:21.819 15:45:27 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:06:21.819 15:45:27 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:21.819 1+0 records in 00:06:21.819 1+0 records out 00:06:21.819 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000193339 s, 21.2 MB/s 00:06:21.819 15:45:27 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:21.819 15:45:27 event.app_repeat 
-- common/autotest_common.sh@885 -- # size=4096 00:06:21.819 15:45:27 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:21.819 15:45:27 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:06:21.819 15:45:27 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:21.819 15:45:27 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:22.078 /dev/nbd1 00:06:22.078 15:45:27 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:22.078 15:45:27 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:22.078 15:45:27 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:06:22.078 15:45:27 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:06:22.078 15:45:27 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:22.078 15:45:27 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:22.078 15:45:27 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:06:22.337 15:45:27 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:06:22.337 15:45:27 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:06:22.337 15:45:27 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:06:22.337 15:45:27 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:22.337 1+0 records in 00:06:22.337 1+0 records out 00:06:22.337 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000246066 s, 16.6 MB/s 00:06:22.337 
15:45:27 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:22.338 15:45:27 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:06:22.338 15:45:27 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:22.338 15:45:27 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:06:22.338 15:45:27 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:06:22.338 15:45:27 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:22.338 15:45:27 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:22.338 15:45:27 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:22.338 15:45:27 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:22.338 15:45:27 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:22.596 15:45:27 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:22.596 { 00:06:22.596 "nbd_device": "/dev/nbd0", 00:06:22.596 "bdev_name": "Malloc0" 00:06:22.597 }, 00:06:22.597 { 00:06:22.597 "nbd_device": "/dev/nbd1", 00:06:22.597 "bdev_name": "Malloc1" 00:06:22.597 } 00:06:22.597 ]' 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:22.597 { 00:06:22.597 "nbd_device": "/dev/nbd0", 00:06:22.597 "bdev_name": "Malloc0" 00:06:22.597 }, 00:06:22.597 { 00:06:22.597 "nbd_device": "/dev/nbd1", 00:06:22.597 "bdev_name": "Malloc1" 00:06:22.597 } 00:06:22.597 ]' 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:22.597 /dev/nbd1' 00:06:22.597 15:45:27 event.app_repeat -- 
bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:22.597 /dev/nbd1' 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:22.597 256+0 records in 00:06:22.597 256+0 records out 00:06:22.597 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00972745 s, 108 MB/s 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:22.597 256+0 records in 00:06:22.597 256+0 records out 00:06:22.597 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0205904 s, 50.9 MB/s 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:22.597 15:45:27 event.app_repeat -- 
bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:22.597 256+0 records in 00:06:22.597 256+0 records out 00:06:22.597 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0216579 s, 48.4 MB/s 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@50 -- # 
nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:22.597 15:45:27 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:22.856 15:45:28 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:22.856 15:45:28 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:22.856 15:45:28 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:22.856 15:45:28 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:22.856 15:45:28 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:22.856 15:45:28 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:22.856 15:45:28 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:22.856 15:45:28 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:22.856 15:45:28 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:22.856 15:45:28 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:23.114 15:45:28 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:23.114 15:45:28 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:23.114 15:45:28 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:23.114 15:45:28 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:23.114 15:45:28 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:23.114 15:45:28 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:23.114 15:45:28 
event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:23.114 15:45:28 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:23.114 15:45:28 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:23.114 15:45:28 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:23.114 15:45:28 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:23.373 15:45:28 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:23.373 15:45:28 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:23.373 15:45:28 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:23.373 15:45:28 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:23.373 15:45:28 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:23.373 15:45:28 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:23.373 15:45:28 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:23.373 15:45:28 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:23.373 15:45:28 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:23.373 15:45:28 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:23.373 15:45:28 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:23.373 15:45:28 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:23.373 15:45:28 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:23.634 15:45:29 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:23.894 [2024-06-10 15:45:29.353154] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:24.153 [2024-06-10 15:45:29.439874] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:06:24.153 [2024-06-10 
15:45:29.439880] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:24.153 [2024-06-10 15:45:29.486127] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:24.153 [2024-06-10 15:45:29.486178] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:26.688 15:45:32 event.app_repeat -- event/event.sh@23 -- # for i in {0..2} 00:06:26.688 15:45:32 event.app_repeat -- event/event.sh@24 -- # echo 'spdk_app_start Round 2' 00:06:26.688 spdk_app_start Round 2 00:06:26.688 15:45:32 event.app_repeat -- event/event.sh@25 -- # waitforlisten 2594653 /var/tmp/spdk-nbd.sock 00:06:26.688 15:45:32 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 2594653 ']' 00:06:26.688 15:45:32 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:26.688 15:45:32 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:26.688 15:45:32 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:26.688 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:06:26.688 15:45:32 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:26.688 15:45:32 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:26.946 15:45:32 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:06:26.946 15:45:32 event.app_repeat -- common/autotest_common.sh@863 -- # return 0 00:06:26.946 15:45:32 event.app_repeat -- event/event.sh@27 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:27.205 Malloc0 00:06:27.205 15:45:32 event.app_repeat -- event/event.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create 64 4096 00:06:27.464 Malloc1 00:06:27.464 15:45:32 event.app_repeat -- event/event.sh@30 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@91 -- # local bdev_list 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@92 -- # local nbd_list 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1' '/dev/nbd0 /dev/nbd1' 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1') 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@11 -- 
# local nbd_list 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@12 -- # local i 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:06:27.464 /dev/nbd0 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:06:27.464 15:45:32 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:06:27.464 15:45:32 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:06:27.464 15:45:32 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:27.464 15:45:32 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:27.464 15:45:32 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:06:27.464 15:45:32 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:06:27.464 15:45:32 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:06:27.464 15:45:32 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:06:27.464 15:45:32 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:27.464 1+0 records in 00:06:27.464 1+0 records out 00:06:27.464 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000234401 s, 17.5 MB/s 00:06:27.464 15:45:32 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:27.464 15:45:32 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:06:27.464 15:45:32 event.app_repeat -- 
common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:27.464 15:45:32 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:06:27.464 15:45:32 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:27.464 15:45:32 event.app_repeat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1 /dev/nbd1 00:06:27.724 /dev/nbd1 00:06:27.724 15:45:33 event.app_repeat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:06:27.724 15:45:33 event.app_repeat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:06:27.724 15:45:33 event.app_repeat -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:06:27.724 15:45:33 event.app_repeat -- common/autotest_common.sh@868 -- # local i 00:06:27.724 15:45:33 event.app_repeat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:06:27.724 15:45:33 event.app_repeat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:06:27.724 15:45:33 event.app_repeat -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:06:27.724 15:45:33 event.app_repeat -- common/autotest_common.sh@872 -- # break 00:06:27.724 15:45:33 event.app_repeat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:06:27.724 15:45:33 event.app_repeat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:06:27.724 15:45:33 event.app_repeat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest bs=4096 count=1 iflag=direct 00:06:27.724 1+0 records in 00:06:27.724 1+0 records out 00:06:27.724 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000233327 s, 17.6 MB/s 00:06:27.724 15:45:33 event.app_repeat -- common/autotest_common.sh@885 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:27.724 15:45:33 event.app_repeat -- common/autotest_common.sh@885 -- # size=4096 00:06:27.724 15:45:33 event.app_repeat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdtest 00:06:27.724 15:45:33 event.app_repeat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:06:27.724 15:45:33 event.app_repeat -- common/autotest_common.sh@888 -- # return 0 00:06:27.724 15:45:33 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:06:27.724 15:45:33 event.app_repeat -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:06:27.724 15:45:33 event.app_repeat -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:27.724 15:45:33 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:27.724 15:45:33 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:27.983 15:45:33 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:06:27.983 { 00:06:27.983 "nbd_device": "/dev/nbd0", 00:06:27.983 "bdev_name": "Malloc0" 00:06:27.983 }, 00:06:27.983 { 00:06:27.983 "nbd_device": "/dev/nbd1", 00:06:27.983 "bdev_name": "Malloc1" 00:06:27.983 } 00:06:27.983 ]' 00:06:27.983 15:45:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[ 00:06:27.983 { 00:06:27.983 "nbd_device": "/dev/nbd0", 00:06:27.983 "bdev_name": "Malloc0" 00:06:27.983 }, 00:06:27.983 { 00:06:27.983 "nbd_device": "/dev/nbd1", 00:06:27.983 "bdev_name": "Malloc1" 00:06:27.983 } 00:06:27.983 ]' 00:06:27.983 15:45:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:06:28.242 /dev/nbd1' 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:28.242 15:45:33 event.app_repeat -- 
bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:06:28.242 /dev/nbd1' 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=2 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 2 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@95 -- # count=2 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=write 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest bs=4096 count=256 00:06:28.242 256+0 records in 00:06:28.242 256+0 records out 00:06:28.242 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00988188 s, 106 MB/s 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:06:28.242 256+0 records in 00:06:28.242 256+0 records out 00:06:28.242 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0205653 s, 51.0 MB/s 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@78 -- # dd 
if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:06:28.242 256+0 records in 00:06:28.242 256+0 records out 00:06:28.242 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0218095 s, 48.1 MB/s 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@70 -- # local nbd_list 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@71 -- # local operation=verify 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd0 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest /dev/nbd1 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/event/nbdrandtest 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 
00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@51 -- # local i 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:28.242 15:45:33 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:06:28.501 15:45:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:06:28.501 15:45:33 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:06:28.501 15:45:33 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:06:28.501 15:45:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:28.501 15:45:33 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:28.501 15:45:33 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:06:28.501 15:45:33 event.app_repeat -- bdev/nbd_common.sh@41 -- # break 00:06:28.501 15:45:33 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:28.501 15:45:33 event.app_repeat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:06:28.501 15:45:33 event.app_repeat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:06:28.759 15:45:34 event.app_repeat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:06:28.759 15:45:34 event.app_repeat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:06:28.759 15:45:34 event.app_repeat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:06:28.759 15:45:34 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:06:28.759 15:45:34 event.app_repeat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:06:28.759 15:45:34 event.app_repeat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:06:28.759 15:45:34 event.app_repeat -- bdev/nbd_common.sh@41 
-- # break 00:06:28.759 15:45:34 event.app_repeat -- bdev/nbd_common.sh@45 -- # return 0 00:06:28.759 15:45:34 event.app_repeat -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:06:28.759 15:45:34 event.app_repeat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:06:28.759 15:45:34 event.app_repeat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:06:29.017 15:45:34 event.app_repeat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:06:29.017 15:45:34 event.app_repeat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:06:29.017 15:45:34 event.app_repeat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:06:29.017 15:45:34 event.app_repeat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:06:29.018 15:45:34 event.app_repeat -- bdev/nbd_common.sh@65 -- # echo '' 00:06:29.018 15:45:34 event.app_repeat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:06:29.018 15:45:34 event.app_repeat -- bdev/nbd_common.sh@65 -- # true 00:06:29.018 15:45:34 event.app_repeat -- bdev/nbd_common.sh@65 -- # count=0 00:06:29.018 15:45:34 event.app_repeat -- bdev/nbd_common.sh@66 -- # echo 0 00:06:29.018 15:45:34 event.app_repeat -- bdev/nbd_common.sh@104 -- # count=0 00:06:29.018 15:45:34 event.app_repeat -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:06:29.018 15:45:34 event.app_repeat -- bdev/nbd_common.sh@109 -- # return 0 00:06:29.018 15:45:34 event.app_repeat -- event/event.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock spdk_kill_instance SIGTERM 00:06:29.277 15:45:34 event.app_repeat -- event/event.sh@35 -- # sleep 3 00:06:29.536 [2024-06-10 15:45:34.976198] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:06:29.795 [2024-06-10 15:45:35.062638] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:06:29.795 [2024-06-10 15:45:35.062644] reactor.c: 929:reactor_run: 
*NOTICE*: Reactor started on core 0 00:06:29.795 [2024-06-10 15:45:35.107299] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_register' already registered. 00:06:29.795 [2024-06-10 15:45:35.107344] notify.c: 45:spdk_notify_type_register: *NOTICE*: Notification type 'bdev_unregister' already registered. 00:06:32.372 15:45:37 event.app_repeat -- event/event.sh@38 -- # waitforlisten 2594653 /var/tmp/spdk-nbd.sock 00:06:32.372 15:45:37 event.app_repeat -- common/autotest_common.sh@830 -- # '[' -z 2594653 ']' 00:06:32.372 15:45:37 event.app_repeat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:06:32.372 15:45:37 event.app_repeat -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:32.372 15:45:37 event.app_repeat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:06:32.372 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 
00:06:32.372 15:45:37 event.app_repeat -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:32.372 15:45:37 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:32.631 15:45:38 event.app_repeat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:06:32.631 15:45:38 event.app_repeat -- common/autotest_common.sh@863 -- # return 0 00:06:32.631 15:45:38 event.app_repeat -- event/event.sh@39 -- # killprocess 2594653 00:06:32.631 15:45:38 event.app_repeat -- common/autotest_common.sh@949 -- # '[' -z 2594653 ']' 00:06:32.631 15:45:38 event.app_repeat -- common/autotest_common.sh@953 -- # kill -0 2594653 00:06:32.631 15:45:38 event.app_repeat -- common/autotest_common.sh@954 -- # uname 00:06:32.631 15:45:38 event.app_repeat -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:06:32.631 15:45:38 event.app_repeat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2594653 00:06:32.631 15:45:38 event.app_repeat -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:06:32.631 15:45:38 event.app_repeat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:06:32.631 15:45:38 event.app_repeat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2594653' 00:06:32.631 killing process with pid 2594653 00:06:32.631 15:45:38 event.app_repeat -- common/autotest_common.sh@968 -- # kill 2594653 00:06:32.631 15:45:38 event.app_repeat -- common/autotest_common.sh@973 -- # wait 2594653 00:06:32.890 spdk_app_start is called in Round 0. 00:06:32.890 Shutdown signal received, stop current app iteration 00:06:32.891 Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 reinitialization... 00:06:32.891 spdk_app_start is called in Round 1. 00:06:32.891 Shutdown signal received, stop current app iteration 00:06:32.891 Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 reinitialization... 00:06:32.891 spdk_app_start is called in Round 2. 
00:06:32.891 Shutdown signal received, stop current app iteration 00:06:32.891 Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 reinitialization... 00:06:32.891 spdk_app_start is called in Round 3. 00:06:32.891 Shutdown signal received, stop current app iteration 00:06:32.891 15:45:38 event.app_repeat -- event/event.sh@40 -- # trap - SIGINT SIGTERM EXIT 00:06:32.891 15:45:38 event.app_repeat -- event/event.sh@42 -- # return 0 00:06:32.891 00:06:32.891 real 0m17.519s 00:06:32.891 user 0m38.594s 00:06:32.891 sys 0m2.828s 00:06:32.891 15:45:38 event.app_repeat -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:32.891 15:45:38 event.app_repeat -- common/autotest_common.sh@10 -- # set +x 00:06:32.891 ************************************ 00:06:32.891 END TEST app_repeat 00:06:32.891 ************************************ 00:06:32.891 15:45:38 event -- event/event.sh@54 -- # (( SPDK_TEST_CRYPTO == 0 )) 00:06:32.891 00:06:32.891 real 0m25.641s 00:06:32.891 user 0m51.091s 00:06:32.891 sys 0m3.815s 00:06:32.891 15:45:38 event -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:32.891 15:45:38 event -- common/autotest_common.sh@10 -- # set +x 00:06:32.891 ************************************ 00:06:32.891 END TEST event 00:06:32.891 ************************************ 00:06:32.891 15:45:38 -- spdk/autotest.sh@182 -- # run_test thread /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:06:32.891 15:45:38 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:32.891 15:45:38 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:32.891 15:45:38 -- common/autotest_common.sh@10 -- # set +x 00:06:32.891 ************************************ 00:06:32.891 START TEST thread 00:06:32.891 ************************************ 00:06:32.891 15:45:38 thread -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/thread.sh 00:06:33.150 * Looking for test storage... 
00:06:33.150 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread 00:06:33.150 15:45:38 thread -- thread/thread.sh@11 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:33.150 15:45:38 thread -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']' 00:06:33.150 15:45:38 thread -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:33.150 15:45:38 thread -- common/autotest_common.sh@10 -- # set +x 00:06:33.150 ************************************ 00:06:33.150 START TEST thread_poller_perf 00:06:33.150 ************************************ 00:06:33.150 15:45:38 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 1 -t 1 00:06:33.150 [2024-06-10 15:45:38.489435] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:06:33.150 [2024-06-10 15:45:38.489490] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2597738 ] 00:06:33.150 [2024-06-10 15:45:38.589487] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:33.410 [2024-06-10 15:45:38.682708] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:33.410 Running 1000 pollers for 1 seconds with 1 microseconds period. 
00:06:34.348 ====================================== 00:06:34.348 busy:2111279392 (cyc) 00:06:34.348 total_run_count: 244000 00:06:34.348 tsc_hz: 2100000000 (cyc) 00:06:34.348 ====================================== 00:06:34.348 poller_cost: 8652 (cyc), 4120 (nsec) 00:06:34.348 00:06:34.348 real 0m1.311s 00:06:34.348 user 0m1.204s 00:06:34.348 sys 0m0.102s 00:06:34.348 15:45:39 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:34.348 15:45:39 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:06:34.348 ************************************ 00:06:34.348 END TEST thread_poller_perf 00:06:34.348 ************************************ 00:06:34.348 15:45:39 thread -- thread/thread.sh@12 -- # run_test thread_poller_perf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:34.348 15:45:39 thread -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']' 00:06:34.348 15:45:39 thread -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:34.348 15:45:39 thread -- common/autotest_common.sh@10 -- # set +x 00:06:34.348 ************************************ 00:06:34.348 START TEST thread_poller_perf 00:06:34.348 ************************************ 00:06:34.348 15:45:39 thread.thread_poller_perf -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/thread/poller_perf/poller_perf -b 1000 -l 0 -t 1 00:06:34.608 [2024-06-10 15:45:39.865738] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:06:34.608 [2024-06-10 15:45:39.865790] [ DPDK EAL parameters: poller_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2597989 ] 00:06:34.608 [2024-06-10 15:45:39.965489] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:34.608 [2024-06-10 15:45:40.063887] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:34.608 Running 1000 pollers for 1 seconds with 0 microseconds period. 00:06:35.987 ====================================== 00:06:35.987 busy:2102563138 (cyc) 00:06:35.987 total_run_count: 3193000 00:06:35.987 tsc_hz: 2100000000 (cyc) 00:06:35.987 ====================================== 00:06:35.987 poller_cost: 658 (cyc), 313 (nsec) 00:06:35.987 00:06:35.987 real 0m1.308s 00:06:35.987 user 0m1.193s 00:06:35.987 sys 0m0.109s 00:06:35.987 15:45:41 thread.thread_poller_perf -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:35.987 15:45:41 thread.thread_poller_perf -- common/autotest_common.sh@10 -- # set +x 00:06:35.987 ************************************ 00:06:35.987 END TEST thread_poller_perf 00:06:35.987 ************************************ 00:06:35.987 15:45:41 thread -- thread/thread.sh@17 -- # [[ y != \y ]] 00:06:35.987 00:06:35.987 real 0m2.841s 00:06:35.987 user 0m2.478s 00:06:35.987 sys 0m0.371s 00:06:35.987 15:45:41 thread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:35.987 15:45:41 thread -- common/autotest_common.sh@10 -- # set +x 00:06:35.987 ************************************ 00:06:35.987 END TEST thread 00:06:35.987 ************************************ 00:06:35.987 15:45:41 -- spdk/autotest.sh@183 -- # run_test accel /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:06:35.987 15:45:41 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:06:35.987 15:45:41 -- common/autotest_common.sh@1106 -- # xtrace_disable 
00:06:35.987 15:45:41 -- common/autotest_common.sh@10 -- # set +x 00:06:35.987 ************************************ 00:06:35.987 START TEST accel 00:06:35.987 ************************************ 00:06:35.987 15:45:41 accel -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel.sh 00:06:35.987 * Looking for test storage... 00:06:35.987 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:06:35.987 15:45:41 accel -- accel/accel.sh@81 -- # declare -A expected_opcs 00:06:35.987 15:45:41 accel -- accel/accel.sh@82 -- # get_expected_opcs 00:06:35.987 15:45:41 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:06:35.987 15:45:41 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=2598348 00:06:35.987 15:45:41 accel -- accel/accel.sh@63 -- # waitforlisten 2598348 00:06:35.987 15:45:41 accel -- common/autotest_common.sh@830 -- # '[' -z 2598348 ']' 00:06:35.987 15:45:41 accel -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:06:35.987 15:45:41 accel -- accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:06:35.987 15:45:41 accel -- common/autotest_common.sh@835 -- # local max_retries=100 00:06:35.987 15:45:41 accel -- accel/accel.sh@61 -- # build_accel_config 00:06:35.988 15:45:41 accel -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:06:35.988 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:06:35.988 15:45:41 accel -- common/autotest_common.sh@839 -- # xtrace_disable 00:06:35.988 15:45:41 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:35.988 15:45:41 accel -- common/autotest_common.sh@10 -- # set +x 00:06:35.988 15:45:41 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:35.988 15:45:41 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:35.988 15:45:41 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:35.988 15:45:41 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:35.988 15:45:41 accel -- accel/accel.sh@40 -- # local IFS=, 00:06:35.988 15:45:41 accel -- accel/accel.sh@41 -- # jq -r . 00:06:35.988 [2024-06-10 15:45:41.409644] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:06:35.988 [2024-06-10 15:45:41.409707] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2598348 ] 00:06:36.248 [2024-06-10 15:45:41.508326] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:36.248 [2024-06-10 15:45:41.605846] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.184 15:45:42 accel -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:06:37.184 15:45:42 accel -- common/autotest_common.sh@863 -- # return 0 00:06:37.184 15:45:42 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:06:37.184 15:45:42 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:06:37.184 15:45:42 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:06:37.184 15:45:42 accel -- accel/accel.sh@68 -- # [[ -n '' ]] 00:06:37.184 15:45:42 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". | to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:06:37.184 15:45:42 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:06:37.184 15:45:42 accel -- accel/accel.sh@70 -- # jq -r '. 
| to_entries | map("\(.key)=\(.value)") | .[]' 00:06:37.184 15:45:42 accel -- common/autotest_common.sh@560 -- # xtrace_disable 00:06:37.184 15:45:42 accel -- common/autotest_common.sh@10 -- # set +x 00:06:37.184 15:45:42 accel -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:06:37.184 15:45:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # IFS== 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:37.184 15:45:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:37.184 15:45:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # IFS== 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:37.184 15:45:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:37.184 15:45:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # IFS== 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:37.184 15:45:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:37.184 15:45:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # IFS== 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:37.184 15:45:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:37.184 15:45:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # IFS== 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:37.184 15:45:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:37.184 15:45:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # IFS== 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # 
read -r opc module 00:06:37.184 15:45:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:37.184 15:45:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # IFS== 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:37.184 15:45:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:37.184 15:45:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # IFS== 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:37.184 15:45:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:37.184 15:45:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # IFS== 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:37.184 15:45:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:37.184 15:45:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # IFS== 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:37.184 15:45:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:37.184 15:45:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # IFS== 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:37.184 15:45:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:37.184 15:45:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # IFS== 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:37.184 15:45:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:37.184 15:45:42 accel -- accel/accel.sh@71 -- # for opc_opt in 
"${exp_opcs[@]}" 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # IFS== 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:37.184 15:45:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:37.184 15:45:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # IFS== 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:37.184 15:45:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:37.184 15:45:42 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # IFS== 00:06:37.184 15:45:42 accel -- accel/accel.sh@72 -- # read -r opc module 00:06:37.184 15:45:42 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:06:37.184 15:45:42 accel -- accel/accel.sh@75 -- # killprocess 2598348 00:06:37.184 15:45:42 accel -- common/autotest_common.sh@949 -- # '[' -z 2598348 ']' 00:06:37.184 15:45:42 accel -- common/autotest_common.sh@953 -- # kill -0 2598348 00:06:37.184 15:45:42 accel -- common/autotest_common.sh@954 -- # uname 00:06:37.184 15:45:42 accel -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:06:37.184 15:45:42 accel -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2598348 00:06:37.185 15:45:42 accel -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:06:37.185 15:45:42 accel -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:06:37.185 15:45:42 accel -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2598348' 00:06:37.185 killing process with pid 2598348 00:06:37.185 15:45:42 accel -- common/autotest_common.sh@968 -- # kill 2598348 00:06:37.185 15:45:42 accel -- common/autotest_common.sh@973 -- # wait 2598348 00:06:37.445 15:45:42 accel -- accel/accel.sh@76 -- # trap - ERR 00:06:37.445 15:45:42 accel -- accel/accel.sh@89 -- # run_test accel_help accel_perf -h 00:06:37.445 
15:45:42 accel -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:06:37.445 15:45:42 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:37.445 15:45:42 accel -- common/autotest_common.sh@10 -- # set +x 00:06:37.445 15:45:42 accel.accel_help -- common/autotest_common.sh@1124 -- # accel_perf -h 00:06:37.445 15:45:42 accel.accel_help -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -h 00:06:37.445 15:45:42 accel.accel_help -- accel/accel.sh@12 -- # build_accel_config 00:06:37.445 15:45:42 accel.accel_help -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:37.445 15:45:42 accel.accel_help -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:37.445 15:45:42 accel.accel_help -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.445 15:45:42 accel.accel_help -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.445 15:45:42 accel.accel_help -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:37.445 15:45:42 accel.accel_help -- accel/accel.sh@40 -- # local IFS=, 00:06:37.445 15:45:42 accel.accel_help -- accel/accel.sh@41 -- # jq -r . 
00:06:37.445 15:45:42 accel.accel_help -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:37.445 15:45:42 accel.accel_help -- common/autotest_common.sh@10 -- # set +x 00:06:37.445 15:45:42 accel -- accel/accel.sh@91 -- # run_test accel_missing_filename NOT accel_perf -t 1 -w compress 00:06:37.445 15:45:42 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:06:37.445 15:45:42 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:37.445 15:45:42 accel -- common/autotest_common.sh@10 -- # set +x 00:06:37.445 ************************************ 00:06:37.445 START TEST accel_missing_filename 00:06:37.445 ************************************ 00:06:37.445 15:45:42 accel.accel_missing_filename -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w compress 00:06:37.445 15:45:42 accel.accel_missing_filename -- common/autotest_common.sh@649 -- # local es=0 00:06:37.445 15:45:42 accel.accel_missing_filename -- common/autotest_common.sh@651 -- # valid_exec_arg accel_perf -t 1 -w compress 00:06:37.445 15:45:42 accel.accel_missing_filename -- common/autotest_common.sh@637 -- # local arg=accel_perf 00:06:37.445 15:45:42 accel.accel_missing_filename -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:06:37.445 15:45:42 accel.accel_missing_filename -- common/autotest_common.sh@641 -- # type -t accel_perf 00:06:37.445 15:45:42 accel.accel_missing_filename -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:06:37.445 15:45:42 accel.accel_missing_filename -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w compress 00:06:37.445 15:45:42 accel.accel_missing_filename -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress 00:06:37.445 15:45:42 accel.accel_missing_filename -- accel/accel.sh@12 -- # build_accel_config 00:06:37.445 15:45:42 accel.accel_missing_filename -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:37.445 15:45:42 
accel.accel_missing_filename -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:37.445 15:45:42 accel.accel_missing_filename -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.445 15:45:42 accel.accel_missing_filename -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.445 15:45:42 accel.accel_missing_filename -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:37.445 15:45:42 accel.accel_missing_filename -- accel/accel.sh@40 -- # local IFS=, 00:06:37.445 15:45:42 accel.accel_missing_filename -- accel/accel.sh@41 -- # jq -r . 00:06:37.704 [2024-06-10 15:45:42.969294] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:06:37.704 [2024-06-10 15:45:42.969347] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2598684 ] 00:06:37.704 [2024-06-10 15:45:43.068327] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:37.704 [2024-06-10 15:45:43.158585] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:37.963 [2024-06-10 15:45:43.219317] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:37.963 [2024-06-10 15:45:43.282821] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:06:37.963 A filename is required. 
00:06:37.963 15:45:43 accel.accel_missing_filename -- common/autotest_common.sh@652 -- # es=234 00:06:37.963 15:45:43 accel.accel_missing_filename -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:06:37.963 15:45:43 accel.accel_missing_filename -- common/autotest_common.sh@661 -- # es=106 00:06:37.963 15:45:43 accel.accel_missing_filename -- common/autotest_common.sh@662 -- # case "$es" in 00:06:37.963 15:45:43 accel.accel_missing_filename -- common/autotest_common.sh@669 -- # es=1 00:06:37.963 15:45:43 accel.accel_missing_filename -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:06:37.963 00:06:37.963 real 0m0.431s 00:06:37.963 user 0m0.306s 00:06:37.963 sys 0m0.159s 00:06:37.963 15:45:43 accel.accel_missing_filename -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:37.963 15:45:43 accel.accel_missing_filename -- common/autotest_common.sh@10 -- # set +x 00:06:37.963 ************************************ 00:06:37.963 END TEST accel_missing_filename 00:06:37.963 ************************************ 00:06:37.963 15:45:43 accel -- accel/accel.sh@93 -- # run_test accel_compress_verify NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:37.963 15:45:43 accel -- common/autotest_common.sh@1100 -- # '[' 10 -le 1 ']' 00:06:37.963 15:45:43 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:37.963 15:45:43 accel -- common/autotest_common.sh@10 -- # set +x 00:06:37.963 ************************************ 00:06:37.963 START TEST accel_compress_verify 00:06:37.963 ************************************ 00:06:37.963 15:45:43 accel.accel_compress_verify -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:37.963 15:45:43 accel.accel_compress_verify -- common/autotest_common.sh@649 -- # local es=0 00:06:37.963 15:45:43 accel.accel_compress_verify -- common/autotest_common.sh@651 -- # valid_exec_arg 
accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:37.963 15:45:43 accel.accel_compress_verify -- common/autotest_common.sh@637 -- # local arg=accel_perf 00:06:37.963 15:45:43 accel.accel_compress_verify -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:06:37.963 15:45:43 accel.accel_compress_verify -- common/autotest_common.sh@641 -- # type -t accel_perf 00:06:37.963 15:45:43 accel.accel_compress_verify -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:06:37.964 15:45:43 accel.accel_compress_verify -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:37.964 15:45:43 accel.accel_compress_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:37.964 15:45:43 accel.accel_compress_verify -- accel/accel.sh@12 -- # build_accel_config 00:06:37.964 15:45:43 accel.accel_compress_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:37.964 15:45:43 accel.accel_compress_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:37.964 15:45:43 accel.accel_compress_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:37.964 15:45:43 accel.accel_compress_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:37.964 15:45:43 accel.accel_compress_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:37.964 15:45:43 accel.accel_compress_verify -- accel/accel.sh@40 -- # local IFS=, 00:06:37.964 15:45:43 accel.accel_compress_verify -- accel/accel.sh@41 -- # jq -r . 00:06:37.964 [2024-06-10 15:45:43.463288] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:06:37.964 [2024-06-10 15:45:43.463338] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2598711 ] 00:06:38.223 [2024-06-10 15:45:43.560504] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:38.223 [2024-06-10 15:45:43.650760] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:38.223 [2024-06-10 15:45:43.710414] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:06:38.482 [2024-06-10 15:45:43.775813] accel_perf.c:1464:main: *ERROR*: ERROR starting application 00:06:38.482 00:06:38.482 Compression does not support the verify option, aborting. 00:06:38.482 15:45:43 accel.accel_compress_verify -- common/autotest_common.sh@652 -- # es=161 00:06:38.482 15:45:43 accel.accel_compress_verify -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:06:38.482 15:45:43 accel.accel_compress_verify -- common/autotest_common.sh@661 -- # es=33 00:06:38.482 15:45:43 accel.accel_compress_verify -- common/autotest_common.sh@662 -- # case "$es" in 00:06:38.482 15:45:43 accel.accel_compress_verify -- common/autotest_common.sh@669 -- # es=1 00:06:38.482 15:45:43 accel.accel_compress_verify -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:06:38.482 00:06:38.482 real 0m0.433s 00:06:38.482 user 0m0.320s 00:06:38.482 sys 0m0.149s 00:06:38.482 15:45:43 accel.accel_compress_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:38.482 15:45:43 accel.accel_compress_verify -- common/autotest_common.sh@10 -- # set +x 00:06:38.482 ************************************ 00:06:38.482 END TEST accel_compress_verify 00:06:38.482 ************************************ 00:06:38.482 15:45:43 accel -- accel/accel.sh@95 -- # run_test accel_wrong_workload NOT accel_perf -t 1 -w foobar 00:06:38.482 15:45:43 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 
']' 00:06:38.482 15:45:43 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:38.482 15:45:43 accel -- common/autotest_common.sh@10 -- # set +x 00:06:38.482 ************************************ 00:06:38.482 START TEST accel_wrong_workload 00:06:38.482 ************************************ 00:06:38.482 15:45:43 accel.accel_wrong_workload -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w foobar 00:06:38.482 15:45:43 accel.accel_wrong_workload -- common/autotest_common.sh@649 -- # local es=0 00:06:38.482 15:45:43 accel.accel_wrong_workload -- common/autotest_common.sh@651 -- # valid_exec_arg accel_perf -t 1 -w foobar 00:06:38.482 15:45:43 accel.accel_wrong_workload -- common/autotest_common.sh@637 -- # local arg=accel_perf 00:06:38.482 15:45:43 accel.accel_wrong_workload -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:06:38.482 15:45:43 accel.accel_wrong_workload -- common/autotest_common.sh@641 -- # type -t accel_perf 00:06:38.482 15:45:43 accel.accel_wrong_workload -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:06:38.482 15:45:43 accel.accel_wrong_workload -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w foobar 00:06:38.482 15:45:43 accel.accel_wrong_workload -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w foobar 00:06:38.482 15:45:43 accel.accel_wrong_workload -- accel/accel.sh@12 -- # build_accel_config 00:06:38.482 15:45:43 accel.accel_wrong_workload -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:38.482 15:45:43 accel.accel_wrong_workload -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:38.482 15:45:43 accel.accel_wrong_workload -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:38.482 15:45:43 accel.accel_wrong_workload -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:38.482 15:45:43 accel.accel_wrong_workload -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:38.482 15:45:43 accel.accel_wrong_workload -- accel/accel.sh@40 -- # 
local IFS=, 00:06:38.482 15:45:43 accel.accel_wrong_workload -- accel/accel.sh@41 -- # jq -r . 00:06:38.482 Unsupported workload type: foobar 00:06:38.482 [2024-06-10 15:45:43.959697] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'w' failed: 1 00:06:38.482 accel_perf options: 00:06:38.482 [-h help message] 00:06:38.482 [-q queue depth per core] 00:06:38.482 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:38.482 [-T number of threads per core 00:06:38.482 [-o transfer size in bytes (default: 4KiB. For compress/decompress, 0 means the input file size)] 00:06:38.482 [-t time in seconds] 00:06:38.482 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:38.482 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:06:38.482 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:38.482 [-l for compress/decompress workloads, name of uncompressed input file 00:06:38.482 [-S for crc32c workload, use this seed value (default 0) 00:06:38.482 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:38.482 [-f for fill workload, use this BYTE value (default 255) 00:06:38.482 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:38.482 [-y verify result if this switch is on] 00:06:38.482 [-a tasks to allocate per core (default: same value as -q)] 00:06:38.482 Can be used to spread operations across a wider range of memory. 
00:06:38.482 15:45:43 accel.accel_wrong_workload -- common/autotest_common.sh@652 -- # es=1 00:06:38.482 15:45:43 accel.accel_wrong_workload -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:06:38.483 15:45:43 accel.accel_wrong_workload -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:06:38.483 15:45:43 accel.accel_wrong_workload -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:06:38.483 00:06:38.483 real 0m0.038s 00:06:38.483 user 0m0.024s 00:06:38.483 sys 0m0.014s 00:06:38.483 15:45:43 accel.accel_wrong_workload -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:38.483 15:45:43 accel.accel_wrong_workload -- common/autotest_common.sh@10 -- # set +x 00:06:38.483 ************************************ 00:06:38.483 END TEST accel_wrong_workload 00:06:38.483 ************************************ 00:06:38.483 Error: writing output failed: Broken pipe 00:06:38.742 15:45:43 accel -- accel/accel.sh@97 -- # run_test accel_negative_buffers NOT accel_perf -t 1 -w xor -y -x -1 00:06:38.742 15:45:43 accel -- common/autotest_common.sh@1100 -- # '[' 10 -le 1 ']' 00:06:38.742 15:45:43 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:38.742 15:45:43 accel -- common/autotest_common.sh@10 -- # set +x 00:06:38.742 ************************************ 00:06:38.742 START TEST accel_negative_buffers 00:06:38.742 ************************************ 00:06:38.742 15:45:44 accel.accel_negative_buffers -- common/autotest_common.sh@1124 -- # NOT accel_perf -t 1 -w xor -y -x -1 00:06:38.742 15:45:44 accel.accel_negative_buffers -- common/autotest_common.sh@649 -- # local es=0 00:06:38.742 15:45:44 accel.accel_negative_buffers -- common/autotest_common.sh@651 -- # valid_exec_arg accel_perf -t 1 -w xor -y -x -1 00:06:38.742 15:45:44 accel.accel_negative_buffers -- common/autotest_common.sh@637 -- # local arg=accel_perf 00:06:38.742 15:45:44 accel.accel_negative_buffers -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:06:38.742 15:45:44 
accel.accel_negative_buffers -- common/autotest_common.sh@641 -- # type -t accel_perf 00:06:38.742 15:45:44 accel.accel_negative_buffers -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:06:38.742 15:45:44 accel.accel_negative_buffers -- common/autotest_common.sh@652 -- # accel_perf -t 1 -w xor -y -x -1 00:06:38.742 15:45:44 accel.accel_negative_buffers -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y -x -1 00:06:38.742 15:45:44 accel.accel_negative_buffers -- accel/accel.sh@12 -- # build_accel_config 00:06:38.742 15:45:44 accel.accel_negative_buffers -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:38.742 15:45:44 accel.accel_negative_buffers -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:38.742 15:45:44 accel.accel_negative_buffers -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:38.742 15:45:44 accel.accel_negative_buffers -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:38.742 15:45:44 accel.accel_negative_buffers -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:38.742 15:45:44 accel.accel_negative_buffers -- accel/accel.sh@40 -- # local IFS=, 00:06:38.742 15:45:44 accel.accel_negative_buffers -- accel/accel.sh@41 -- # jq -r . 00:06:38.742 -x option must be non-negative. 00:06:38.742 [2024-06-10 15:45:44.065721] app.c:1451:spdk_app_parse_args: *ERROR*: Parsing app-specific command line parameter 'x' failed: 1 00:06:38.742 accel_perf options: 00:06:38.742 [-h help message] 00:06:38.742 [-q queue depth per core] 00:06:38.742 [-C for supported workloads, use this value to configure the io vector size to test (default 1) 00:06:38.742 [-T number of threads per core 00:06:38.743 [-o transfer size in bytes (default: 4KiB. 
For compress/decompress, 0 means the input file size)] 00:06:38.743 [-t time in seconds] 00:06:38.743 [-w workload type must be one of these: copy, fill, crc32c, copy_crc32c, compare, compress, decompress, dualcast, xor, 00:06:38.743 [ dif_verify, dif_verify_copy, dif_generate, dif_generate_copy 00:06:38.743 [-M assign module to the operation, not compatible with accel_assign_opc RPC 00:06:38.743 [-l for compress/decompress workloads, name of uncompressed input file 00:06:38.743 [-S for crc32c workload, use this seed value (default 0) 00:06:38.743 [-P for compare workload, percentage of operations that should miscompare (percent, default 0) 00:06:38.743 [-f for fill workload, use this BYTE value (default 255) 00:06:38.743 [-x for xor workload, use this number of source buffers (default, minimum: 2)] 00:06:38.743 [-y verify result if this switch is on] 00:06:38.743 [-a tasks to allocate per core (default: same value as -q)] 00:06:38.743 Can be used to spread operations across a wider range of memory. 
00:06:38.743 15:45:44 accel.accel_negative_buffers -- common/autotest_common.sh@652 -- # es=1 00:06:38.743 15:45:44 accel.accel_negative_buffers -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:06:38.743 15:45:44 accel.accel_negative_buffers -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:06:38.743 15:45:44 accel.accel_negative_buffers -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:06:38.743 00:06:38.743 real 0m0.040s 00:06:38.743 user 0m0.024s 00:06:38.743 sys 0m0.016s 00:06:38.743 15:45:44 accel.accel_negative_buffers -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:38.743 15:45:44 accel.accel_negative_buffers -- common/autotest_common.sh@10 -- # set +x 00:06:38.743 ************************************ 00:06:38.743 END TEST accel_negative_buffers 00:06:38.743 ************************************ 00:06:38.743 Error: writing output failed: Broken pipe 00:06:38.743 15:45:44 accel -- accel/accel.sh@101 -- # run_test accel_crc32c accel_test -t 1 -w crc32c -S 32 -y 00:06:38.743 15:45:44 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:06:38.743 15:45:44 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:38.743 15:45:44 accel -- common/autotest_common.sh@10 -- # set +x 00:06:38.743 ************************************ 00:06:38.743 START TEST accel_crc32c 00:06:38.743 ************************************ 00:06:38.743 15:45:44 accel.accel_crc32c -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w crc32c -S 32 -y 00:06:38.743 15:45:44 accel.accel_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:06:38.743 15:45:44 accel.accel_crc32c -- accel/accel.sh@17 -- # local accel_module 00:06:38.743 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:38.743 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:38.743 15:45:44 accel.accel_crc32c -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -S 32 -y 00:06:38.743 15:45:44 accel.accel_crc32c -- accel/accel.sh@12 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -S 32 -y 00:06:38.743 15:45:44 accel.accel_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:06:38.743 15:45:44 accel.accel_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:38.743 15:45:44 accel.accel_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:38.743 15:45:44 accel.accel_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:38.743 15:45:44 accel.accel_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:38.743 15:45:44 accel.accel_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:38.743 15:45:44 accel.accel_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:06:38.743 15:45:44 accel.accel_crc32c -- accel/accel.sh@41 -- # jq -r . 00:06:38.743 [2024-06-10 15:45:44.158986] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:06:38.743 [2024-06-10 15:45:44.159037] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2598947 ] 00:06:39.004 [2024-06-10 15:45:44.253780] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:39.004 [2024-06-10 15:45:44.344010] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:39.004 
15:45:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val=0x1 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val=crc32c 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:39.004 15:45:44 accel.accel_crc32c -- 
accel/accel.sh@20 -- # val= 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val=software 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val=32 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val=1 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@20 -- # 
val=Yes 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:39.004 15:45:44 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c -- 
accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@20 -- # val= 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:06:40.439 15:45:45 accel.accel_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:40.439 00:06:40.439 real 0m1.441s 00:06:40.439 user 0m1.283s 00:06:40.439 sys 0m0.157s 00:06:40.439 15:45:45 accel.accel_crc32c -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:40.439 15:45:45 accel.accel_crc32c -- common/autotest_common.sh@10 -- # set +x 00:06:40.439 ************************************ 00:06:40.439 END TEST accel_crc32c 00:06:40.439 ************************************ 00:06:40.439 15:45:45 accel -- accel/accel.sh@102 -- # run_test accel_crc32c_C2 accel_test -t 1 -w crc32c -y -C 2 00:06:40.439 15:45:45 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:06:40.439 15:45:45 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:40.439 15:45:45 accel -- common/autotest_common.sh@10 -- # set +x 00:06:40.439 ************************************ 00:06:40.439 START TEST accel_crc32c_C2 00:06:40.439 ************************************ 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- common/autotest_common.sh@1124 -- # accel_test -t 1 
-w crc32c -y -C 2 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w crc32c -y -C 2 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w crc32c -y -C 2 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:06:40.439 [2024-06-10 15:45:45.664895] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:06:40.439 [2024-06-10 15:45:45.664976] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2599239 ] 00:06:40.439 [2024-06-10 15:45:45.760335] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:40.439 [2024-06-10 15:45:45.851335] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- 
accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=crc32c 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=crc32c 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- 
# val=32 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:40.439 15:45:45 
accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:40.439 15:45:45 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 
00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n crc32c ]] 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:41.818 00:06:41.818 real 0m1.432s 00:06:41.818 user 0m1.282s 00:06:41.818 sys 0m0.154s 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:41.818 15:45:47 accel.accel_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:06:41.818 ************************************ 00:06:41.818 END TEST accel_crc32c_C2 00:06:41.818 ************************************ 00:06:41.818 15:45:47 accel -- accel/accel.sh@103 -- # run_test accel_copy accel_test -t 1 -w copy -y 00:06:41.818 15:45:47 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:06:41.818 15:45:47 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:41.818 15:45:47 accel -- common/autotest_common.sh@10 -- # set +x 00:06:41.818 ************************************ 00:06:41.818 START TEST accel_copy 00:06:41.818 ************************************ 00:06:41.818 15:45:47 accel.accel_copy -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w copy -y 00:06:41.818 15:45:47 accel.accel_copy -- accel/accel.sh@16 -- # local accel_opc 00:06:41.818 15:45:47 accel.accel_copy -- accel/accel.sh@17 -- # local accel_module 00:06:41.818 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:41.818 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:41.818 15:45:47 accel.accel_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy -y 00:06:41.818 15:45:47 accel.accel_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 
-t 1 -w copy -y 00:06:41.818 15:45:47 accel.accel_copy -- accel/accel.sh@12 -- # build_accel_config 00:06:41.818 15:45:47 accel.accel_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:41.818 15:45:47 accel.accel_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:41.818 15:45:47 accel.accel_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:41.818 15:45:47 accel.accel_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:41.818 15:45:47 accel.accel_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:41.818 15:45:47 accel.accel_copy -- accel/accel.sh@40 -- # local IFS=, 00:06:41.818 15:45:47 accel.accel_copy -- accel/accel.sh@41 -- # jq -r . 00:06:41.818 [2024-06-10 15:45:47.153856] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:06:41.818 [2024-06-10 15:45:47.153907] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2599485 ] 00:06:41.818 [2024-06-10 15:45:47.241714] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:42.077 [2024-06-10 15:45:47.331713] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@20 -- # val=0x1 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@21 -- # case 
"$var" in 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@20 -- # val=copy 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@23 -- # accel_opc=copy 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@20 -- # val=software 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@22 -- # 
accel_module=software 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:42.077 15:45:47 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@20 -- # val=32 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@20 -- # val=1 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@20 -- # val=Yes 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r 
var val 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:42.078 15:45:47 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@20 -- # val= 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:43.456 15:45:48 
accel.accel_copy -- accel/accel.sh@19 -- # IFS=: 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@19 -- # read -r var val 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@27 -- # [[ -n copy ]] 00:06:43.456 15:45:48 accel.accel_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:43.456 00:06:43.456 real 0m1.425s 00:06:43.456 user 0m1.288s 00:06:43.456 sys 0m0.141s 00:06:43.456 15:45:48 accel.accel_copy -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:43.456 15:45:48 accel.accel_copy -- common/autotest_common.sh@10 -- # set +x 00:06:43.456 ************************************ 00:06:43.456 END TEST accel_copy 00:06:43.456 ************************************ 00:06:43.456 15:45:48 accel -- accel/accel.sh@104 -- # run_test accel_fill accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:43.456 15:45:48 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:06:43.456 15:45:48 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:43.456 15:45:48 accel -- common/autotest_common.sh@10 -- # set +x 00:06:43.456 ************************************ 00:06:43.456 START TEST accel_fill 00:06:43.456 ************************************ 00:06:43.456 15:45:48 accel.accel_fill -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@16 -- # local accel_opc 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@17 -- # local accel_module 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@15 -- # accel_perf -t 1 -w fill -f 128 -q 64 -a 64 -y 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 
-w fill -f 128 -q 64 -a 64 -y 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@12 -- # build_accel_config 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@40 -- # local IFS=, 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@41 -- # jq -r . 00:06:43.456 [2024-06-10 15:45:48.647500] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:06:43.456 [2024-06-10 15:45:48.647551] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2599735 ] 00:06:43.456 [2024-06-10 15:45:48.734652] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:43.456 [2024-06-10 15:45:48.824992] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:43.456 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@20 -- # val=0x1 00:06:43.457 15:45:48 accel.accel_fill -- 
accel/accel.sh@21 -- # case "$var" in 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@20 -- # val=fill 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@23 -- # accel_opc=fill 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@20 -- # val=0x80 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:43.457 15:45:48 accel.accel_fill -- 
accel/accel.sh@19 -- # IFS=: 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@20 -- # val=software 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@22 -- # accel_module=software 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@20 -- # val=64 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@20 -- # val=1 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@20 -- # val='1 seconds' 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@20 -- # val=Yes 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:43.457 15:45:48 accel.accel_fill -- 
accel/accel.sh@19 -- # read -r var val 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:43.457 15:45:48 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" 
in 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@20 -- # val= 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@21 -- # case "$var" in 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@19 -- # IFS=: 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@19 -- # read -r var val 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@27 -- # [[ -n fill ]] 00:06:44.836 15:45:50 accel.accel_fill -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:44.836 00:06:44.836 real 0m1.426s 00:06:44.836 user 0m1.291s 00:06:44.836 sys 0m0.138s 00:06:44.836 15:45:50 accel.accel_fill -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:44.836 15:45:50 accel.accel_fill -- common/autotest_common.sh@10 -- # set +x 00:06:44.836 ************************************ 00:06:44.836 END TEST accel_fill 00:06:44.836 ************************************ 00:06:44.836 15:45:50 accel -- accel/accel.sh@105 -- # run_test accel_copy_crc32c accel_test -t 1 -w copy_crc32c -y 00:06:44.836 15:45:50 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:06:44.836 15:45:50 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:44.836 15:45:50 accel -- common/autotest_common.sh@10 -- # set +x 00:06:44.836 ************************************ 00:06:44.836 START TEST accel_copy_crc32c 00:06:44.836 ************************************ 00:06:44.836 15:45:50 accel.accel_copy_crc32c -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w copy_crc32c -y 00:06:44.836 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@16 -- # local accel_opc 00:06:44.836 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@17 -- # local accel_module 00:06:44.836 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@15 -- # accel_perf -t 
1 -w copy_crc32c -y 00:06:44.836 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:44.836 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y 00:06:44.836 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:44.836 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@12 -- # build_accel_config 00:06:44.836 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:44.836 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:44.836 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:44.836 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:44.836 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:44.836 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@40 -- # local IFS=, 00:06:44.836 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@41 -- # jq -r . 00:06:44.836 [2024-06-10 15:45:50.115699] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:06:44.836 [2024-06-10 15:45:50.115734] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2599983 ] 00:06:44.836 [2024-06-10 15:45:50.199993] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:44.836 [2024-06-10 15:45:50.294251] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0x1 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:45.096 15:45:50 
accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=copy_crc32c 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=0 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # 
val=software 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@22 -- # accel_module=software 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=32 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=1 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val='1 seconds' 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val=Yes 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- 
accel/accel.sh@19 -- # read -r var val 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:45.096 15:45:50 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@19 
-- # read -r var val 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@20 -- # val= 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@21 -- # case "$var" in 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # IFS=: 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@19 -- # read -r var val 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:46.035 00:06:46.035 real 0m1.415s 00:06:46.035 user 0m1.276s 00:06:46.035 sys 0m0.139s 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:46.035 15:45:51 accel.accel_copy_crc32c -- common/autotest_common.sh@10 -- # set +x 00:06:46.035 ************************************ 00:06:46.035 END TEST accel_copy_crc32c 00:06:46.035 ************************************ 00:06:46.294 15:45:51 accel -- accel/accel.sh@106 -- # run_test accel_copy_crc32c_C2 accel_test -t 1 -w copy_crc32c -y -C 2 00:06:46.294 15:45:51 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:06:46.294 15:45:51 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:46.294 15:45:51 accel -- common/autotest_common.sh@10 -- # set +x 00:06:46.294 ************************************ 00:06:46.294 START TEST accel_copy_crc32c_C2 00:06:46.294 ************************************ 00:06:46.294 15:45:51 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1124 -- # accel_test 
-t 1 -w copy_crc32c -y -C 2 00:06:46.294 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@16 -- # local accel_opc 00:06:46.294 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@17 -- # local accel_module 00:06:46.294 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.294 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:46.294 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@15 -- # accel_perf -t 1 -w copy_crc32c -y -C 2 00:06:46.294 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w copy_crc32c -y -C 2 00:06:46.294 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@12 -- # build_accel_config 00:06:46.294 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:46.294 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:46.294 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:46.294 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:46.294 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:46.294 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@40 -- # local IFS=, 00:06:46.294 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@41 -- # jq -r . 00:06:46.294 [2024-06-10 15:45:51.609059] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:06:46.294 [2024-06-10 15:45:51.609109] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2600230 ] 00:06:46.294 [2024-06-10 15:45:51.706568] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:46.294 [2024-06-10 15:45:51.797008] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0x1 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- 
accel/accel.sh@21 -- # case "$var" in 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=copy_crc32c 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@23 -- # accel_opc=copy_crc32c 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=0 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='8192 bytes' 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.554 15:45:51 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=software 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@22 -- # accel_module=software 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=32 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=1 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val='1 seconds' 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val=Yes 00:06:46.554 
15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:46.554 15:45:51 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.934 15:45:53 
accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@20 -- # val= 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@21 -- # case "$var" in 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # IFS=: 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@19 -- # read -r var val 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ -n copy_crc32c ]] 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:47.934 00:06:47.934 real 0m1.432s 00:06:47.934 user 0m1.283s 00:06:47.934 sys 0m0.155s 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:47.934 15:45:53 accel.accel_copy_crc32c_C2 -- common/autotest_common.sh@10 -- # set +x 00:06:47.934 ************************************ 00:06:47.934 END TEST accel_copy_crc32c_C2 00:06:47.934 ************************************ 00:06:47.934 15:45:53 accel -- accel/accel.sh@107 -- # run_test accel_dualcast accel_test -t 1 -w dualcast -y 00:06:47.934 15:45:53 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:06:47.934 15:45:53 accel -- common/autotest_common.sh@1106 -- # 
xtrace_disable 00:06:47.934 15:45:53 accel -- common/autotest_common.sh@10 -- # set +x 00:06:47.934 ************************************ 00:06:47.934 START TEST accel_dualcast 00:06:47.934 ************************************ 00:06:47.934 15:45:53 accel.accel_dualcast -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w dualcast -y 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@16 -- # local accel_opc 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@17 -- # local accel_module 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@15 -- # accel_perf -t 1 -w dualcast -y 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dualcast -y 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@12 -- # build_accel_config 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@40 -- # local IFS=, 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@41 -- # jq -r . 00:06:47.934 [2024-06-10 15:45:53.107774] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:06:47.934 [2024-06-10 15:45:53.107826] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2600479 ] 00:06:47.934 [2024-06-10 15:45:53.203868] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:47.934 [2024-06-10 15:45:53.293870] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val=0x1 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 
00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val=dualcast 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@23 -- # accel_opc=dualcast 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:47.934 15:45:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val=software 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@22 -- # accel_module=software 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val=32 00:06:47.935 15:45:53 
accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val=1 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val='1 seconds' 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val=Yes 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:47.935 15:45:53 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:49.314 15:45:54 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:49.315 
15:45:54 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@20 -- # val= 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@21 -- # case "$var" in 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@19 -- # IFS=: 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@19 -- # read -r var val 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ -n dualcast ]] 00:06:49.315 15:45:54 accel.accel_dualcast -- accel/accel.sh@27 -- # [[ software == 
\s\o\f\t\w\a\r\e ]] 00:06:49.315 00:06:49.315 real 0m1.436s 00:06:49.315 user 0m1.293s 00:06:49.315 sys 0m0.145s 00:06:49.315 15:45:54 accel.accel_dualcast -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:49.315 15:45:54 accel.accel_dualcast -- common/autotest_common.sh@10 -- # set +x 00:06:49.315 ************************************ 00:06:49.315 END TEST accel_dualcast 00:06:49.315 ************************************ 00:06:49.315 15:45:54 accel -- accel/accel.sh@108 -- # run_test accel_compare accel_test -t 1 -w compare -y 00:06:49.315 15:45:54 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:06:49.315 15:45:54 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:49.315 15:45:54 accel -- common/autotest_common.sh@10 -- # set +x 00:06:49.315 ************************************ 00:06:49.315 START TEST accel_compare 00:06:49.315 ************************************ 00:06:49.315 15:45:54 accel.accel_compare -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w compare -y 00:06:49.315 15:45:54 accel.accel_compare -- accel/accel.sh@16 -- # local accel_opc 00:06:49.315 15:45:54 accel.accel_compare -- accel/accel.sh@17 -- # local accel_module 00:06:49.315 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:49.315 15:45:54 accel.accel_compare -- accel/accel.sh@15 -- # accel_perf -t 1 -w compare -y 00:06:49.315 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:49.315 15:45:54 accel.accel_compare -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compare -y 00:06:49.315 15:45:54 accel.accel_compare -- accel/accel.sh@12 -- # build_accel_config 00:06:49.315 15:45:54 accel.accel_compare -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:49.315 15:45:54 accel.accel_compare -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:49.315 15:45:54 accel.accel_compare -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:49.315 15:45:54 
accel.accel_compare -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:49.315 15:45:54 accel.accel_compare -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:49.315 15:45:54 accel.accel_compare -- accel/accel.sh@40 -- # local IFS=, 00:06:49.315 15:45:54 accel.accel_compare -- accel/accel.sh@41 -- # jq -r . 00:06:49.315 [2024-06-10 15:45:54.607804] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:06:49.315 [2024-06-10 15:45:54.607855] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2600725 ] 00:06:49.315 [2024-06-10 15:45:54.705858] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:49.315 [2024-06-10 15:45:54.797848] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@20 -- # val=0x1 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" 
in 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@20 -- # val=compare 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@23 -- # accel_opc=compare 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:49.574 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@20 -- # val=software 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@22 -- # accel_module=software 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@20 -- 
# val=32 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@20 -- # val=32 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@20 -- # val=1 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@20 -- # val='1 seconds' 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@20 -- # val=Yes 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 
00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:49.575 15:45:54 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@19 -- # read -r var val 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@20 -- # val= 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@21 -- # case "$var" in 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@19 -- # IFS=: 00:06:50.510 15:45:56 accel.accel_compare 
-- accel/accel.sh@19 -- # read -r var val 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@27 -- # [[ -n compare ]] 00:06:50.510 15:45:56 accel.accel_compare -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:50.510 00:06:50.510 real 0m1.435s 00:06:50.510 user 0m1.293s 00:06:50.510 sys 0m0.147s 00:06:50.510 15:45:56 accel.accel_compare -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:50.510 15:45:56 accel.accel_compare -- common/autotest_common.sh@10 -- # set +x 00:06:50.510 ************************************ 00:06:50.510 END TEST accel_compare 00:06:50.510 ************************************ 00:06:50.769 15:45:56 accel -- accel/accel.sh@109 -- # run_test accel_xor accel_test -t 1 -w xor -y 00:06:50.769 15:45:56 accel -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:06:50.769 15:45:56 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:50.769 15:45:56 accel -- common/autotest_common.sh@10 -- # set +x 00:06:50.769 ************************************ 00:06:50.769 START TEST accel_xor 00:06:50.769 ************************************ 00:06:50.769 15:45:56 accel.accel_xor -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w xor -y 00:06:50.769 15:45:56 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:06:50.769 15:45:56 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:06:50.769 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:50.769 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:50.769 15:45:56 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y 00:06:50.769 15:45:56 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w xor -y 00:06:50.769 15:45:56 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:06:50.769 15:45:56 accel.accel_xor -- 
accel/accel.sh@31 -- # accel_json_cfg=() 00:06:50.769 15:45:56 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:50.769 15:45:56 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:50.769 15:45:56 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:50.769 15:45:56 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:50.769 15:45:56 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:06:50.769 15:45:56 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:06:50.769 [2024-06-10 15:45:56.102271] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:06:50.769 [2024-06-10 15:45:56.102323] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2600974 ] 00:06:50.769 [2024-06-10 15:45:56.202704] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:51.029 [2024-06-10 15:45:56.291935] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var 
val 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@20 -- # val=2 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:06:51.029 15:45:56 
accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:51.029 15:45:56 accel.accel_xor -- 
accel/accel.sh@19 -- # IFS=: 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:51.029 15:45:56 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:52.407 15:45:57 
accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:52.407 00:06:52.407 real 0m1.448s 00:06:52.407 user 0m1.285s 00:06:52.407 sys 0m0.158s 00:06:52.407 15:45:57 accel.accel_xor -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:52.407 15:45:57 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:06:52.407 ************************************ 00:06:52.407 END TEST accel_xor 00:06:52.407 ************************************ 00:06:52.407 15:45:57 accel -- accel/accel.sh@110 -- # run_test accel_xor accel_test -t 1 -w xor -y -x 3 00:06:52.407 15:45:57 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:06:52.407 15:45:57 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:52.407 15:45:57 accel -- common/autotest_common.sh@10 -- # set +x 00:06:52.407 ************************************ 00:06:52.407 START TEST accel_xor 00:06:52.407 ************************************ 00:06:52.407 15:45:57 accel.accel_xor -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w xor -y -x 3 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@16 -- # local accel_opc 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@17 -- # local accel_module 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@15 -- # accel_perf -t 1 -w xor -y -x 3 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c 
/dev/fd/62 -t 1 -w xor -y -x 3 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@12 -- # build_accel_config 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@40 -- # local IFS=, 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@41 -- # jq -r . 00:06:52.407 [2024-06-10 15:45:57.612696] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:06:52.407 [2024-06-10 15:45:57.612751] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2601222 ] 00:06:52.407 [2024-06-10 15:45:57.710061] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:52.407 [2024-06-10 15:45:57.801121] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val=0x1 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case 
"$var" in 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val=xor 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@23 -- # accel_opc=xor 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val=3 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.407 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.408 15:45:57 accel.accel_xor 
-- accel/accel.sh@19 -- # read -r var val 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val=software 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@22 -- # accel_module=software 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val=32 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val=1 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val='1 seconds' 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val=Yes 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.408 15:45:57 accel.accel_xor -- 
accel/accel.sh@20 -- # val= 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:52.408 15:45:57 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:53.784 15:45:59 
accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@20 -- # val= 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@21 -- # case "$var" in 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@19 -- # IFS=: 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@19 -- # read -r var val 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@27 -- # [[ -n xor ]] 00:06:53.784 15:45:59 accel.accel_xor -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:53.784 00:06:53.784 real 0m1.433s 00:06:53.784 user 0m1.290s 00:06:53.784 sys 0m0.147s 00:06:53.784 15:45:59 accel.accel_xor -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:53.784 15:45:59 accel.accel_xor -- common/autotest_common.sh@10 -- # set +x 00:06:53.784 ************************************ 00:06:53.784 END TEST accel_xor 00:06:53.784 ************************************ 00:06:53.784 15:45:59 accel -- accel/accel.sh@111 -- # run_test accel_dif_verify accel_test -t 1 -w dif_verify 00:06:53.784 15:45:59 accel -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']' 00:06:53.784 15:45:59 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:53.784 15:45:59 accel -- common/autotest_common.sh@10 -- # set +x 00:06:53.784 ************************************ 00:06:53.784 START TEST accel_dif_verify 00:06:53.785 ************************************ 00:06:53.785 15:45:59 accel.accel_dif_verify -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w dif_verify 00:06:53.785 15:45:59 accel.accel_dif_verify -- accel/accel.sh@16 -- # local accel_opc 00:06:53.785 15:45:59 accel.accel_dif_verify -- accel/accel.sh@17 -- # local accel_module 00:06:53.785 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:53.785 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:53.785 15:45:59 
accel.accel_dif_verify -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_verify 00:06:53.785 15:45:59 accel.accel_dif_verify -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_verify 00:06:53.785 15:45:59 accel.accel_dif_verify -- accel/accel.sh@12 -- # build_accel_config 00:06:53.785 15:45:59 accel.accel_dif_verify -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:53.785 15:45:59 accel.accel_dif_verify -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:53.785 15:45:59 accel.accel_dif_verify -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:53.785 15:45:59 accel.accel_dif_verify -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:53.785 15:45:59 accel.accel_dif_verify -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:53.785 15:45:59 accel.accel_dif_verify -- accel/accel.sh@40 -- # local IFS=, 00:06:53.785 15:45:59 accel.accel_dif_verify -- accel/accel.sh@41 -- # jq -r . 00:06:53.785 [2024-06-10 15:45:59.113156] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:06:53.785 [2024-06-10 15:45:59.113208] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2601531 ] 00:06:53.785 [2024-06-10 15:45:59.211309] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:54.044 [2024-06-10 15:45:59.303287] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=0x1 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 
accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=dif_verify 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@23 -- # accel_opc=dif_verify 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='512 bytes' 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='8 bytes' 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 
00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=software 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@22 -- # accel_module=software 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=32 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=1 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val='1 seconds' 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 
00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val=No 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:54.044 15:45:59 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:55.420 15:46:00 
accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@20 -- # val= 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@21 -- # case "$var" in 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # IFS=: 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@19 -- # read -r var val 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ -n dif_verify ]] 00:06:55.420 15:46:00 accel.accel_dif_verify -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:55.420 00:06:55.420 real 0m1.445s 00:06:55.420 user 0m1.293s 00:06:55.420 sys 0m0.154s 00:06:55.420 15:46:00 accel.accel_dif_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:55.420 15:46:00 accel.accel_dif_verify -- common/autotest_common.sh@10 -- # set +x 00:06:55.420 ************************************ 00:06:55.420 END TEST accel_dif_verify 00:06:55.420 ************************************ 00:06:55.420 15:46:00 accel -- accel/accel.sh@112 -- # run_test accel_dif_generate accel_test -t 1 -w dif_generate 00:06:55.420 15:46:00 accel -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']' 00:06:55.420 15:46:00 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:55.420 15:46:00 accel -- 
common/autotest_common.sh@10 -- # set +x 00:06:55.420 ************************************ 00:06:55.420 START TEST accel_dif_generate 00:06:55.420 ************************************ 00:06:55.420 15:46:00 accel.accel_dif_generate -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w dif_generate 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@16 -- # local accel_opc 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@17 -- # local accel_module 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@12 -- # build_accel_config 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@40 -- # local IFS=, 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@41 -- # jq -r . 00:06:55.420 [2024-06-10 15:46:00.628847] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:06:55.420 [2024-06-10 15:46:00.628899] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2601854 ] 00:06:55.420 [2024-06-10 15:46:00.727629] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:55.420 [2024-06-10 15:46:00.818677] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=0x1 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 
00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=dif_generate 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@23 -- # accel_opc=dif_generate 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.420 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='512 bytes' 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='8 bytes' 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 
00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=software 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@22 -- # accel_module=software 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=32 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=1 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val='1 seconds' 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:55.421 15:46:00 
accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val=No 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:55.421 15:46:00 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:56.805 
15:46:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@20 -- # val= 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@21 -- # case "$var" in 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # IFS=: 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@19 -- # read -r var val 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ -n dif_generate ]] 00:06:56.805 15:46:02 accel.accel_dif_generate -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:56.805 00:06:56.805 real 0m1.440s 00:06:56.805 user 0m1.298s 00:06:56.805 sys 0m0.148s 00:06:56.805 15:46:02 accel.accel_dif_generate -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:56.805 15:46:02 accel.accel_dif_generate -- common/autotest_common.sh@10 -- # set +x 00:06:56.805 ************************************ 00:06:56.805 END TEST accel_dif_generate 00:06:56.805 ************************************ 00:06:56.805 15:46:02 accel -- accel/accel.sh@113 -- # run_test accel_dif_generate_copy accel_test -t 1 -w 
dif_generate_copy 00:06:56.805 15:46:02 accel -- common/autotest_common.sh@1100 -- # '[' 6 -le 1 ']' 00:06:56.805 15:46:02 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:56.805 15:46:02 accel -- common/autotest_common.sh@10 -- # set +x 00:06:56.805 ************************************ 00:06:56.805 START TEST accel_dif_generate_copy 00:06:56.805 ************************************ 00:06:56.805 15:46:02 accel.accel_dif_generate_copy -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w dif_generate_copy 00:06:56.805 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@16 -- # local accel_opc 00:06:56.805 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@17 -- # local accel_module 00:06:56.805 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:56.805 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@15 -- # accel_perf -t 1 -w dif_generate_copy 00:06:56.805 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:56.805 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w dif_generate_copy 00:06:56.805 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@12 -- # build_accel_config 00:06:56.805 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:56.805 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:56.805 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:56.805 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:56.805 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:56.805 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@40 -- # local IFS=, 00:06:56.805 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@41 -- # jq -r . 
00:06:56.805 [2024-06-10 15:46:02.130901] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:06:56.805 [2024-06-10 15:46:02.130969] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2602170 ] 00:06:56.805 [2024-06-10 15:46:02.227004] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:57.064 [2024-06-10 15:46:02.318556] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=0x1 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- 
accel/accel.sh@19 -- # read -r var val 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=dif_generate_copy 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@23 -- # accel_opc=dif_generate_copy 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:57.064 15:46:02 
accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=software 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@22 -- # accel_module=software 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=32 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=1 00:06:57.064 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:57.065 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:57.065 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:57.065 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val='1 seconds' 00:06:57.065 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:57.065 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:57.065 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:57.065 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val=No 00:06:57.065 15:46:02 
accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:57.065 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:57.065 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:57.065 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:57.065 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:57.065 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:57.065 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:57.065 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:57.065 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:57.065 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:57.065 15:46:02 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 
-- # read -r var val 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@20 -- # val= 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@21 -- # case "$var" in 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # IFS=: 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@19 -- # read -r var val 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ -n dif_generate_copy ]] 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:58.442 00:06:58.442 real 0m1.430s 00:06:58.442 user 0m1.288s 00:06:58.442 sys 0m0.145s 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:58.442 15:46:03 accel.accel_dif_generate_copy -- common/autotest_common.sh@10 -- # set +x 00:06:58.442 ************************************ 00:06:58.442 END TEST accel_dif_generate_copy 00:06:58.442 ************************************ 00:06:58.442 15:46:03 accel -- accel/accel.sh@115 -- # [[ y == y ]] 00:06:58.442 15:46:03 accel -- accel/accel.sh@116 -- # run_test accel_comp accel_test -t 1 -w 
compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:58.442 15:46:03 accel -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']' 00:06:58.442 15:46:03 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:58.442 15:46:03 accel -- common/autotest_common.sh@10 -- # set +x 00:06:58.442 ************************************ 00:06:58.442 START TEST accel_comp 00:06:58.442 ************************************ 00:06:58.442 15:46:03 accel.accel_comp -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:58.442 15:46:03 accel.accel_comp -- accel/accel.sh@16 -- # local accel_opc 00:06:58.442 15:46:03 accel.accel_comp -- accel/accel.sh@17 -- # local accel_module 00:06:58.442 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.442 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:58.442 15:46:03 accel.accel_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@12 -- # build_accel_config 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@40 -- # local IFS=, 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@41 -- # jq -r . 
00:06:58.443 [2024-06-10 15:46:03.633078] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:06:58.443 [2024-06-10 15:46:03.633130] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2602416 ] 00:06:58.443 [2024-06-10 15:46:03.730373] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:58.443 [2024-06-10 15:46:03.820367] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@20 -- # val=0x1 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:58.443 15:46:03 
accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@20 -- # val=compress 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@20 -- # val=software 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@22 -- # accel_module=software 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:06:58.443 15:46:03 
accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@20 -- # val=32 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@20 -- # val=1 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@20 -- # val=No 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.443 15:46:03 accel.accel_comp 
-- accel/accel.sh@19 -- # read -r var val 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:58.443 15:46:03 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@21 -- # case "$var" in 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@20 -- # val= 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@21 -- # case 
"$var" in 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@19 -- # IFS=: 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@19 -- # read -r var val 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n software ]] 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:06:59.821 15:46:05 accel.accel_comp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:06:59.821 00:06:59.821 real 0m1.439s 00:06:59.821 user 0m1.296s 00:06:59.821 sys 0m0.148s 00:06:59.821 15:46:05 accel.accel_comp -- common/autotest_common.sh@1125 -- # xtrace_disable 00:06:59.821 15:46:05 accel.accel_comp -- common/autotest_common.sh@10 -- # set +x 00:06:59.821 ************************************ 00:06:59.821 END TEST accel_comp 00:06:59.821 ************************************ 00:06:59.821 15:46:05 accel -- accel/accel.sh@117 -- # run_test accel_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:59.821 15:46:05 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:06:59.821 15:46:05 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:06:59.821 15:46:05 accel -- common/autotest_common.sh@10 -- # set +x 00:06:59.821 ************************************ 00:06:59.821 START TEST accel_decomp 00:06:59.821 ************************************ 00:06:59.821 15:46:05 accel.accel_decomp -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:59.821 15:46:05 accel.accel_decomp -- accel/accel.sh@16 -- # local accel_opc 00:06:59.821 15:46:05 accel.accel_decomp -- accel/accel.sh@17 -- # local accel_module 00:06:59.821 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:06:59.821 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:06:59.821 15:46:05 accel.accel_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:59.821 15:46:05 accel.accel_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:06:59.821 15:46:05 accel.accel_decomp -- accel/accel.sh@12 -- # build_accel_config 00:06:59.821 15:46:05 accel.accel_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:06:59.821 15:46:05 accel.accel_decomp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:06:59.821 15:46:05 accel.accel_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:06:59.821 15:46:05 accel.accel_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:06:59.821 15:46:05 accel.accel_decomp -- accel/accel.sh@36 -- # [[ -n '' ]] 00:06:59.821 15:46:05 accel.accel_decomp -- accel/accel.sh@40 -- # local IFS=, 00:06:59.821 15:46:05 accel.accel_decomp -- accel/accel.sh@41 -- # jq -r . 00:06:59.821 [2024-06-10 15:46:05.138561] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:06:59.821 [2024-06-10 15:46:05.138612] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2602668 ] 00:06:59.821 [2024-06-10 15:46:05.235388] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:06:59.821 [2024-06-10 15:46:05.324953] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@20 -- # val=0x1 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:00.107 15:46:05 accel.accel_decomp 
-- accel/accel.sh@19 -- # read -r var val 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@20 -- # val=decompress 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@20 -- # val=software 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@22 -- # accel_module=software 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@21 -- # 
case "$var" in 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:00.107 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@20 -- # val=32 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@20 -- # val=1 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@20 -- # val=Yes 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:00.108 15:46:05 
accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:00.108 15:46:05 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@20 -- # val= 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@20 -- # val= 
00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:01.047 15:46:06 accel.accel_decomp -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:01.047 00:07:01.047 real 0m1.432s 00:07:01.047 user 0m1.291s 00:07:01.047 sys 0m0.146s 00:07:01.047 15:46:06 accel.accel_decomp -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:01.047 15:46:06 accel.accel_decomp -- common/autotest_common.sh@10 -- # set +x 00:07:01.047 ************************************ 00:07:01.047 END TEST accel_decomp 00:07:01.047 ************************************ 00:07:01.306 15:46:06 accel -- accel/accel.sh@118 -- # run_test accel_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:01.306 15:46:06 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:07:01.306 15:46:06 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:01.306 15:46:06 accel -- common/autotest_common.sh@10 -- # set +x 00:07:01.306 ************************************ 00:07:01.306 START TEST accel_decomp_full 00:07:01.306 ************************************ 00:07:01.306 15:46:06 accel.accel_decomp_full -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:01.306 15:46:06 accel.accel_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:07:01.306 15:46:06 accel.accel_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:07:01.306 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:01.306 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 
-- # read -r var val 00:07:01.306 15:46:06 accel.accel_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:01.306 15:46:06 accel.accel_decomp_full -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:01.306 15:46:06 accel.accel_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:07:01.306 15:46:06 accel.accel_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:01.306 15:46:06 accel.accel_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:01.306 15:46:06 accel.accel_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:01.306 15:46:06 accel.accel_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:01.306 15:46:06 accel.accel_decomp_full -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:01.306 15:46:06 accel.accel_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:07:01.306 15:46:06 accel.accel_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:07:01.306 [2024-06-10 15:46:06.636672] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:07:01.306 [2024-06-10 15:46:06.636725] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2602917 ] 00:07:01.306 [2024-06-10 15:46:06.732713] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:01.566 [2024-06-10 15:46:06.824482] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=0x1 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:01.566 15:46:06 
accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=software 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@22 -- # accel_module=software 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # 
read -r var val 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=1 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 
00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:01.566 15:46:06 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:02.946 
15:46:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@20 -- # val= 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:02.946 15:46:08 accel.accel_decomp_full -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:02.946 00:07:02.946 real 0m1.449s 00:07:02.946 user 0m1.310s 00:07:02.946 sys 0m0.147s 00:07:02.946 15:46:08 accel.accel_decomp_full -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:02.946 15:46:08 accel.accel_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:07:02.947 ************************************ 00:07:02.947 END TEST accel_decomp_full 00:07:02.947 ************************************ 00:07:02.947 15:46:08 accel -- accel/accel.sh@119 -- # run_test accel_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:02.947 15:46:08 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:07:02.947 15:46:08 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:02.947 15:46:08 accel -- common/autotest_common.sh@10 -- # set +x 00:07:02.947 ************************************ 00:07:02.947 START TEST accel_decomp_mcore 00:07:02.947 
************************************ 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:02.947 [2024-06-10 15:46:08.150473] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:07:02.947 [2024-06-10 15:46:08.150523] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2603165 ] 00:07:02.947 [2024-06-10 15:46:08.247524] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:02.947 [2024-06-10 15:46:08.342545] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:07:02.947 [2024-06-10 15:46:08.342643] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:07:02.947 [2024-06-10 15:46:08.342753] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:07:02.947 [2024-06-10 15:46:08.342754] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 15:46:08 
accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@20 
-- # val=software 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@22 -- # accel_module=software 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- 
accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:02.947 15:46:08 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.326 15:46:09 
accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:04.326 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.327 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.327 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.327 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:04.327 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.327 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.327 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.327 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:04.327 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.327 
15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.327 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.327 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:04.327 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:04.327 15:46:09 accel.accel_decomp_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:04.327 00:07:04.327 real 0m1.444s 00:07:04.327 user 0m4.670s 00:07:04.327 sys 0m0.164s 00:07:04.327 15:46:09 accel.accel_decomp_mcore -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:04.327 15:46:09 accel.accel_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:04.327 ************************************ 00:07:04.327 END TEST accel_decomp_mcore 00:07:04.327 ************************************ 00:07:04.327 15:46:09 accel -- accel/accel.sh@120 -- # run_test accel_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:04.327 15:46:09 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:07:04.327 15:46:09 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:04.327 15:46:09 accel -- common/autotest_common.sh@10 -- # set +x 00:07:04.327 ************************************ 00:07:04.327 START TEST accel_decomp_full_mcore 00:07:04.327 ************************************ 00:07:04.327 15:46:09 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:04.327 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:04.327 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:04.327 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.327 15:46:09 accel.accel_decomp_full_mcore -- 
accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:04.327 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.327 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:04.327 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:04.327 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:04.327 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:04.327 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:04.327 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:04.327 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:04.327 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:04.327 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:04.327 [2024-06-10 15:46:09.663237] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:07:04.327 [2024-06-10 15:46:09.663293] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2603413 ] 00:07:04.327 [2024-06-10 15:46:09.752435] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:04.586 [2024-06-10 15:46:09.847733] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:07:04.586 [2024-06-10 15:46:09.847828] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:07:04.586 [2024-06-10 15:46:09.847937] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:07:04.586 [2024-06-10 15:46:09.847938] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:04.586 15:46:09 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.586 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.587 15:46:09 
accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=software 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=software 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- 
# IFS=: 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:04.587 15:46:09 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:05.966 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:05.966 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:05.966 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:05.966 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:05.966 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:05.966 15:46:11 
accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:05.966 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:05.966 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:05.966 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:05.966 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:05.966 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:05.966 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 
-- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:05.967 00:07:05.967 real 0m1.459s 00:07:05.967 user 0m4.750s 00:07:05.967 sys 0m0.159s 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:05.967 15:46:11 accel.accel_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:05.967 ************************************ 00:07:05.967 END TEST accel_decomp_full_mcore 00:07:05.967 ************************************ 00:07:05.967 15:46:11 accel -- accel/accel.sh@121 -- # run_test accel_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:05.967 15:46:11 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:07:05.967 15:46:11 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:05.967 15:46:11 accel -- common/autotest_common.sh@10 -- # set +x 00:07:05.967 ************************************ 00:07:05.967 START TEST accel_decomp_mthread 
00:07:05.967 ************************************ 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:05.967 [2024-06-10 15:46:11.159835] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:07:05.967 [2024-06-10 15:46:11.159870] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2603667 ] 00:07:05.967 [2024-06-10 15:46:11.242241] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:05.967 [2024-06-10 15:46:11.333389] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- 
accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=software 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@22 -- # accel_module=software 00:07:05.967 15:46:11 
accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- 
accel/accel.sh@20 -- # val=Yes 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:05.967 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:05.968 15:46:11 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var 
val 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:07.346 00:07:07.346 real 0m1.415s 00:07:07.346 user 0m1.283s 00:07:07.346 sys 0m0.135s 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:07.346 15:46:12 accel.accel_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:07.346 
************************************ 00:07:07.346 END TEST accel_decomp_mthread 00:07:07.346 ************************************ 00:07:07.346 15:46:12 accel -- accel/accel.sh@122 -- # run_test accel_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:07.346 15:46:12 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:07:07.346 15:46:12 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:07.346 15:46:12 accel -- common/autotest_common.sh@10 -- # set +x 00:07:07.346 ************************************ 00:07:07.346 START TEST accel_decomp_full_mthread 00:07:07.346 ************************************ 00:07:07.346 15:46:12 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:07.346 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:07.346 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:07.346 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.346 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.346 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:07.346 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:07.346 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:07.346 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:07.346 15:46:12 
accel.accel_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:07.346 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:07.346 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:07.346 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:07.346 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:07.346 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:07.346 [2024-06-10 15:46:12.654052] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:07:07.346 [2024-06-10 15:46:12.654104] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2603911 ] 00:07:07.346 [2024-06-10 15:46:12.742315] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:07.346 [2024-06-10 15:46:12.833756] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:07.606 15:46:12 
accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # 
case "$var" in 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=software 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=software 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" 
in 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:07.606 15:46:12 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read 
-r var val 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" 
in 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n software ]] 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- accel/accel.sh@27 -- # [[ software == \s\o\f\t\w\a\r\e ]] 00:07:08.984 00:07:08.984 real 0m1.479s 00:07:08.984 user 0m1.330s 00:07:08.984 sys 0m0.152s 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:08.984 15:46:14 accel.accel_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:08.984 ************************************ 00:07:08.984 END TEST accel_decomp_full_mthread 00:07:08.984 ************************************ 00:07:08.984 15:46:14 accel -- accel/accel.sh@124 -- # [[ y == y ]] 00:07:08.984 15:46:14 accel -- accel/accel.sh@125 -- # COMPRESSDEV=1 00:07:08.984 15:46:14 accel -- accel/accel.sh@126 -- # get_expected_opcs 00:07:08.984 15:46:14 accel -- accel/accel.sh@60 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:08.984 15:46:14 accel -- accel/accel.sh@62 -- # spdk_tgt_pid=2604169 00:07:08.984 15:46:14 accel -- accel/accel.sh@63 -- # waitforlisten 2604169 00:07:08.984 15:46:14 accel -- common/autotest_common.sh@830 -- # '[' -z 2604169 ']' 00:07:08.985 15:46:14 accel -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:08.985 15:46:14 accel -- 
accel/accel.sh@61 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt -c /dev/fd/63 00:07:08.985 15:46:14 accel -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:08.985 15:46:14 accel -- accel/accel.sh@61 -- # build_accel_config 00:07:08.985 15:46:14 accel -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:08.985 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:08.985 15:46:14 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:08.985 15:46:14 accel -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:08.985 15:46:14 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:08.985 15:46:14 accel -- common/autotest_common.sh@10 -- # set +x 00:07:08.985 15:46:14 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:08.985 15:46:14 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:08.985 15:46:14 accel -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:08.985 15:46:14 accel -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:08.985 15:46:14 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:08.985 15:46:14 accel -- accel/accel.sh@41 -- # jq -r . 00:07:08.985 [2024-06-10 15:46:14.200638] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:07:08.985 [2024-06-10 15:46:14.200697] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2604169 ] 00:07:08.985 [2024-06-10 15:46:14.299031] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:08.985 [2024-06-10 15:46:14.395269] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:09.553 [2024-06-10 15:46:14.953368] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:09.812 15:46:15 accel -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:09.812 15:46:15 accel -- common/autotest_common.sh@863 -- # return 0 00:07:09.812 15:46:15 accel -- accel/accel.sh@65 -- # [[ 0 -gt 0 ]] 00:07:09.812 15:46:15 accel -- accel/accel.sh@66 -- # [[ 0 -gt 0 ]] 00:07:09.812 15:46:15 accel -- accel/accel.sh@67 -- # [[ 0 -gt 0 ]] 00:07:09.812 15:46:15 accel -- accel/accel.sh@68 -- # [[ -n 1 ]] 00:07:09.812 15:46:15 accel -- accel/accel.sh@68 -- # check_save_config compressdev_scan_accel_module 00:07:09.812 15:46:15 accel -- accel/accel.sh@56 -- # rpc_cmd save_config 00:07:09.812 15:46:15 accel -- accel/accel.sh@56 -- # jq -r '.subsystems[] | select(.subsystem=="accel").config[]' 00:07:09.812 15:46:15 accel -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:09.812 15:46:15 accel -- accel/accel.sh@56 -- # grep compressdev_scan_accel_module 00:07:09.812 15:46:15 accel -- common/autotest_common.sh@10 -- # set +x 00:07:09.812 15:46:15 accel -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:09.812 "method": "compressdev_scan_accel_module", 00:07:09.812 15:46:15 accel -- accel/accel.sh@70 -- # exp_opcs=($($rpc_py accel_get_opc_assignments | jq -r ". 
| to_entries | map(\"\(.key)=\(.value)\") | .[]")) 00:07:09.812 15:46:15 accel -- accel/accel.sh@70 -- # rpc_cmd accel_get_opc_assignments 00:07:09.812 15:46:15 accel -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:09.812 15:46:15 accel -- common/autotest_common.sh@10 -- # set +x 00:07:09.812 15:46:15 accel -- accel/accel.sh@70 -- # jq -r '. | to_entries | map("\(.key)=\(.value)") | .[]' 00:07:09.812 15:46:15 accel -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:10.071 15:46:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # IFS== 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:10.071 15:46:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.071 15:46:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # IFS== 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:10.071 15:46:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.071 15:46:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # IFS== 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:10.071 15:46:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.071 15:46:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # IFS== 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:10.071 15:46:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.071 15:46:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # IFS== 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:10.071 15:46:15 accel -- accel/accel.sh@73 -- # 
expected_opcs["$opc"]=software 00:07:10.071 15:46:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # IFS== 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:10.071 15:46:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.071 15:46:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # IFS== 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:10.071 15:46:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:07:10.071 15:46:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # IFS== 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:10.071 15:46:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=dpdk_compressdev 00:07:10.071 15:46:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # IFS== 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:10.071 15:46:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.071 15:46:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # IFS== 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:10.071 15:46:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.071 15:46:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # IFS== 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:10.071 15:46:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.071 15:46:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.071 15:46:15 accel -- 
accel/accel.sh@72 -- # IFS== 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:10.071 15:46:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.071 15:46:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # IFS== 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:10.071 15:46:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.071 15:46:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # IFS== 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:10.071 15:46:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.071 15:46:15 accel -- accel/accel.sh@71 -- # for opc_opt in "${exp_opcs[@]}" 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # IFS== 00:07:10.071 15:46:15 accel -- accel/accel.sh@72 -- # read -r opc module 00:07:10.071 15:46:15 accel -- accel/accel.sh@73 -- # expected_opcs["$opc"]=software 00:07:10.071 15:46:15 accel -- accel/accel.sh@75 -- # killprocess 2604169 00:07:10.071 15:46:15 accel -- common/autotest_common.sh@949 -- # '[' -z 2604169 ']' 00:07:10.071 15:46:15 accel -- common/autotest_common.sh@953 -- # kill -0 2604169 00:07:10.071 15:46:15 accel -- common/autotest_common.sh@954 -- # uname 00:07:10.071 15:46:15 accel -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:07:10.071 15:46:15 accel -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2604169 00:07:10.071 15:46:15 accel -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:07:10.071 15:46:15 accel -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:07:10.071 15:46:15 accel -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2604169' 00:07:10.071 killing process with pid 2604169 00:07:10.071 15:46:15 accel -- common/autotest_common.sh@968 -- # 
kill 2604169 00:07:10.071 15:46:15 accel -- common/autotest_common.sh@973 -- # wait 2604169 00:07:10.330 15:46:15 accel -- accel/accel.sh@76 -- # trap - ERR 00:07:10.330 15:46:15 accel -- accel/accel.sh@127 -- # run_test accel_cdev_comp accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:10.330 15:46:15 accel -- common/autotest_common.sh@1100 -- # '[' 8 -le 1 ']' 00:07:10.330 15:46:15 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:10.330 15:46:15 accel -- common/autotest_common.sh@10 -- # set +x 00:07:10.330 ************************************ 00:07:10.330 START TEST accel_cdev_comp 00:07:10.330 ************************************ 00:07:10.330 15:46:15 accel.accel_cdev_comp -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:10.330 15:46:15 accel.accel_cdev_comp -- accel/accel.sh@16 -- # local accel_opc 00:07:10.330 15:46:15 accel.accel_cdev_comp -- accel/accel.sh@17 -- # local accel_module 00:07:10.330 15:46:15 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:10.330 15:46:15 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:10.330 15:46:15 accel.accel_cdev_comp -- accel/accel.sh@15 -- # accel_perf -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:10.330 15:46:15 accel.accel_cdev_comp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w compress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:10.330 15:46:15 accel.accel_cdev_comp -- accel/accel.sh@12 -- # build_accel_config 00:07:10.330 15:46:15 accel.accel_cdev_comp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:10.330 15:46:15 accel.accel_cdev_comp -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:10.330 15:46:15 accel.accel_cdev_comp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:10.330 15:46:15 
accel.accel_cdev_comp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:10.330 15:46:15 accel.accel_cdev_comp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:10.330 15:46:15 accel.accel_cdev_comp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:10.330 15:46:15 accel.accel_cdev_comp -- accel/accel.sh@40 -- # local IFS=, 00:07:10.330 15:46:15 accel.accel_cdev_comp -- accel/accel.sh@41 -- # jq -r . 00:07:10.330 [2024-06-10 15:46:15.824517] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:07:10.330 [2024-06-10 15:46:15.824566] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2604580 ] 00:07:10.589 [2024-06-10 15:46:15.923896] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:10.589 [2024-06-10 15:46:16.013699] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:11.157 [2024-06-10 15:46:16.569541] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:11.157 [2024-06-10 15:46:16.571869] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x23128d0 PMD being used: compress_qat 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:11.157 [2024-06-10 15:46:16.575816] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2317620 PMD being used: compress_qat 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.157 15:46:16 accel.accel_cdev_comp 
-- accel/accel.sh@19 -- # IFS=: 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=0x1 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=compress 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@23 -- # accel_opc=compress 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 
00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.157 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=32 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.158 15:46:16 accel.accel_cdev_comp 
-- accel/accel.sh@20 -- # val=1 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val=No 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:11.158 15:46:16 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:12.536 15:46:17 
accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@20 -- # val= 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@21 -- # case "$var" in 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # IFS=: 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@19 -- # read -r var val 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ -n compress ]] 00:07:12.536 15:46:17 accel.accel_cdev_comp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:12.536 00:07:12.536 real 0m1.947s 00:07:12.536 user 0m1.542s 00:07:12.536 sys 0m0.408s 00:07:12.536 15:46:17 accel.accel_cdev_comp -- common/autotest_common.sh@1125 -- 
# xtrace_disable 00:07:12.536 15:46:17 accel.accel_cdev_comp -- common/autotest_common.sh@10 -- # set +x 00:07:12.536 ************************************ 00:07:12.536 END TEST accel_cdev_comp 00:07:12.536 ************************************ 00:07:12.536 15:46:17 accel -- accel/accel.sh@128 -- # run_test accel_cdev_decomp accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:12.536 15:46:17 accel -- common/autotest_common.sh@1100 -- # '[' 9 -le 1 ']' 00:07:12.536 15:46:17 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:12.536 15:46:17 accel -- common/autotest_common.sh@10 -- # set +x 00:07:12.536 ************************************ 00:07:12.536 START TEST accel_cdev_decomp 00:07:12.536 ************************************ 00:07:12.536 15:46:17 accel.accel_cdev_decomp -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:12.536 15:46:17 accel.accel_cdev_decomp -- accel/accel.sh@16 -- # local accel_opc 00:07:12.536 15:46:17 accel.accel_cdev_decomp -- accel/accel.sh@17 -- # local accel_module 00:07:12.536 15:46:17 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:12.536 15:46:17 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:12.536 15:46:17 accel.accel_cdev_decomp -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:12.537 15:46:17 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y 00:07:12.537 15:46:17 accel.accel_cdev_decomp -- accel/accel.sh@12 -- # build_accel_config 00:07:12.537 15:46:17 accel.accel_cdev_decomp -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:12.537 15:46:17 accel.accel_cdev_decomp -- accel/accel.sh@32 
-- # [[ 0 -gt 0 ]] 00:07:12.537 15:46:17 accel.accel_cdev_decomp -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:12.537 15:46:17 accel.accel_cdev_decomp -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:12.537 15:46:17 accel.accel_cdev_decomp -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:12.537 15:46:17 accel.accel_cdev_decomp -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:12.537 15:46:17 accel.accel_cdev_decomp -- accel/accel.sh@40 -- # local IFS=, 00:07:12.537 15:46:17 accel.accel_cdev_decomp -- accel/accel.sh@41 -- # jq -r . 00:07:12.537 [2024-06-10 15:46:17.824344] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:07:12.537 [2024-06-10 15:46:17.824394] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2604869 ] 00:07:12.537 [2024-06-10 15:46:17.923716] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:12.537 [2024-06-10 15:46:18.013806] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:13.109 [2024-06-10 15:46:18.574075] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:13.109 [2024-06-10 15:46:18.576375] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x15298d0 PMD being used: compress_qat 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:13.109 [2024-06-10 15:46:18.580438] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x152e620 PMD being 
used: compress_qat 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=0x1 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=decompress 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- 
accel/accel.sh@19 -- # read -r var val 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:13.109 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=32 00:07:13.110 15:46:18 
accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=1 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val='1 seconds' 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val=Yes 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:13.110 15:46:18 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:14.497 15:46:19 
accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@20 -- # val= 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@21 -- # case "$var" in 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # IFS=: 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@19 -- # read -r var val 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:14.497 
15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:14.497 00:07:14.497 real 0m1.952s 00:07:14.497 user 0m1.563s 00:07:14.497 sys 0m0.388s 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:14.497 15:46:19 accel.accel_cdev_decomp -- common/autotest_common.sh@10 -- # set +x 00:07:14.497 ************************************ 00:07:14.497 END TEST accel_cdev_decomp 00:07:14.497 ************************************ 00:07:14.497 15:46:19 accel -- accel/accel.sh@129 -- # run_test accel_cdev_decomp_full accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:14.497 15:46:19 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:07:14.497 15:46:19 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:14.497 15:46:19 accel -- common/autotest_common.sh@10 -- # set +x 00:07:14.497 ************************************ 00:07:14.497 START TEST accel_cdev_decomp_full 00:07:14.497 ************************************ 00:07:14.497 15:46:19 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:14.497 15:46:19 accel.accel_cdev_decomp_full -- accel/accel.sh@16 -- # local accel_opc 00:07:14.497 15:46:19 accel.accel_cdev_decomp_full -- accel/accel.sh@17 -- # local accel_module 00:07:14.497 15:46:19 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:14.497 15:46:19 accel.accel_cdev_decomp_full -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:14.497 15:46:19 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:14.497 15:46:19 accel.accel_cdev_decomp_full 
-- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 00:07:14.497 15:46:19 accel.accel_cdev_decomp_full -- accel/accel.sh@12 -- # build_accel_config 00:07:14.497 15:46:19 accel.accel_cdev_decomp_full -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:14.497 15:46:19 accel.accel_cdev_decomp_full -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:14.497 15:46:19 accel.accel_cdev_decomp_full -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:14.497 15:46:19 accel.accel_cdev_decomp_full -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:14.497 15:46:19 accel.accel_cdev_decomp_full -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:14.497 15:46:19 accel.accel_cdev_decomp_full -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:14.497 15:46:19 accel.accel_cdev_decomp_full -- accel/accel.sh@40 -- # local IFS=, 00:07:14.497 15:46:19 accel.accel_cdev_decomp_full -- accel/accel.sh@41 -- # jq -r . 00:07:14.497 [2024-06-10 15:46:19.833904] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:07:14.497 [2024-06-10 15:46:19.834020] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2605226 ] 00:07:14.497 [2024-06-10 15:46:19.930754] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:14.756 [2024-06-10 15:46:20.027240] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:15.323 [2024-06-10 15:46:20.604221] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:15.323 [2024-06-10 15:46:20.606505] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1fd78d0 PMD being used: compress_qat 00:07:15.323 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:15.323 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:15.323 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:15.323 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:15.323 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:15.323 [2024-06-10 15:46:20.609737] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1fd7970 PMD being used: compress_qat 00:07:15.323 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:15.323 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:15.323 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- 
accel/accel.sh@20 -- # val=0x1 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=decompress 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 
00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=32 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=1 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- 
accel/accel.sh@19 -- # IFS=: 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val='1 seconds' 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val=Yes 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:15.324 15:46:20 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:16.261 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:16.261 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:16.261 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:16.261 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:16.261 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:16.261 15:46:21 
accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:16.261 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:16.261 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:16.261 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:16.261 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:16.261 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:16.261 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:16.519 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:16.519 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:16.519 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:16.519 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:16.519 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:16.519 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:16.519 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:16.520 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:16.520 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@20 -- # val= 00:07:16.520 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@21 -- # case "$var" in 00:07:16.520 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # IFS=: 00:07:16.520 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@19 -- # read -r var val 00:07:16.520 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:16.520 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:16.520 15:46:21 accel.accel_cdev_decomp_full -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:16.520 
00:07:16.520 real 0m1.964s 00:07:16.520 user 0m1.573s 00:07:16.520 sys 0m0.393s 00:07:16.520 15:46:21 accel.accel_cdev_decomp_full -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:16.520 15:46:21 accel.accel_cdev_decomp_full -- common/autotest_common.sh@10 -- # set +x 00:07:16.520 ************************************ 00:07:16.520 END TEST accel_cdev_decomp_full 00:07:16.520 ************************************ 00:07:16.520 15:46:21 accel -- accel/accel.sh@130 -- # run_test accel_cdev_decomp_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:16.520 15:46:21 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:07:16.520 15:46:21 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:16.520 15:46:21 accel -- common/autotest_common.sh@10 -- # set +x 00:07:16.520 ************************************ 00:07:16.520 START TEST accel_cdev_decomp_mcore 00:07:16.520 ************************************ 00:07:16.520 15:46:21 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:16.520 15:46:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:16.520 15:46:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:16.520 15:46:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:16.520 15:46:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:16.520 15:46:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:16.520 15:46:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -m 0xf 00:07:16.520 15:46:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:16.520 15:46:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:16.520 15:46:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:16.520 15:46:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:16.520 15:46:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:16.520 15:46:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:16.520 15:46:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:16.520 15:46:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:16.520 15:46:21 accel.accel_cdev_decomp_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:16.520 [2024-06-10 15:46:21.872585] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:07:16.520 [2024-06-10 15:46:21.872634] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2605586 ] 00:07:16.520 [2024-06-10 15:46:21.969591] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:16.782 [2024-06-10 15:46:22.063410] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:07:16.782 [2024-06-10 15:46:22.063505] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:07:16.782 [2024-06-10 15:46:22.063612] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:07:16.782 [2024-06-10 15:46:22.063613] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:17.353 [2024-06-10 15:46:22.611494] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:17.353 [2024-06-10 15:46:22.613772] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x2288f40 PMD being used: compress_qat 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.353 15:46:22 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.353 [2024-06-10 15:46:22.619137] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f9cd819b8b0 PMD being used: compress_qat 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:17.353 [2024-06-10 15:46:22.619511] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f9cd019b8b0 PMD being used: compress_qat 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.353 [2024-06-10 15:46:22.620871] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x228e440 PMD being used: compress_qat 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.353 [2024-06-10 15:46:22.621137] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f9cc819b8b0 PMD being used: compress_qat 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=decompress 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.353 15:46:22 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:17.353 15:46:22 
accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=32 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.353 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=1 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- 
accel/accel.sh@19 -- # read -r var val 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:17.354 15:46:22 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:18.290 
15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:18.290 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@20 -- # val= 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:18.291 00:07:18.291 real 0m1.955s 00:07:18.291 user 0m6.482s 00:07:18.291 
sys 0m0.389s 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:18.291 15:46:23 accel.accel_cdev_decomp_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:18.291 ************************************ 00:07:18.291 END TEST accel_cdev_decomp_mcore 00:07:18.291 ************************************ 00:07:18.550 15:46:23 accel -- accel/accel.sh@131 -- # run_test accel_cdev_decomp_full_mcore accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:18.550 15:46:23 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:07:18.550 15:46:23 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:18.550 15:46:23 accel -- common/autotest_common.sh@10 -- # set +x 00:07:18.550 ************************************ 00:07:18.550 START TEST accel_cdev_decomp_full_mcore 00:07:18.550 ************************************ 00:07:18.550 15:46:23 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:18.550 15:46:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@16 -- # local accel_opc 00:07:18.550 15:46:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@17 -- # local accel_module 00:07:18.550 15:46:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:18.550 15:46:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:18.550 15:46:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:18.550 15:46:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -m 0xf 00:07:18.550 15:46:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@12 -- # build_accel_config 00:07:18.550 15:46:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:18.550 15:46:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:18.550 15:46:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:18.550 15:46:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:18.550 15:46:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:18.550 15:46:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:18.550 15:46:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@40 -- # local IFS=, 00:07:18.550 15:46:23 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@41 -- # jq -r . 00:07:18.550 [2024-06-10 15:46:23.893450] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:07:18.550 [2024-06-10 15:46:23.893501] [ DPDK EAL parameters: accel_perf --no-shconf -c 0xf --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2605888 ] 00:07:18.550 [2024-06-10 15:46:23.991438] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 4 00:07:18.809 [2024-06-10 15:46:24.086409] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:07:18.809 [2024-06-10 15:46:24.086503] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:07:18.809 [2024-06-10 15:46:24.086614] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 3 00:07:18.810 [2024-06-10 15:46:24.086615] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:19.440 [2024-06-10 15:46:24.639156] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:19.440 [2024-06-10 15:46:24.641436] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x132af40 PMD being used: compress_qat 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case 
"$var" in 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:19.440 [2024-06-10 15:46:24.645929] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7faa3419b8b0 PMD being used: compress_qat 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=0xf 00:07:19.440 [2024-06-10 15:46:24.646317] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7faa2c19b8b0 PMD being used: compress_qat 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:19.440 [2024-06-10 15:46:24.647753] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x132afe0 PMD being used: compress_qat 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:19.440 [2024-06-10 15:46:24.647971] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7faa2419b8b0 PMD being used: compress_qat 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=decompress 
00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 
00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=32 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=1 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val='1 seconds' 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val=Yes 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 
00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:19.440 15:46:24 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:20.378 15:46:25 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@20 -- # val= 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@21 -- # case "$var" in 00:07:20.378 15:46:25 
accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # IFS=: 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@19 -- # read -r var val 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:20.378 00:07:20.378 real 0m1.963s 00:07:20.378 user 0m6.492s 00:07:20.378 sys 0m0.393s 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:20.378 15:46:25 accel.accel_cdev_decomp_full_mcore -- common/autotest_common.sh@10 -- # set +x 00:07:20.378 ************************************ 00:07:20.378 END TEST accel_cdev_decomp_full_mcore 00:07:20.378 ************************************ 00:07:20.378 15:46:25 accel -- accel/accel.sh@132 -- # run_test accel_cdev_decomp_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:20.378 15:46:25 accel -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:07:20.378 15:46:25 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:20.378 15:46:25 accel -- common/autotest_common.sh@10 -- # set +x 00:07:20.637 ************************************ 00:07:20.637 START TEST accel_cdev_decomp_mthread 00:07:20.637 ************************************ 00:07:20.637 15:46:25 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:20.637 15:46:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:20.637 15:46:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:20.637 15:46:25 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:20.637 15:46:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:20.637 15:46:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:20.637 15:46:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -T 2 00:07:20.637 15:46:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:20.637 15:46:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:20.637 15:46:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:20.638 15:46:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:20.638 15:46:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:20.638 15:46:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:20.638 15:46:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:20.638 15:46:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:20.638 15:46:25 accel.accel_cdev_decomp_mthread -- accel/accel.sh@41 -- # jq -r . 00:07:20.638 [2024-06-10 15:46:25.924825] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:07:20.638 [2024-06-10 15:46:25.924877] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2606304 ] 00:07:20.638 [2024-06-10 15:46:26.025036] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:20.638 [2024-06-10 15:46:26.114666] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:21.206 [2024-06-10 15:46:26.678234] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:21.206 [2024-06-10 15:46:26.680575] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x238d8d0 PMD being used: compress_qat 00:07:21.206 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:21.206 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:21.206 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:21.206 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:21.206 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:21.206 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:21.206 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:21.206 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:21.206 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:21.206 [2024-06-10 15:46:26.685387] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x23929d0 PMD being used: compress_qat 00:07:21.206 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:21.206 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:21.206 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:21.206 
15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:21.207 [2024-06-10 15:46:26.687806] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x24b5810 PMD being used: compress_qat 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='4096 bytes' 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:21.207 15:46:26 
accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=32 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:21.207 
15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=2 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:21.207 15:46:26 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 
00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 
00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@20 -- # val= 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:22.584 00:07:22.584 real 0m1.958s 00:07:22.584 user 0m1.567s 00:07:22.584 sys 0m0.388s 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:22.584 15:46:27 accel.accel_cdev_decomp_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:22.584 ************************************ 00:07:22.584 END TEST accel_cdev_decomp_mthread 00:07:22.584 ************************************ 00:07:22.585 15:46:27 accel -- accel/accel.sh@133 -- # run_test accel_cdev_decomp_full_mthread accel_test -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:22.585 15:46:27 accel -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:07:22.585 15:46:27 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:22.585 15:46:27 accel -- common/autotest_common.sh@10 -- # set +x 00:07:22.585 ************************************ 00:07:22.585 START TEST accel_cdev_decomp_full_mthread 00:07:22.585 ************************************ 00:07:22.585 15:46:27 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1124 -- # accel_test -t 1 -w decompress -l 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:22.585 15:46:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@16 -- # local accel_opc 00:07:22.585 15:46:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@17 -- # local accel_module 00:07:22.585 15:46:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:22.585 15:46:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:22.585 15:46:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@15 -- # accel_perf -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:22.585 15:46:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/accel_perf -c /dev/fd/62 -t 1 -w decompress -l /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib -y -o 0 -T 2 00:07:22.585 15:46:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@12 -- # build_accel_config 00:07:22.585 15:46:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:22.585 15:46:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:22.585 15:46:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:22.585 15:46:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:22.585 15:46:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@36 -- # [[ -n 1 ]] 00:07:22.585 15:46:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@37 -- # accel_json_cfg+=('{"method": "compressdev_scan_accel_module", "params":{"pmd": 0}}') 00:07:22.585 15:46:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@40 -- # local IFS=, 00:07:22.585 15:46:27 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@41 -- # jq -r . 
00:07:22.585 [2024-06-10 15:46:27.948866] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:07:22.585 [2024-06-10 15:46:27.948921] [ DPDK EAL parameters: accel_perf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2606565 ] 00:07:22.585 [2024-06-10 15:46:28.038543] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:22.844 [2024-06-10 15:46:28.129579] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:23.412 [2024-06-10 15:46:28.691068] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:07:23.412 [2024-06-10 15:46:28.693354] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1a1b8d0 PMD being used: compress_qat 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:23.412 [2024-06-10 15:46:28.697293] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1a1b970 PMD being used: compress_qat 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:23.412 15:46:28 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=0x1 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:23.412 [2024-06-10 15:46:28.699995] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b43420 PMD being used: compress_qat 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=decompress 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@23 -- # accel_opc=decompress 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:23.412 15:46:28 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='111250 bytes' 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=dpdk_compressdev 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@22 -- # accel_module=dpdk_compressdev 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/bib 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 
00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=32 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=2 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val='1 seconds' 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val=Yes 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- 
accel/accel.sh@19 -- # read -r var val 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:23.412 15:46:28 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:24.790 15:46:29 
accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@20 -- # val= 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@21 -- # case "$var" in 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # IFS=: 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@19 -- # read -r var val 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n dpdk_compressdev ]] 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ -n decompress ]] 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- accel/accel.sh@27 -- # [[ dpdk_compressdev == \d\p\d\k\_\c\o\m\p\r\e\s\s\d\e\v ]] 00:07:24.790 00:07:24.790 real 0m1.950s 00:07:24.790 user 0m1.569s 00:07:24.790 sys 0m0.381s 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:24.790 15:46:29 accel.accel_cdev_decomp_full_mthread -- common/autotest_common.sh@10 -- # set +x 00:07:24.790 ************************************ 00:07:24.790 END TEST accel_cdev_decomp_full_mthread 00:07:24.790 ************************************ 00:07:24.790 15:46:29 accel -- accel/accel.sh@134 -- 
# unset COMPRESSDEV 00:07:24.790 15:46:29 accel -- accel/accel.sh@137 -- # run_test accel_dif_functional_tests /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:24.790 15:46:29 accel -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:07:24.790 15:46:29 accel -- accel/accel.sh@137 -- # build_accel_config 00:07:24.790 15:46:29 accel -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:24.790 15:46:29 accel -- accel/accel.sh@31 -- # accel_json_cfg=() 00:07:24.790 15:46:29 accel -- common/autotest_common.sh@10 -- # set +x 00:07:24.790 15:46:29 accel -- accel/accel.sh@32 -- # [[ 0 -gt 0 ]] 00:07:24.790 15:46:29 accel -- accel/accel.sh@33 -- # [[ 0 -gt 0 ]] 00:07:24.790 15:46:29 accel -- accel/accel.sh@34 -- # [[ 0 -gt 0 ]] 00:07:24.790 15:46:29 accel -- accel/accel.sh@36 -- # [[ -n '' ]] 00:07:24.790 15:46:29 accel -- accel/accel.sh@40 -- # local IFS=, 00:07:24.790 15:46:29 accel -- accel/accel.sh@41 -- # jq -r . 00:07:24.790 ************************************ 00:07:24.790 START TEST accel_dif_functional_tests 00:07:24.790 ************************************ 00:07:24.790 15:46:29 accel.accel_dif_functional_tests -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/dif/dif -c /dev/fd/62 00:07:24.790 [2024-06-10 15:46:29.983468] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:07:24.790 [2024-06-10 15:46:29.983519] [ DPDK EAL parameters: DIF --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2607024 ] 00:07:24.790 [2024-06-10 15:46:30.084392] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:24.790 [2024-06-10 15:46:30.176913] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:07:24.790 [2024-06-10 15:46:30.177014] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:07:24.790 [2024-06-10 15:46:30.177019] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:24.790 00:07:24.790 00:07:24.790 CUnit - A unit testing framework for C - Version 2.1-3 00:07:24.790 http://cunit.sourceforge.net/ 00:07:24.790 00:07:24.790 00:07:24.790 Suite: accel_dif 00:07:24.790 Test: verify: DIF generated, GUARD check ...passed 00:07:24.790 Test: verify: DIF generated, APPTAG check ...passed 00:07:24.790 Test: verify: DIF generated, REFTAG check ...passed 00:07:24.790 Test: verify: DIF not generated, GUARD check ...[2024-06-10 15:46:30.267319] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:24.790 passed 00:07:24.790 Test: verify: DIF not generated, APPTAG check ...[2024-06-10 15:46:30.267382] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:24.790 passed 00:07:24.790 Test: verify: DIF not generated, REFTAG check ...[2024-06-10 15:46:30.267410] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:24.790 passed 00:07:24.790 Test: verify: APPTAG correct, APPTAG check ...passed 00:07:24.790 Test: verify: APPTAG incorrect, APPTAG check ...[2024-06-10 15:46:30.267471] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=30, Expected=28, Actual=14 00:07:24.790 passed 00:07:24.790 Test: verify: APPTAG 
incorrect, no APPTAG check ...passed 00:07:24.790 Test: verify: REFTAG incorrect, REFTAG ignore ...passed 00:07:24.790 Test: verify: REFTAG_INIT correct, REFTAG check ...passed 00:07:24.790 Test: verify: REFTAG_INIT incorrect, REFTAG check ...[2024-06-10 15:46:30.267617] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=10 00:07:24.790 passed 00:07:24.790 Test: verify copy: DIF generated, GUARD check ...passed 00:07:24.790 Test: verify copy: DIF generated, APPTAG check ...passed 00:07:24.790 Test: verify copy: DIF generated, REFTAG check ...passed 00:07:24.790 Test: verify copy: DIF not generated, GUARD check ...[2024-06-10 15:46:30.267777] dif.c: 826:_dif_verify: *ERROR*: Failed to compare Guard: LBA=10, Expected=5a5a, Actual=7867 00:07:24.790 passed 00:07:24.790 Test: verify copy: DIF not generated, APPTAG check ...[2024-06-10 15:46:30.267811] dif.c: 841:_dif_verify: *ERROR*: Failed to compare App Tag: LBA=10, Expected=14, Actual=5a5a 00:07:24.790 passed 00:07:24.790 Test: verify copy: DIF not generated, REFTAG check ...[2024-06-10 15:46:30.267841] dif.c: 776:_dif_reftag_check: *ERROR*: Failed to compare Ref Tag: LBA=10, Expected=a, Actual=5a5a5a5a 00:07:24.790 passed 00:07:24.790 Test: generate copy: DIF generated, GUARD check ...passed 00:07:24.791 Test: generate copy: DIF generated, APTTAG check ...passed 00:07:24.791 Test: generate copy: DIF generated, REFTAG check ...passed 00:07:24.791 Test: generate copy: DIF generated, no GUARD check flag set ...passed 00:07:24.791 Test: generate copy: DIF generated, no APPTAG check flag set ...passed 00:07:24.791 Test: generate copy: DIF generated, no REFTAG check flag set ...passed 00:07:24.791 Test: generate copy: iovecs-len validate ...[2024-06-10 15:46:30.268087] dif.c:1190:spdk_dif_generate_copy: *ERROR*: Size of bounce_iovs arrays are not valid or misaligned with block_size. 
00:07:24.791 passed 00:07:24.791 Test: generate copy: buffer alignment validate ...passed 00:07:24.791 00:07:24.791 Run Summary: Type Total Ran Passed Failed Inactive 00:07:24.791 suites 1 1 n/a 0 0 00:07:24.791 tests 26 26 26 0 0 00:07:24.791 asserts 115 115 115 0 n/a 00:07:24.791 00:07:24.791 Elapsed time = 0.002 seconds 00:07:25.050 00:07:25.050 real 0m0.524s 00:07:25.050 user 0m0.750s 00:07:25.050 sys 0m0.171s 00:07:25.050 15:46:30 accel.accel_dif_functional_tests -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:25.050 15:46:30 accel.accel_dif_functional_tests -- common/autotest_common.sh@10 -- # set +x 00:07:25.050 ************************************ 00:07:25.050 END TEST accel_dif_functional_tests 00:07:25.050 ************************************ 00:07:25.050 00:07:25.050 real 0m49.238s 00:07:25.050 user 0m59.171s 00:07:25.050 sys 0m8.788s 00:07:25.050 15:46:30 accel -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:25.050 15:46:30 accel -- common/autotest_common.sh@10 -- # set +x 00:07:25.050 ************************************ 00:07:25.050 END TEST accel 00:07:25.050 ************************************ 00:07:25.050 15:46:30 -- spdk/autotest.sh@184 -- # run_test accel_rpc /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:25.050 15:46:30 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:25.050 15:46:30 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:25.050 15:46:30 -- common/autotest_common.sh@10 -- # set +x 00:07:25.308 ************************************ 00:07:25.308 START TEST accel_rpc 00:07:25.308 ************************************ 00:07:25.308 15:46:30 accel_rpc -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel/accel_rpc.sh 00:07:25.308 * Looking for test storage... 
00:07:25.308 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/accel 00:07:25.308 15:46:30 accel_rpc -- accel/accel_rpc.sh@11 -- # trap 'killprocess $spdk_tgt_pid; exit 1' ERR 00:07:25.308 15:46:30 accel_rpc -- accel/accel_rpc.sh@14 -- # spdk_tgt_pid=2607100 00:07:25.308 15:46:30 accel_rpc -- accel/accel_rpc.sh@15 -- # waitforlisten 2607100 00:07:25.308 15:46:30 accel_rpc -- accel/accel_rpc.sh@13 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --wait-for-rpc 00:07:25.308 15:46:30 accel_rpc -- common/autotest_common.sh@830 -- # '[' -z 2607100 ']' 00:07:25.308 15:46:30 accel_rpc -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:25.308 15:46:30 accel_rpc -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:25.308 15:46:30 accel_rpc -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:25.308 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:25.308 15:46:30 accel_rpc -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:25.308 15:46:30 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:25.308 [2024-06-10 15:46:30.711016] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:07:25.308 [2024-06-10 15:46:30.711084] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2607100 ] 00:07:25.308 [2024-06-10 15:46:30.803058] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:25.566 [2024-06-10 15:46:30.897848] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:26.502 15:46:31 accel_rpc -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:26.502 15:46:31 accel_rpc -- common/autotest_common.sh@863 -- # return 0 00:07:26.502 15:46:31 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ y == y ]] 00:07:26.502 15:46:31 accel_rpc -- accel/accel_rpc.sh@45 -- # [[ 0 -gt 0 ]] 00:07:26.502 15:46:31 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ y == y ]] 00:07:26.502 15:46:31 accel_rpc -- accel/accel_rpc.sh@49 -- # [[ 0 -gt 0 ]] 00:07:26.502 15:46:31 accel_rpc -- accel/accel_rpc.sh@53 -- # run_test accel_assign_opcode accel_assign_opcode_test_suite 00:07:26.502 15:46:31 accel_rpc -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:26.502 15:46:31 accel_rpc -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:26.502 15:46:31 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:26.502 ************************************ 00:07:26.502 START TEST accel_assign_opcode 00:07:26.502 ************************************ 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1124 -- # accel_assign_opcode_test_suite 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@38 -- # rpc_cmd accel_assign_opc -o copy -m incorrect 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:26.502 [2024-06-10 15:46:31.692293] accel_rpc.c: 167:rpc_accel_assign_opc: 
*NOTICE*: Operation copy will be assigned to module incorrect 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@40 -- # rpc_cmd accel_assign_opc -o copy -m software 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:26.502 [2024-06-10 15:46:31.700304] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation copy will be assigned to module software 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@41 -- # rpc_cmd framework_start_init 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # rpc_cmd accel_get_opc_assignments 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # jq -r .copy 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- accel/accel_rpc.sh@42 -- # grep software 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:26.502 software 00:07:26.502 00:07:26.502 real 0m0.274s 00:07:26.502 user 0m0.052s 00:07:26.502 sys 0m0.010s 00:07:26.502 15:46:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:26.502 
15:46:31 accel_rpc.accel_assign_opcode -- common/autotest_common.sh@10 -- # set +x 00:07:26.502 ************************************ 00:07:26.502 END TEST accel_assign_opcode 00:07:26.502 ************************************ 00:07:26.502 15:46:31 accel_rpc -- accel/accel_rpc.sh@55 -- # killprocess 2607100 00:07:26.502 15:46:31 accel_rpc -- common/autotest_common.sh@949 -- # '[' -z 2607100 ']' 00:07:26.502 15:46:31 accel_rpc -- common/autotest_common.sh@953 -- # kill -0 2607100 00:07:26.503 15:46:31 accel_rpc -- common/autotest_common.sh@954 -- # uname 00:07:26.503 15:46:31 accel_rpc -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:07:26.503 15:46:31 accel_rpc -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2607100 00:07:26.771 15:46:32 accel_rpc -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:07:26.771 15:46:32 accel_rpc -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:07:26.771 15:46:32 accel_rpc -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2607100' 00:07:26.771 killing process with pid 2607100 00:07:26.771 15:46:32 accel_rpc -- common/autotest_common.sh@968 -- # kill 2607100 00:07:26.771 15:46:32 accel_rpc -- common/autotest_common.sh@973 -- # wait 2607100 00:07:27.034 00:07:27.034 real 0m1.819s 00:07:27.034 user 0m1.994s 00:07:27.034 sys 0m0.471s 00:07:27.034 15:46:32 accel_rpc -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:27.034 15:46:32 accel_rpc -- common/autotest_common.sh@10 -- # set +x 00:07:27.034 ************************************ 00:07:27.034 END TEST accel_rpc 00:07:27.034 ************************************ 00:07:27.034 15:46:32 -- spdk/autotest.sh@185 -- # run_test app_cmdline /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:07:27.034 15:46:32 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:27.034 15:46:32 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:27.034 15:46:32 -- common/autotest_common.sh@10 -- 
# set +x 00:07:27.034 ************************************ 00:07:27.034 START TEST app_cmdline 00:07:27.034 ************************************ 00:07:27.034 15:46:32 app_cmdline -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/cmdline.sh 00:07:27.034 * Looking for test storage... 00:07:27.034 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:07:27.034 15:46:32 app_cmdline -- app/cmdline.sh@14 -- # trap 'killprocess $spdk_tgt_pid' EXIT 00:07:27.034 15:46:32 app_cmdline -- app/cmdline.sh@17 -- # spdk_tgt_pid=2607615 00:07:27.034 15:46:32 app_cmdline -- app/cmdline.sh@18 -- # waitforlisten 2607615 00:07:27.034 15:46:32 app_cmdline -- app/cmdline.sh@16 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt --rpcs-allowed spdk_get_version,rpc_get_methods 00:07:27.034 15:46:32 app_cmdline -- common/autotest_common.sh@830 -- # '[' -z 2607615 ']' 00:07:27.034 15:46:32 app_cmdline -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:27.034 15:46:32 app_cmdline -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:27.034 15:46:32 app_cmdline -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:27.034 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:27.034 15:46:32 app_cmdline -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:27.034 15:46:32 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:27.292 [2024-06-10 15:46:32.597730] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:07:27.292 [2024-06-10 15:46:32.597792] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2607615 ] 00:07:27.292 [2024-06-10 15:46:32.695602] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:27.292 [2024-06-10 15:46:32.792858] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:28.229 15:46:33 app_cmdline -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:28.229 15:46:33 app_cmdline -- common/autotest_common.sh@863 -- # return 0 00:07:28.229 15:46:33 app_cmdline -- app/cmdline.sh@20 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py spdk_get_version 00:07:28.488 { 00:07:28.488 "version": "SPDK v24.09-pre git sha1 8d1bffc3d", 00:07:28.488 "fields": { 00:07:28.488 "major": 24, 00:07:28.488 "minor": 9, 00:07:28.489 "patch": 0, 00:07:28.489 "suffix": "-pre", 00:07:28.489 "commit": "8d1bffc3d" 00:07:28.489 } 00:07:28.489 } 00:07:28.489 15:46:33 app_cmdline -- app/cmdline.sh@22 -- # expected_methods=() 00:07:28.489 15:46:33 app_cmdline -- app/cmdline.sh@23 -- # expected_methods+=("rpc_get_methods") 00:07:28.489 15:46:33 app_cmdline -- app/cmdline.sh@24 -- # expected_methods+=("spdk_get_version") 00:07:28.489 15:46:33 app_cmdline -- app/cmdline.sh@26 -- # methods=($(rpc_cmd rpc_get_methods | jq -r ".[]" | sort)) 00:07:28.489 15:46:33 app_cmdline -- app/cmdline.sh@26 -- # rpc_cmd rpc_get_methods 00:07:28.489 15:46:33 app_cmdline -- app/cmdline.sh@26 -- # jq -r '.[]' 00:07:28.489 15:46:33 app_cmdline -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:28.489 15:46:33 app_cmdline -- app/cmdline.sh@26 -- # sort 00:07:28.489 15:46:33 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:28.489 15:46:33 app_cmdline -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:28.489 15:46:33 app_cmdline -- app/cmdline.sh@27 -- # (( 2 == 
2 )) 00:07:28.489 15:46:33 app_cmdline -- app/cmdline.sh@28 -- # [[ rpc_get_methods spdk_get_version == \r\p\c\_\g\e\t\_\m\e\t\h\o\d\s\ \s\p\d\k\_\g\e\t\_\v\e\r\s\i\o\n ]] 00:07:28.489 15:46:33 app_cmdline -- app/cmdline.sh@30 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:28.489 15:46:33 app_cmdline -- common/autotest_common.sh@649 -- # local es=0 00:07:28.489 15:46:33 app_cmdline -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:28.489 15:46:33 app_cmdline -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:28.489 15:46:33 app_cmdline -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:07:28.489 15:46:33 app_cmdline -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:28.489 15:46:33 app_cmdline -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:07:28.489 15:46:33 app_cmdline -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:28.489 15:46:33 app_cmdline -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:07:28.489 15:46:33 app_cmdline -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:07:28.489 15:46:33 app_cmdline -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:07:28.489 15:46:33 app_cmdline -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py env_dpdk_get_mem_stats 00:07:28.747 request: 00:07:28.747 { 00:07:28.747 "method": "env_dpdk_get_mem_stats", 00:07:28.747 "req_id": 1 00:07:28.747 } 00:07:28.747 Got JSON-RPC error response 00:07:28.747 response: 00:07:28.747 { 00:07:28.747 "code": -32601, 00:07:28.747 
"message": "Method not found" 00:07:28.747 } 00:07:28.747 15:46:34 app_cmdline -- common/autotest_common.sh@652 -- # es=1 00:07:28.747 15:46:34 app_cmdline -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:07:28.747 15:46:34 app_cmdline -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:07:28.747 15:46:34 app_cmdline -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:07:28.747 15:46:34 app_cmdline -- app/cmdline.sh@1 -- # killprocess 2607615 00:07:28.747 15:46:34 app_cmdline -- common/autotest_common.sh@949 -- # '[' -z 2607615 ']' 00:07:28.747 15:46:34 app_cmdline -- common/autotest_common.sh@953 -- # kill -0 2607615 00:07:28.747 15:46:34 app_cmdline -- common/autotest_common.sh@954 -- # uname 00:07:28.747 15:46:34 app_cmdline -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:07:28.747 15:46:34 app_cmdline -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2607615 00:07:28.747 15:46:34 app_cmdline -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:07:28.747 15:46:34 app_cmdline -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:07:28.747 15:46:34 app_cmdline -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2607615' 00:07:28.747 killing process with pid 2607615 00:07:28.747 15:46:34 app_cmdline -- common/autotest_common.sh@968 -- # kill 2607615 00:07:28.747 15:46:34 app_cmdline -- common/autotest_common.sh@973 -- # wait 2607615 00:07:29.005 00:07:29.005 real 0m2.045s 00:07:29.005 user 0m2.591s 00:07:29.005 sys 0m0.519s 00:07:29.005 15:46:34 app_cmdline -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:29.005 15:46:34 app_cmdline -- common/autotest_common.sh@10 -- # set +x 00:07:29.005 ************************************ 00:07:29.005 END TEST app_cmdline 00:07:29.005 ************************************ 00:07:29.264 15:46:34 -- spdk/autotest.sh@186 -- # run_test version /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:07:29.264 15:46:34 -- 
common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:29.264 15:46:34 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:29.264 15:46:34 -- common/autotest_common.sh@10 -- # set +x 00:07:29.264 ************************************ 00:07:29.264 START TEST version 00:07:29.264 ************************************ 00:07:29.264 15:46:34 version -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/version.sh 00:07:29.264 * Looking for test storage... 00:07:29.264 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:07:29.264 15:46:34 version -- app/version.sh@17 -- # get_header_version major 00:07:29.264 15:46:34 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MAJOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:29.264 15:46:34 version -- app/version.sh@14 -- # cut -f2 00:07:29.264 15:46:34 version -- app/version.sh@14 -- # tr -d '"' 00:07:29.264 15:46:34 version -- app/version.sh@17 -- # major=24 00:07:29.264 15:46:34 version -- app/version.sh@18 -- # get_header_version minor 00:07:29.264 15:46:34 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_MINOR[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:29.264 15:46:34 version -- app/version.sh@14 -- # cut -f2 00:07:29.264 15:46:34 version -- app/version.sh@14 -- # tr -d '"' 00:07:29.264 15:46:34 version -- app/version.sh@18 -- # minor=9 00:07:29.264 15:46:34 version -- app/version.sh@19 -- # get_header_version patch 00:07:29.264 15:46:34 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_PATCH[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:29.264 15:46:34 version -- app/version.sh@14 -- # cut -f2 00:07:29.264 15:46:34 version -- app/version.sh@14 -- # tr -d '"' 00:07:29.264 15:46:34 version -- app/version.sh@19 -- # patch=0 00:07:29.264 15:46:34 version -- 
app/version.sh@20 -- # get_header_version suffix 00:07:29.264 15:46:34 version -- app/version.sh@13 -- # grep -E '^#define SPDK_VERSION_SUFFIX[[:space:]]+' /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/version.h 00:07:29.264 15:46:34 version -- app/version.sh@14 -- # tr -d '"' 00:07:29.264 15:46:34 version -- app/version.sh@14 -- # cut -f2 00:07:29.264 15:46:34 version -- app/version.sh@20 -- # suffix=-pre 00:07:29.264 15:46:34 version -- app/version.sh@22 -- # version=24.9 00:07:29.264 15:46:34 version -- app/version.sh@25 -- # (( patch != 0 )) 00:07:29.264 15:46:34 version -- app/version.sh@28 -- # version=24.9rc0 00:07:29.264 15:46:34 version -- app/version.sh@30 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:07:29.264 15:46:34 version -- app/version.sh@30 -- # python3 -c 'import spdk; print(spdk.__version__)' 00:07:29.264 15:46:34 version -- app/version.sh@30 -- # py_version=24.9rc0 00:07:29.264 15:46:34 version -- app/version.sh@31 -- # [[ 24.9rc0 == \2\4\.\9\r\c\0 ]] 00:07:29.264 00:07:29.264 real 0m0.158s 00:07:29.264 user 0m0.092s 00:07:29.264 sys 0m0.098s 00:07:29.264 15:46:34 version -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:29.264 15:46:34 version -- common/autotest_common.sh@10 -- # set +x 00:07:29.264 ************************************ 00:07:29.264 END TEST version 00:07:29.264 ************************************ 00:07:29.264 15:46:34 -- spdk/autotest.sh@188 -- # '[' 1 -eq 1 ']' 00:07:29.264 15:46:34 -- spdk/autotest.sh@189 -- # run_test blockdev_general /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:07:29.264 15:46:34 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:07:29.264 15:46:34 -- common/autotest_common.sh@1106 
-- # xtrace_disable 00:07:29.264 15:46:34 -- common/autotest_common.sh@10 -- # set +x 00:07:29.522 ************************************ 00:07:29.522 START TEST blockdev_general 00:07:29.522 ************************************ 00:07:29.522 15:46:34 blockdev_general -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh 00:07:29.522 * Looking for test storage... 00:07:29.522 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:07:29.522 15:46:34 blockdev_general -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:07:29.522 15:46:34 blockdev_general -- bdev/nbd_common.sh@6 -- # set -e 00:07:29.522 15:46:34 blockdev_general -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@20 -- # : 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@674 -- # uname -s 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:07:29.523 15:46:34 blockdev_general -- 
bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@682 -- # test_type=bdev 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@683 -- # crypto_device= 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@684 -- # dek= 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@685 -- # env_ctx= 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@690 -- # [[ bdev == bdev ]] 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2608003 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@49 -- # waitforlisten 2608003 00:07:29.523 15:46:34 blockdev_general -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:07:29.523 15:46:34 blockdev_general -- common/autotest_common.sh@830 -- # '[' -z 2608003 ']' 00:07:29.523 15:46:34 blockdev_general -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:29.523 15:46:34 blockdev_general -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:29.523 15:46:34 blockdev_general -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:29.523 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:07:29.523 15:46:34 blockdev_general -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:29.523 15:46:34 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:29.523 [2024-06-10 15:46:34.951947] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:07:29.523 [2024-06-10 15:46:34.952016] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2608003 ] 00:07:29.782 [2024-06-10 15:46:35.051740] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:29.782 [2024-06-10 15:46:35.149494] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:30.719 15:46:35 blockdev_general -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:30.719 15:46:35 blockdev_general -- common/autotest_common.sh@863 -- # return 0 00:07:30.719 15:46:35 blockdev_general -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:07:30.719 15:46:35 blockdev_general -- bdev/blockdev.sh@696 -- # setup_bdev_conf 00:07:30.719 15:46:35 blockdev_general -- bdev/blockdev.sh@53 -- # rpc_cmd 00:07:30.719 15:46:35 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:30.719 15:46:35 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:30.719 [2024-06-10 15:46:36.129714] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:30.719 [2024-06-10 15:46:36.129769] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:30.719 00:07:30.719 [2024-06-10 15:46:36.137698] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:30.719 [2024-06-10 15:46:36.137721] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:30.719 00:07:30.719 Malloc0 00:07:30.719 Malloc1 00:07:30.719 Malloc2 00:07:30.719 Malloc3 
00:07:30.719 Malloc4 00:07:30.719 Malloc5 00:07:30.978 Malloc6 00:07:30.978 Malloc7 00:07:30.978 Malloc8 00:07:30.978 Malloc9 00:07:30.978 [2024-06-10 15:46:36.273232] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:30.978 [2024-06-10 15:46:36.273278] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:30.978 [2024-06-10 15:46:36.273294] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdf83a0 00:07:30.978 [2024-06-10 15:46:36.273303] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:30.978 [2024-06-10 15:46:36.274739] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:30.978 [2024-06-10 15:46:36.274766] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:30.978 TestPT 00:07:30.978 15:46:36 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:30.978 15:46:36 blockdev_general -- bdev/blockdev.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile bs=2048 count=5000 00:07:30.978 5000+0 records in 00:07:30.978 5000+0 records out 00:07:30.978 10240000 bytes (10 MB, 9.8 MiB) copied, 0.017421 s, 588 MB/s 00:07:30.978 15:46:36 blockdev_general -- bdev/blockdev.sh@77 -- # rpc_cmd bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile AIO0 2048 00:07:30.978 15:46:36 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:30.978 15:46:36 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:30.978 AIO0 00:07:30.978 15:46:36 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:30.978 15:46:36 blockdev_general -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:07:30.978 15:46:36 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:30.978 15:46:36 blockdev_general -- common/autotest_common.sh@10 -- # set +x 
00:07:30.978 15:46:36 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:30.978 15:46:36 blockdev_general -- bdev/blockdev.sh@740 -- # cat 00:07:30.978 15:46:36 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:07:30.978 15:46:36 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:30.978 15:46:36 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:30.978 15:46:36 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:30.978 15:46:36 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:07:30.978 15:46:36 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:30.978 15:46:36 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:30.978 15:46:36 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:30.978 15:46:36 blockdev_general -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:07:30.978 15:46:36 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:30.978 15:46:36 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:30.978 15:46:36 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:30.978 15:46:36 blockdev_general -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:07:30.978 15:46:36 blockdev_general -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:07:30.979 15:46:36 blockdev_general -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:07:30.979 15:46:36 blockdev_general -- common/autotest_common.sh@560 -- # xtrace_disable 00:07:30.979 15:46:36 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:31.239 15:46:36 blockdev_general -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:07:31.239 15:46:36 blockdev_general -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:07:31.239 15:46:36 blockdev_general -- bdev/blockdev.sh@749 -- # jq -r .name 
00:07:31.240 15:46:36 blockdev_general -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "ef39c1dd-4aae-4a37-bf6a-30c01da83ea7"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ef39c1dd-4aae-4a37-bf6a-30c01da83ea7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "2c706d87-de1b-59dc-8209-c4e04762af52"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "2c706d87-de1b-59dc-8209-c4e04762af52",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "52ba6ad6-c80d-5115-bce1-3544c2fe835a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "52ba6ad6-c80d-5115-bce1-3544c2fe835a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "e0723f33-2174-5066-b830-959a10522ba9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e0723f33-2174-5066-b830-959a10522ba9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "9eefeea6-1e23-54d2-b872-0ee3937d11e2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "9eefeea6-1e23-54d2-b872-0ee3937d11e2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' 
}' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "fab681ea-e899-525d-a19e-458983bbac92"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "fab681ea-e899-525d-a19e-458983bbac92",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "0e5efa97-7aea-5f4a-89be-9cafbbad6af7"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0e5efa97-7aea-5f4a-89be-9cafbbad6af7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "33b14985-c68a-58e6-a058-db93c606f53b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "33b14985-c68a-58e6-a058-db93c606f53b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' 
' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "1e0fce14-dda7-5cfd-b8a8-5dc71c46a2d1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1e0fce14-dda7-5cfd-b8a8-5dc71c46a2d1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "8ad6f11d-21ea-5c39-94cd-165ca4f74e7a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "8ad6f11d-21ea-5c39-94cd-165ca4f74e7a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "7fe7b45d-2696-5735-aabd-d0e1ff0a19ae"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 
8192,' ' "uuid": "7fe7b45d-2696-5735-aabd-d0e1ff0a19ae",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "043436c2-d82e-5478-b73f-b6d730ff0d4a"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "043436c2-d82e-5478-b73f-b6d730ff0d4a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "917a38d6-a1af-4820-91f7-ac54a5b92219"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "917a38d6-a1af-4820-91f7-ac54a5b92219",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' 
"unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "917a38d6-a1af-4820-91f7-ac54a5b92219",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "4f4ef910-7534-4851-af9f-f1e1887f1ded",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "513b55f0-736e-433a-aba1-8ff25a06c835",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "d60d76a3-deb8-4c63-b6f7-c810b8b0f488"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "d60d76a3-deb8-4c63-b6f7-c810b8b0f488",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": 
"system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "d60d76a3-deb8-4c63-b6f7-c810b8b0f488",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "0af73327-78cc-47df-9f47-d4318dbd77b5",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "10421489-1baf-4333-9437-07f396cb9815",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "fc2a8214-740f-46f4-94c1-4feb26f228bc"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "fc2a8214-740f-46f4-94c1-4feb26f228bc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "fc2a8214-740f-46f4-94c1-4feb26f228bc",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' 
"num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "c7f05316-48e3-443a-a21d-cd4404f89ae3",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "e533cf7b-ba4d-47d6-96bc-86089a81efbf",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "0ea3f1f9-8469-464a-aea4-a557bece1444"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "0ea3f1f9-8469-464a-aea4-a557bece1444",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:07:31.240 15:46:36 blockdev_general -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:07:31.240 15:46:36 blockdev_general -- bdev/blockdev.sh@752 -- # hello_world_bdev=Malloc0 00:07:31.240 15:46:36 blockdev_general -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:07:31.240 15:46:36 blockdev_general -- bdev/blockdev.sh@754 -- # killprocess 2608003 00:07:31.240 15:46:36 blockdev_general -- common/autotest_common.sh@949 -- # '[' -z 2608003 ']' 00:07:31.240 15:46:36 blockdev_general -- common/autotest_common.sh@953 -- # kill -0 2608003 00:07:31.240 15:46:36 blockdev_general -- common/autotest_common.sh@954 -- # uname 00:07:31.240 15:46:36 blockdev_general -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 
00:07:31.240 15:46:36 blockdev_general -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2608003 00:07:31.240 15:46:36 blockdev_general -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:07:31.240 15:46:36 blockdev_general -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:07:31.240 15:46:36 blockdev_general -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2608003' 00:07:31.240 killing process with pid 2608003 00:07:31.240 15:46:36 blockdev_general -- common/autotest_common.sh@968 -- # kill 2608003 00:07:31.240 15:46:36 blockdev_general -- common/autotest_common.sh@973 -- # wait 2608003 00:07:31.809 15:46:37 blockdev_general -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:07:31.809 15:46:37 blockdev_general -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:07:31.809 15:46:37 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:07:31.809 15:46:37 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:31.809 15:46:37 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:31.809 ************************************ 00:07:31.809 START TEST bdev_hello_world 00:07:31.809 ************************************ 00:07:31.809 15:46:37 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b Malloc0 '' 00:07:31.809 [2024-06-10 15:46:37.172257] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:07:31.809 [2024-06-10 15:46:37.172310] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2608443 ] 00:07:31.809 [2024-06-10 15:46:37.272643] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:32.067 [2024-06-10 15:46:37.361119] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:32.067 [2024-06-10 15:46:37.509879] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:32.067 [2024-06-10 15:46:37.509937] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:32.068 [2024-06-10 15:46:37.509949] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:32.068 [2024-06-10 15:46:37.517886] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:32.068 [2024-06-10 15:46:37.517912] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:32.068 [2024-06-10 15:46:37.525899] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:32.068 [2024-06-10 15:46:37.525923] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:32.327 [2024-06-10 15:46:37.597847] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:32.327 [2024-06-10 15:46:37.597893] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:32.327 [2024-06-10 15:46:37.597908] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1123f50 00:07:32.327 [2024-06-10 15:46:37.597917] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:32.327 [2024-06-10 15:46:37.599389] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:07:32.327 [2024-06-10 15:46:37.599417] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:32.327 [2024-06-10 15:46:37.740441] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:07:32.327 [2024-06-10 15:46:37.740505] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev Malloc0 00:07:32.327 [2024-06-10 15:46:37.740551] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:07:32.327 [2024-06-10 15:46:37.740618] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:07:32.327 [2024-06-10 15:46:37.740686] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:07:32.327 [2024-06-10 15:46:37.740708] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:07:32.327 [2024-06-10 15:46:37.740765] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:07:32.327 00:07:32.327 [2024-06-10 15:46:37.740796] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:07:32.586 00:07:32.586 real 0m0.911s 00:07:32.586 user 0m0.616s 00:07:32.586 sys 0m0.257s 00:07:32.586 15:46:38 blockdev_general.bdev_hello_world -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:32.586 15:46:38 blockdev_general.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:07:32.586 ************************************ 00:07:32.586 END TEST bdev_hello_world 00:07:32.586 ************************************ 00:07:32.586 15:46:38 blockdev_general -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:07:32.586 15:46:38 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:07:32.586 15:46:38 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:32.586 15:46:38 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:32.586 ************************************ 00:07:32.586 START TEST bdev_bounds 00:07:32.586 ************************************ 00:07:32.845 15:46:38 
blockdev_general.bdev_bounds -- common/autotest_common.sh@1124 -- # bdev_bounds '' 00:07:32.845 15:46:38 blockdev_general.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2608686 00:07:32.845 15:46:38 blockdev_general.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:07:32.845 15:46:38 blockdev_general.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:07:32.845 15:46:38 blockdev_general.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2608686' 00:07:32.845 Process bdevio pid: 2608686 00:07:32.845 15:46:38 blockdev_general.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2608686 00:07:32.845 15:46:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@830 -- # '[' -z 2608686 ']' 00:07:32.845 15:46:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:07:32.845 15:46:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:32.845 15:46:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:07:32.845 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:07:32.845 15:46:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:32.845 15:46:38 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:07:32.845 [2024-06-10 15:46:38.152558] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:07:32.845 [2024-06-10 15:46:38.152621] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2608686 ] 00:07:32.845 [2024-06-10 15:46:38.242016] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:07:32.845 [2024-06-10 15:46:38.334676] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:07:32.845 [2024-06-10 15:46:38.334775] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:07:32.845 [2024-06-10 15:46:38.334778] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:33.105 [2024-06-10 15:46:38.476861] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:33.105 [2024-06-10 15:46:38.476917] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:33.105 [2024-06-10 15:46:38.476928] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:33.105 [2024-06-10 15:46:38.484872] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:33.105 [2024-06-10 15:46:38.484897] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:33.105 [2024-06-10 15:46:38.492888] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:33.105 [2024-06-10 15:46:38.492909] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:33.105 [2024-06-10 15:46:38.565173] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:33.105 [2024-06-10 15:46:38.565221] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:33.105 [2024-06-10 15:46:38.565234] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2351d20 00:07:33.105 [2024-06-10 
15:46:38.565244] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:33.105 [2024-06-10 15:46:38.566760] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:33.105 [2024-06-10 15:46:38.566787] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:33.672 15:46:39 blockdev_general.bdev_bounds -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:33.672 15:46:39 blockdev_general.bdev_bounds -- common/autotest_common.sh@863 -- # return 0 00:07:33.672 15:46:39 blockdev_general.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:07:33.932 I/O targets: 00:07:33.932 Malloc0: 65536 blocks of 512 bytes (32 MiB) 00:07:33.932 Malloc1p0: 32768 blocks of 512 bytes (16 MiB) 00:07:33.932 Malloc1p1: 32768 blocks of 512 bytes (16 MiB) 00:07:33.932 Malloc2p0: 8192 blocks of 512 bytes (4 MiB) 00:07:33.932 Malloc2p1: 8192 blocks of 512 bytes (4 MiB) 00:07:33.932 Malloc2p2: 8192 blocks of 512 bytes (4 MiB) 00:07:33.932 Malloc2p3: 8192 blocks of 512 bytes (4 MiB) 00:07:33.932 Malloc2p4: 8192 blocks of 512 bytes (4 MiB) 00:07:33.932 Malloc2p5: 8192 blocks of 512 bytes (4 MiB) 00:07:33.932 Malloc2p6: 8192 blocks of 512 bytes (4 MiB) 00:07:33.932 Malloc2p7: 8192 blocks of 512 bytes (4 MiB) 00:07:33.932 TestPT: 65536 blocks of 512 bytes (32 MiB) 00:07:33.932 raid0: 131072 blocks of 512 bytes (64 MiB) 00:07:33.932 concat0: 131072 blocks of 512 bytes (64 MiB) 00:07:33.932 raid1: 65536 blocks of 512 bytes (32 MiB) 00:07:33.932 AIO0: 5000 blocks of 2048 bytes (10 MiB) 00:07:33.932 00:07:33.932 00:07:33.932 CUnit - A unit testing framework for C - Version 2.1-3 00:07:33.932 http://cunit.sourceforge.net/ 00:07:33.932 00:07:33.932 00:07:33.932 Suite: bdevio tests on: AIO0 00:07:33.932 Test: blockdev write read block ...passed 00:07:33.932 Test: blockdev write zeroes read block ...passed 00:07:33.932 Test: blockdev write zeroes 
read no split ...passed 00:07:33.932 Test: blockdev write zeroes read split ...passed 00:07:33.932 Test: blockdev write zeroes read split partial ...passed 00:07:33.932 Test: blockdev reset ...passed 00:07:33.932 Test: blockdev write read 8 blocks ...passed 00:07:33.932 Test: blockdev write read size > 128k ...passed 00:07:33.932 Test: blockdev write read invalid size ...passed 00:07:33.932 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:33.932 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:33.932 Test: blockdev write read max offset ...passed 00:07:33.932 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:33.932 Test: blockdev writev readv 8 blocks ...passed 00:07:33.932 Test: blockdev writev readv 30 x 1block ...passed 00:07:33.932 Test: blockdev writev readv block ...passed 00:07:33.932 Test: blockdev writev readv size > 128k ...passed 00:07:33.932 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:33.932 Test: blockdev comparev and writev ...passed 00:07:33.932 Test: blockdev nvme passthru rw ...passed 00:07:33.932 Test: blockdev nvme passthru vendor specific ...passed 00:07:33.932 Test: blockdev nvme admin passthru ...passed 00:07:33.932 Test: blockdev copy ...passed 00:07:33.932 Suite: bdevio tests on: raid1 00:07:33.932 Test: blockdev write read block ...passed 00:07:33.932 Test: blockdev write zeroes read block ...passed 00:07:33.932 Test: blockdev write zeroes read no split ...passed 00:07:33.932 Test: blockdev write zeroes read split ...passed 00:07:33.932 Test: blockdev write zeroes read split partial ...passed 00:07:33.932 Test: blockdev reset ...passed 00:07:33.932 Test: blockdev write read 8 blocks ...passed 00:07:33.932 Test: blockdev write read size > 128k ...passed 00:07:33.932 Test: blockdev write read invalid size ...passed 00:07:33.932 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:33.932 Test: blockdev write 
read offset + nbytes > size of blockdev ...passed 00:07:33.932 Test: blockdev write read max offset ...passed 00:07:33.932 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:33.932 Test: blockdev writev readv 8 blocks ...passed 00:07:33.932 Test: blockdev writev readv 30 x 1block ...passed 00:07:33.932 Test: blockdev writev readv block ...passed 00:07:33.932 Test: blockdev writev readv size > 128k ...passed 00:07:33.932 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:33.932 Test: blockdev comparev and writev ...passed 00:07:33.932 Test: blockdev nvme passthru rw ...passed 00:07:33.932 Test: blockdev nvme passthru vendor specific ...passed 00:07:33.932 Test: blockdev nvme admin passthru ...passed 00:07:33.932 Test: blockdev copy ...passed 00:07:33.932 Suite: bdevio tests on: concat0 00:07:33.932 Test: blockdev write read block ...passed 00:07:33.932 Test: blockdev write zeroes read block ...passed 00:07:33.932 Test: blockdev write zeroes read no split ...passed 00:07:33.932 Test: blockdev write zeroes read split ...passed 00:07:33.932 Test: blockdev write zeroes read split partial ...passed 00:07:33.932 Test: blockdev reset ...passed 00:07:33.932 Test: blockdev write read 8 blocks ...passed 00:07:33.932 Test: blockdev write read size > 128k ...passed 00:07:33.932 Test: blockdev write read invalid size ...passed 00:07:33.932 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:33.932 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:33.932 Test: blockdev write read max offset ...passed 00:07:33.932 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:33.932 Test: blockdev writev readv 8 blocks ...passed 00:07:33.932 Test: blockdev writev readv 30 x 1block ...passed 00:07:33.932 Test: blockdev writev readv block ...passed 00:07:33.932 Test: blockdev writev readv size > 128k ...passed 00:07:33.932 Test: blockdev writev readv size > 128k in two 
iovs ...passed 00:07:33.932 Test: blockdev comparev and writev ...passed 00:07:33.932 Test: blockdev nvme passthru rw ...passed 00:07:33.932 Test: blockdev nvme passthru vendor specific ...passed 00:07:33.933 Test: blockdev nvme admin passthru ...passed 00:07:33.933 Test: blockdev copy ...passed 00:07:33.933 Suite: bdevio tests on: raid0 00:07:33.933 Test: blockdev write read block ...passed 00:07:33.933 Test: blockdev write zeroes read block ...passed 00:07:33.933 Test: blockdev write zeroes read no split ...passed 00:07:33.933 Test: blockdev write zeroes read split ...passed 00:07:33.933 Test: blockdev write zeroes read split partial ...passed 00:07:33.933 Test: blockdev reset ...passed 00:07:33.933 Test: blockdev write read 8 blocks ...passed 00:07:33.933 Test: blockdev write read size > 128k ...passed 00:07:33.933 Test: blockdev write read invalid size ...passed 00:07:33.933 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:33.933 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:33.933 Test: blockdev write read max offset ...passed 00:07:33.933 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:33.933 Test: blockdev writev readv 8 blocks ...passed 00:07:33.933 Test: blockdev writev readv 30 x 1block ...passed 00:07:33.933 Test: blockdev writev readv block ...passed 00:07:33.933 Test: blockdev writev readv size > 128k ...passed 00:07:33.933 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:33.933 Test: blockdev comparev and writev ...passed 00:07:33.933 Test: blockdev nvme passthru rw ...passed 00:07:33.933 Test: blockdev nvme passthru vendor specific ...passed 00:07:33.933 Test: blockdev nvme admin passthru ...passed 00:07:33.933 Test: blockdev copy ...passed 00:07:33.933 Suite: bdevio tests on: TestPT 00:07:33.933 Test: blockdev write read block ...passed 00:07:33.933 Test: blockdev write zeroes read block ...passed 00:07:33.933 Test: blockdev write zeroes 
read no split ...passed 00:07:33.933 Test: blockdev write zeroes read split ...passed 00:07:33.933 Test: blockdev write zeroes read split partial ...passed 00:07:33.933 Test: blockdev reset ...passed 00:07:33.933 Test: blockdev write read 8 blocks ...passed 00:07:33.933 Test: blockdev write read size > 128k ...passed 00:07:33.933 Test: blockdev write read invalid size ...passed 00:07:33.933 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:33.933 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:33.933 Test: blockdev write read max offset ...passed 00:07:33.933 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:33.933 Test: blockdev writev readv 8 blocks ...passed 00:07:33.933 Test: blockdev writev readv 30 x 1block ...passed 00:07:33.933 Test: blockdev writev readv block ...passed 00:07:33.933 Test: blockdev writev readv size > 128k ...passed 00:07:33.933 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:33.933 Test: blockdev comparev and writev ...passed 00:07:33.933 Test: blockdev nvme passthru rw ...passed 00:07:33.933 Test: blockdev nvme passthru vendor specific ...passed 00:07:33.933 Test: blockdev nvme admin passthru ...passed 00:07:33.933 Test: blockdev copy ...passed 00:07:33.933 Suite: bdevio tests on: Malloc2p7 00:07:33.933 Test: blockdev write read block ...passed 00:07:33.933 Test: blockdev write zeroes read block ...passed 00:07:33.933 Test: blockdev write zeroes read no split ...passed 00:07:33.933 Test: blockdev write zeroes read split ...passed 00:07:33.933 Test: blockdev write zeroes read split partial ...passed 00:07:33.933 Test: blockdev reset ...passed 00:07:33.933 Test: blockdev write read 8 blocks ...passed 00:07:33.933 Test: blockdev write read size > 128k ...passed 00:07:33.933 Test: blockdev write read invalid size ...passed 00:07:33.933 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:33.933 Test: blockdev 
write read offset + nbytes > size of blockdev ...passed 00:07:33.933 Test: blockdev write read max offset ...passed 00:07:33.933 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:33.933 Test: blockdev writev readv 8 blocks ...passed 00:07:33.933 Test: blockdev writev readv 30 x 1block ...passed 00:07:33.933 Test: blockdev writev readv block ...passed 00:07:33.933 Test: blockdev writev readv size > 128k ...passed 00:07:33.933 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:33.933 Test: blockdev comparev and writev ...passed 00:07:33.933 Test: blockdev nvme passthru rw ...passed 00:07:33.933 Test: blockdev nvme passthru vendor specific ...passed 00:07:33.933 Test: blockdev nvme admin passthru ...passed 00:07:33.933 Test: blockdev copy ...passed 00:07:33.933 Suite: bdevio tests on: Malloc2p6 00:07:33.933 Test: blockdev write read block ...passed 00:07:33.933 Test: blockdev write zeroes read block ...passed 00:07:33.933 Test: blockdev write zeroes read no split ...passed 00:07:33.933 Test: blockdev write zeroes read split ...passed 00:07:33.933 Test: blockdev write zeroes read split partial ...passed 00:07:33.933 Test: blockdev reset ...passed 00:07:33.933 Test: blockdev write read 8 blocks ...passed 00:07:33.933 Test: blockdev write read size > 128k ...passed 00:07:33.933 Test: blockdev write read invalid size ...passed 00:07:33.933 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:33.933 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:33.933 Test: blockdev write read max offset ...passed 00:07:33.933 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:33.933 Test: blockdev writev readv 8 blocks ...passed 00:07:33.933 Test: blockdev writev readv 30 x 1block ...passed 00:07:33.933 Test: blockdev writev readv block ...passed 00:07:33.933 Test: blockdev writev readv size > 128k ...passed 00:07:33.933 Test: blockdev writev readv size > 
128k in two iovs ...passed 00:07:33.933 Test: blockdev comparev and writev ...passed 00:07:33.933 Test: blockdev nvme passthru rw ...passed 00:07:33.933 Test: blockdev nvme passthru vendor specific ...passed 00:07:33.933 Test: blockdev nvme admin passthru ...passed 00:07:33.933 Test: blockdev copy ...passed 00:07:33.933 Suite: bdevio tests on: Malloc2p5 00:07:33.933 Test: blockdev write read block ...passed 00:07:33.933 Test: blockdev write zeroes read block ...passed 00:07:33.933 Test: blockdev write zeroes read no split ...passed 00:07:33.933 Test: blockdev write zeroes read split ...passed 00:07:33.933 Test: blockdev write zeroes read split partial ...passed 00:07:33.933 Test: blockdev reset ...passed 00:07:33.933 Test: blockdev write read 8 blocks ...passed 00:07:33.933 Test: blockdev write read size > 128k ...passed 00:07:33.933 Test: blockdev write read invalid size ...passed 00:07:33.933 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:33.933 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:33.933 Test: blockdev write read max offset ...passed 00:07:33.933 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:33.933 Test: blockdev writev readv 8 blocks ...passed 00:07:33.933 Test: blockdev writev readv 30 x 1block ...passed 00:07:33.933 Test: blockdev writev readv block ...passed 00:07:33.933 Test: blockdev writev readv size > 128k ...passed 00:07:33.933 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:33.933 Test: blockdev comparev and writev ...passed 00:07:33.933 Test: blockdev nvme passthru rw ...passed 00:07:33.933 Test: blockdev nvme passthru vendor specific ...passed 00:07:33.933 Test: blockdev nvme admin passthru ...passed 00:07:33.933 Test: blockdev copy ...passed 00:07:33.933 Suite: bdevio tests on: Malloc2p4 00:07:33.933 Test: blockdev write read block ...passed 00:07:33.933 Test: blockdev write zeroes read block ...passed 00:07:33.933 Test: 
blockdev write zeroes read no split ...passed 00:07:33.933 Test: blockdev write zeroes read split ...passed 00:07:33.933 Test: blockdev write zeroes read split partial ...passed 00:07:33.933 Test: blockdev reset ...passed 00:07:33.933 Test: blockdev write read 8 blocks ...passed 00:07:33.933 Test: blockdev write read size > 128k ...passed 00:07:33.933 Test: blockdev write read invalid size ...passed 00:07:33.933 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:33.933 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:33.933 Test: blockdev write read max offset ...passed 00:07:33.933 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:33.933 Test: blockdev writev readv 8 blocks ...passed 00:07:33.933 Test: blockdev writev readv 30 x 1block ...passed 00:07:33.933 Test: blockdev writev readv block ...passed 00:07:33.933 Test: blockdev writev readv size > 128k ...passed 00:07:33.933 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:33.933 Test: blockdev comparev and writev ...passed 00:07:33.933 Test: blockdev nvme passthru rw ...passed 00:07:33.933 Test: blockdev nvme passthru vendor specific ...passed 00:07:33.933 Test: blockdev nvme admin passthru ...passed 00:07:33.933 Test: blockdev copy ...passed 00:07:33.933 Suite: bdevio tests on: Malloc2p3 00:07:33.933 Test: blockdev write read block ...passed 00:07:33.933 Test: blockdev write zeroes read block ...passed 00:07:33.933 Test: blockdev write zeroes read no split ...passed 00:07:33.933 Test: blockdev write zeroes read split ...passed 00:07:33.933 Test: blockdev write zeroes read split partial ...passed 00:07:33.933 Test: blockdev reset ...passed 00:07:33.933 Test: blockdev write read 8 blocks ...passed 00:07:33.933 Test: blockdev write read size > 128k ...passed 00:07:33.933 Test: blockdev write read invalid size ...passed 00:07:33.933 Test: blockdev write read offset + nbytes == size of blockdev ...passed 
00:07:33.933 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:33.933 Test: blockdev write read max offset ...passed 00:07:33.933 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:33.933 Test: blockdev writev readv 8 blocks ...passed 00:07:33.933 Test: blockdev writev readv 30 x 1block ...passed 00:07:33.933 Test: blockdev writev readv block ...passed 00:07:33.933 Test: blockdev writev readv size > 128k ...passed 00:07:33.933 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:33.933 Test: blockdev comparev and writev ...passed 00:07:33.933 Test: blockdev nvme passthru rw ...passed 00:07:33.933 Test: blockdev nvme passthru vendor specific ...passed 00:07:33.933 Test: blockdev nvme admin passthru ...passed 00:07:33.933 Test: blockdev copy ...passed 00:07:33.933 Suite: bdevio tests on: Malloc2p2 00:07:33.933 Test: blockdev write read block ...passed 00:07:33.933 Test: blockdev write zeroes read block ...passed 00:07:33.933 Test: blockdev write zeroes read no split ...passed 00:07:33.933 Test: blockdev write zeroes read split ...passed 00:07:33.933 Test: blockdev write zeroes read split partial ...passed 00:07:33.933 Test: blockdev reset ...passed 00:07:33.933 Test: blockdev write read 8 blocks ...passed 00:07:33.933 Test: blockdev write read size > 128k ...passed 00:07:33.933 Test: blockdev write read invalid size ...passed 00:07:33.933 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:33.934 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:33.934 Test: blockdev write read max offset ...passed 00:07:33.934 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:33.934 Test: blockdev writev readv 8 blocks ...passed 00:07:33.934 Test: blockdev writev readv 30 x 1block ...passed 00:07:33.934 Test: blockdev writev readv block ...passed 00:07:33.934 Test: blockdev writev readv size > 128k ...passed 00:07:33.934 Test: 
blockdev writev readv size > 128k in two iovs ...passed 00:07:33.934 Test: blockdev comparev and writev ...passed 00:07:33.934 Test: blockdev nvme passthru rw ...passed 00:07:33.934 Test: blockdev nvme passthru vendor specific ...passed 00:07:33.934 Test: blockdev nvme admin passthru ...passed 00:07:33.934 Test: blockdev copy ...passed 00:07:33.934 Suite: bdevio tests on: Malloc2p1 00:07:33.934 Test: blockdev write read block ...passed 00:07:33.934 Test: blockdev write zeroes read block ...passed 00:07:33.934 Test: blockdev write zeroes read no split ...passed 00:07:33.934 Test: blockdev write zeroes read split ...passed 00:07:33.934 Test: blockdev write zeroes read split partial ...passed 00:07:33.934 Test: blockdev reset ...passed 00:07:33.934 Test: blockdev write read 8 blocks ...passed 00:07:33.934 Test: blockdev write read size > 128k ...passed 00:07:33.934 Test: blockdev write read invalid size ...passed 00:07:33.934 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:33.934 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:33.934 Test: blockdev write read max offset ...passed 00:07:33.934 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:33.934 Test: blockdev writev readv 8 blocks ...passed 00:07:33.934 Test: blockdev writev readv 30 x 1block ...passed 00:07:33.934 Test: blockdev writev readv block ...passed 00:07:33.934 Test: blockdev writev readv size > 128k ...passed 00:07:33.934 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:33.934 Test: blockdev comparev and writev ...passed 00:07:33.934 Test: blockdev nvme passthru rw ...passed 00:07:33.934 Test: blockdev nvme passthru vendor specific ...passed 00:07:33.934 Test: blockdev nvme admin passthru ...passed 00:07:33.934 Test: blockdev copy ...passed 00:07:33.934 Suite: bdevio tests on: Malloc2p0 00:07:33.934 Test: blockdev write read block ...passed 00:07:33.934 Test: blockdev write zeroes read block 
...passed 00:07:33.934 Test: blockdev write zeroes read no split ...passed 00:07:33.934 Test: blockdev write zeroes read split ...passed 00:07:33.934 Test: blockdev write zeroes read split partial ...passed 00:07:33.934 Test: blockdev reset ...passed 00:07:33.934 Test: blockdev write read 8 blocks ...passed 00:07:33.934 Test: blockdev write read size > 128k ...passed 00:07:33.934 Test: blockdev write read invalid size ...passed 00:07:33.934 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:33.934 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:33.934 Test: blockdev write read max offset ...passed 00:07:33.934 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:33.934 Test: blockdev writev readv 8 blocks ...passed 00:07:33.934 Test: blockdev writev readv 30 x 1block ...passed 00:07:33.934 Test: blockdev writev readv block ...passed 00:07:33.934 Test: blockdev writev readv size > 128k ...passed 00:07:33.934 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:33.934 Test: blockdev comparev and writev ...passed 00:07:33.934 Test: blockdev nvme passthru rw ...passed 00:07:33.934 Test: blockdev nvme passthru vendor specific ...passed 00:07:33.934 Test: blockdev nvme admin passthru ...passed 00:07:33.934 Test: blockdev copy ...passed 00:07:33.934 Suite: bdevio tests on: Malloc1p1 00:07:33.934 Test: blockdev write read block ...passed 00:07:33.934 Test: blockdev write zeroes read block ...passed 00:07:33.934 Test: blockdev write zeroes read no split ...passed 00:07:33.934 Test: blockdev write zeroes read split ...passed 00:07:33.934 Test: blockdev write zeroes read split partial ...passed 00:07:33.934 Test: blockdev reset ...passed 00:07:33.934 Test: blockdev write read 8 blocks ...passed 00:07:33.934 Test: blockdev write read size > 128k ...passed 00:07:33.934 Test: blockdev write read invalid size ...passed 00:07:33.934 Test: blockdev write read offset + nbytes == size of 
blockdev ...passed 00:07:33.934 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:33.934 Test: blockdev write read max offset ...passed 00:07:33.934 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:33.934 Test: blockdev writev readv 8 blocks ...passed 00:07:33.934 Test: blockdev writev readv 30 x 1block ...passed 00:07:33.934 Test: blockdev writev readv block ...passed 00:07:33.934 Test: blockdev writev readv size > 128k ...passed 00:07:33.934 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:33.934 Test: blockdev comparev and writev ...passed 00:07:33.934 Test: blockdev nvme passthru rw ...passed 00:07:33.934 Test: blockdev nvme passthru vendor specific ...passed 00:07:33.934 Test: blockdev nvme admin passthru ...passed 00:07:33.934 Test: blockdev copy ...passed 00:07:33.934 Suite: bdevio tests on: Malloc1p0 00:07:33.934 Test: blockdev write read block ...passed 00:07:33.934 Test: blockdev write zeroes read block ...passed 00:07:33.934 Test: blockdev write zeroes read no split ...passed 00:07:33.934 Test: blockdev write zeroes read split ...passed 00:07:33.934 Test: blockdev write zeroes read split partial ...passed 00:07:33.934 Test: blockdev reset ...passed 00:07:33.934 Test: blockdev write read 8 blocks ...passed 00:07:33.934 Test: blockdev write read size > 128k ...passed 00:07:33.934 Test: blockdev write read invalid size ...passed 00:07:33.934 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:33.934 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:33.934 Test: blockdev write read max offset ...passed 00:07:33.934 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:33.934 Test: blockdev writev readv 8 blocks ...passed 00:07:33.934 Test: blockdev writev readv 30 x 1block ...passed 00:07:33.934 Test: blockdev writev readv block ...passed 00:07:33.934 Test: blockdev writev readv size > 128k ...passed 
00:07:33.934 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:33.934 Test: blockdev comparev and writev ...passed 00:07:33.934 Test: blockdev nvme passthru rw ...passed 00:07:33.934 Test: blockdev nvme passthru vendor specific ...passed 00:07:33.934 Test: blockdev nvme admin passthru ...passed 00:07:33.934 Test: blockdev copy ...passed 00:07:33.934 Suite: bdevio tests on: Malloc0 00:07:33.934 Test: blockdev write read block ...passed 00:07:33.934 Test: blockdev write zeroes read block ...passed 00:07:33.934 Test: blockdev write zeroes read no split ...passed 00:07:33.934 Test: blockdev write zeroes read split ...passed 00:07:33.934 Test: blockdev write zeroes read split partial ...passed 00:07:33.934 Test: blockdev reset ...passed 00:07:33.934 Test: blockdev write read 8 blocks ...passed 00:07:33.934 Test: blockdev write read size > 128k ...passed 00:07:33.934 Test: blockdev write read invalid size ...passed 00:07:33.934 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:07:33.934 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:07:33.934 Test: blockdev write read max offset ...passed 00:07:33.934 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:07:33.934 Test: blockdev writev readv 8 blocks ...passed 00:07:33.934 Test: blockdev writev readv 30 x 1block ...passed 00:07:33.934 Test: blockdev writev readv block ...passed 00:07:33.934 Test: blockdev writev readv size > 128k ...passed 00:07:33.934 Test: blockdev writev readv size > 128k in two iovs ...passed 00:07:33.934 Test: blockdev comparev and writev ...passed 00:07:33.934 Test: blockdev nvme passthru rw ...passed 00:07:33.934 Test: blockdev nvme passthru vendor specific ...passed 00:07:33.934 Test: blockdev nvme admin passthru ...passed 00:07:33.934 Test: blockdev copy ...passed 00:07:33.934 00:07:33.934 Run Summary: Type Total Ran Passed Failed Inactive 00:07:33.934 suites 16 16 n/a 0 0 00:07:33.934 tests 368 368 368 
0 0 00:07:33.934 asserts 2224 2224 2224 0 n/a 00:07:33.934 00:07:33.934 Elapsed time = 0.508 seconds 00:07:33.934 0 00:07:34.194 15:46:39 blockdev_general.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2608686 00:07:34.194 15:46:39 blockdev_general.bdev_bounds -- common/autotest_common.sh@949 -- # '[' -z 2608686 ']' 00:07:34.194 15:46:39 blockdev_general.bdev_bounds -- common/autotest_common.sh@953 -- # kill -0 2608686 00:07:34.194 15:46:39 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # uname 00:07:34.194 15:46:39 blockdev_general.bdev_bounds -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:07:34.194 15:46:39 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2608686 00:07:34.194 15:46:39 blockdev_general.bdev_bounds -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:07:34.194 15:46:39 blockdev_general.bdev_bounds -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:07:34.194 15:46:39 blockdev_general.bdev_bounds -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2608686' 00:07:34.194 killing process with pid 2608686 00:07:34.194 15:46:39 blockdev_general.bdev_bounds -- common/autotest_common.sh@968 -- # kill 2608686 00:07:34.194 15:46:39 blockdev_general.bdev_bounds -- common/autotest_common.sh@973 -- # wait 2608686 00:07:34.453 15:46:39 blockdev_general.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:07:34.453 00:07:34.453 real 0m1.664s 00:07:34.453 user 0m4.376s 00:07:34.453 sys 0m0.385s 00:07:34.453 15:46:39 blockdev_general.bdev_bounds -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:34.453 15:46:39 blockdev_general.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:07:34.453 ************************************ 00:07:34.453 END TEST bdev_bounds 00:07:34.453 ************************************ 00:07:34.453 15:46:39 blockdev_general -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:07:34.453 15:46:39 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:07:34.453 15:46:39 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:34.453 15:46:39 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:34.453 ************************************ 00:07:34.453 START TEST bdev_nbd 00:07:34.453 ************************************ 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@1124 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '' 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=16 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- 
bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=16 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2608949 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2608949 /var/tmp/spdk-nbd.sock 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@830 -- # '[' -z 2608949 ']' 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:07:34.453 
15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@835 -- # local max_retries=100 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:07:34.453 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@839 -- # xtrace_disable 00:07:34.453 15:46:39 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:07:34.453 [2024-06-10 15:46:39.891025] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:07:34.453 [2024-06-10 15:46:39.891081] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:07:34.712 [2024-06-10 15:46:39.993295] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:07:34.712 [2024-06-10 15:46:40.091603] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:07:34.971 [2024-06-10 15:46:40.245796] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:34.971 [2024-06-10 15:46:40.245858] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:07:34.971 [2024-06-10 15:46:40.245870] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:07:34.971 [2024-06-10 15:46:40.253800] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:34.971 [2024-06-10 15:46:40.253826] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:07:34.971 [2024-06-10 15:46:40.261812] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:34.971 [2024-06-10 15:46:40.261836] 
bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:07:34.971 [2024-06-10 15:46:40.333443] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:07:34.971 [2024-06-10 15:46:40.333490] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:07:34.971 [2024-06-10 15:46:40.333504] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fea970 00:07:34.971 [2024-06-10 15:46:40.333513] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:07:34.971 [2024-06-10 15:46:40.335062] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:07:34.971 [2024-06-10 15:46:40.335089] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:07:35.539 15:46:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:07:35.539 15:46:40 blockdev_general.bdev_nbd -- common/autotest_common.sh@863 -- # return 0 00:07:35.539 15:46:40 blockdev_general.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' 00:07:35.539 15:46:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:35.539 15:46:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:35.539 15:46:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:07:35.539 15:46:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT 
raid0 concat0 raid1 AIO0' 00:07:35.539 15:46:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:35.539 15:46:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:35.539 15:46:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:07:35.539 15:46:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:07:35.539 15:46:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:07:35.539 15:46:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:07:35.539 15:46:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:35.539 15:46:40 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:35.798 15:46:41 blockdev_general.bdev_nbd 
-- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:35.798 1+0 records in 00:07:35.798 1+0 records out 00:07:35.798 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00023671 s, 17.3 MB/s 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:35.798 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 00:07:36.057 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:07:36.057 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:07:36.057 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:07:36.057 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:07:36.057 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:36.057 15:46:41 
blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:36.057 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:36.057 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:07:36.057 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:36.057 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:36.057 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:36.057 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:36.057 1+0 records in 00:07:36.057 1+0 records out 00:07:36.057 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000228335 s, 17.9 MB/s 00:07:36.057 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:36.057 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:36.057 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:36.057 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:36.057 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:36.058 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:36.058 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:36.058 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
nbd_device=/dev/nbd2 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd2 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd2 /proc/partitions 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:36.317 1+0 records in 00:07:36.317 1+0 records out 00:07:36.317 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000243322 s, 16.8 MB/s 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:36.317 15:46:41 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd3 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd3 /proc/partitions 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:36.317 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:36.577 1+0 records in 00:07:36.577 1+0 records out 00:07:36.577 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000255638 s, 16.0 MB/s 00:07:36.577 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:36.577 15:46:41 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@885 -- # size=4096 00:07:36.577 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:36.577 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:36.577 15:46:41 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:36.577 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:36.577 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:36.577 15:46:41 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd4 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd4 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd4 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd4 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd4 /proc/partitions 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd4 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:36.890 1+0 records in 00:07:36.890 1+0 records out 00:07:36.890 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000267949 s, 15.3 MB/s 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd5 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd5 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd5 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd5 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:36.890 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep 
-q -w nbd5 /proc/partitions 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:37.158 1+0 records in 00:07:37.158 1+0 records out 00:07:37.158 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281607 s, 14.5 MB/s 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd6 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd6 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd6 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@867 -- # local nbd_name=nbd6 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:37.158 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd6 /proc/partitions 00:07:37.417 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:37.417 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:37.417 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:37.417 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:37.417 1+0 records in 00:07:37.417 1+0 records out 00:07:37.417 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000345672 s, 11.8 MB/s 00:07:37.417 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:37.417 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:37.417 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:37.417 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:37.417 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:37.417 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:37.417 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:37.417 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd7 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd7 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd7 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd7 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd7 /proc/partitions 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:37.676 1+0 records in 00:07:37.676 1+0 records out 00:07:37.676 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000309861 s, 13.2 MB/s 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:37.676 15:46:42 
blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:37.676 15:46:42 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd8 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd8 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd8 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd8 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd8 /proc/partitions 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:37.935 1+0 records in 00:07:37.935 1+0 records out 00:07:37.935 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00030786 s, 13.3 MB/s 00:07:37.935 15:46:43 
blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:37.935 15:46:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd9 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd9 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd9 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd9 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd9 /proc/partitions 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:38.195 15:46:43 
blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:38.195 1+0 records in 00:07:38.195 1+0 records out 00:07:38.195 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000396752 s, 10.3 MB/s 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:38.195 15:46:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd10 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd10 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd10 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd10 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 
00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd10 /proc/partitions 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:38.461 1+0 records in 00:07:38.461 1+0 records out 00:07:38.461 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000356046 s, 11.5 MB/s 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:38.461 15:46:43 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd11 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@30 -- # basename /dev/nbd11 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd11 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd11 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd11 /proc/partitions 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:38.724 1+0 records in 00:07:38.724 1+0 records out 00:07:38.724 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00034029 s, 12.0 MB/s 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 
00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:38.724 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd12 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd12 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd12 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd12 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd12 /proc/partitions 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:38.983 1+0 records in 00:07:38.983 1+0 records out 00:07:38.983 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000564146 s, 7.3 MB/s 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:38.983 15:46:44 
blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:38.983 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd13 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd13 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd13 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd13 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd13 /proc/partitions 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 
iflag=direct 00:07:39.240 1+0 records in 00:07:39.240 1+0 records out 00:07:39.240 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000470206 s, 8.7 MB/s 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:39.240 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 00:07:39.498 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd14 00:07:39.498 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd14 00:07:39.498 15:46:44 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd14 00:07:39.498 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd14 00:07:39.498 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:39.498 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:39.498 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:39.498 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd14 /proc/partitions 00:07:39.498 15:46:44 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@872 -- # break 00:07:39.498 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:39.498 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:39.498 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:39.498 1+0 records in 00:07:39.498 1+0 records out 00:07:39.498 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000457861 s, 8.9 MB/s 00:07:39.498 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:39.498 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:39.498 15:46:44 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:39.498 15:46:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:39.498 15:46:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:39.498 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:39.498 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:39.498 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd15 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd15 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd15 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd15 00:07:40.065 15:46:45 
blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd15 /proc/partitions 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:40.065 1+0 records in 00:07:40.065 1+0 records out 00:07:40.065 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000494274 s, 8.3 MB/s 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 16 )) 00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 
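The trace above repeats one readiness check per newly attached /dev/nbdN (nbd5 through nbd15): poll /proc/partitions until the device name appears, then confirm it serves reads with a single 4 KiB O_DIRECT `dd` (the `1+0 records in/out` lines). Below is a minimal sketch of that `waitfornbd` pattern reconstructed from the trace — the partitions-file argument and the retry delay are assumptions added here so the logic can run without real nbd devices; the logged helper in autotest_common.sh reads /proc/partitions directly.

```shell
# Sketch of the waitfornbd readiness check seen in the trace: poll the
# partitions table (up to 20 tries) until the nbd device name appears.
# The second argument is an assumption for testability; the real script
# reads /proc/partitions and then verifies the device with:
#   dd if=/dev/$nbd_name of=... bs=4096 count=1 iflag=direct
waitfornbd() {
    nbd_name=$1
    partitions_file=${2:-/proc/partitions}
    i=1
    while [ "$i" -le 20 ]; do
        # grep -w matches whole words, so "nbd1" does not match "nbd10"
        if grep -q -w "$nbd_name" "$partitions_file"; then
            return 0
        fi
        i=$((i + 1))
        sleep 0.1
    done
    return 1
}
```

In the trace, a successful grep is followed by the `break` at autotest_common.sh@872 and `return 0` at @888 once the dd read and size check pass.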
00:07:40.065 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:07:40.065 { 00:07:40.065 "nbd_device": "/dev/nbd0", 00:07:40.065 "bdev_name": "Malloc0" 00:07:40.065 }, 00:07:40.065 { 00:07:40.065 "nbd_device": "/dev/nbd1", 00:07:40.065 "bdev_name": "Malloc1p0" 00:07:40.065 }, 00:07:40.065 { 00:07:40.065 "nbd_device": "/dev/nbd2", 00:07:40.065 "bdev_name": "Malloc1p1" 00:07:40.065 }, 00:07:40.065 { 00:07:40.065 "nbd_device": "/dev/nbd3", 00:07:40.065 "bdev_name": "Malloc2p0" 00:07:40.065 }, 00:07:40.065 { 00:07:40.065 "nbd_device": "/dev/nbd4", 00:07:40.065 "bdev_name": "Malloc2p1" 00:07:40.065 }, 00:07:40.065 { 00:07:40.065 "nbd_device": "/dev/nbd5", 00:07:40.065 "bdev_name": "Malloc2p2" 00:07:40.065 }, 00:07:40.065 { 00:07:40.065 "nbd_device": "/dev/nbd6", 00:07:40.065 "bdev_name": "Malloc2p3" 00:07:40.065 }, 00:07:40.065 { 00:07:40.065 "nbd_device": "/dev/nbd7", 00:07:40.065 "bdev_name": "Malloc2p4" 00:07:40.065 }, 00:07:40.065 { 00:07:40.065 "nbd_device": "/dev/nbd8", 00:07:40.065 "bdev_name": "Malloc2p5" 00:07:40.065 }, 00:07:40.065 { 00:07:40.065 "nbd_device": "/dev/nbd9", 00:07:40.066 "bdev_name": "Malloc2p6" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd10", 00:07:40.066 "bdev_name": "Malloc2p7" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd11", 00:07:40.066 "bdev_name": "TestPT" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd12", 00:07:40.066 "bdev_name": "raid0" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd13", 00:07:40.066 "bdev_name": "concat0" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd14", 00:07:40.066 "bdev_name": "raid1" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd15", 00:07:40.066 "bdev_name": "AIO0" 00:07:40.066 } 00:07:40.066 ]' 00:07:40.066 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 
00:07:40.066 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd0", 00:07:40.066 "bdev_name": "Malloc0" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd1", 00:07:40.066 "bdev_name": "Malloc1p0" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd2", 00:07:40.066 "bdev_name": "Malloc1p1" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd3", 00:07:40.066 "bdev_name": "Malloc2p0" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd4", 00:07:40.066 "bdev_name": "Malloc2p1" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd5", 00:07:40.066 "bdev_name": "Malloc2p2" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd6", 00:07:40.066 "bdev_name": "Malloc2p3" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd7", 00:07:40.066 "bdev_name": "Malloc2p4" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd8", 00:07:40.066 "bdev_name": "Malloc2p5" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd9", 00:07:40.066 "bdev_name": "Malloc2p6" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd10", 00:07:40.066 "bdev_name": "Malloc2p7" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd11", 00:07:40.066 "bdev_name": "TestPT" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd12", 00:07:40.066 "bdev_name": "raid0" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd13", 00:07:40.066 "bdev_name": "concat0" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd14", 00:07:40.066 "bdev_name": "raid1" 00:07:40.066 }, 00:07:40.066 { 00:07:40.066 "nbd_device": "/dev/nbd15", 00:07:40.066 "bdev_name": "AIO0" 00:07:40.066 } 00:07:40.066 ]' 00:07:40.066 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:07:40.326 15:46:45 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15' 00:07:40.326 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:40.326 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15') 00:07:40.326 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:40.326 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:40.326 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:40.326 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:40.585 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:40.585 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:40.585 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:40.585 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:40.585 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:40.585 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:40.585 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:40.585 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:40.585 15:46:45 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:40.585 15:46:45 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:40.844 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:40.844 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:40.844 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:40.844 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:40.844 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:40.844 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:40.844 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:40.844 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:40.844 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:40.844 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:07:41.102 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:07:41.102 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:07:41.102 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:07:41.102 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:41.102 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:41.102 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:07:41.102 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:41.102 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:41.102 
15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:41.102 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:07:41.361 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:07:41.361 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:07:41.361 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:07:41.361 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:41.361 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:41.361 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:07:41.361 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:41.361 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:41.361 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:41.361 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:07:41.619 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:07:41.619 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:07:41.619 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:07:41.619 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:41.619 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:41.619 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 00:07:41.619 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # 
break 00:07:41.619 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:41.619 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:41.619 15:46:46 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:07:41.879 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:07:41.879 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:07:41.879 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:07:41.879 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:41.879 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:41.879 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:07:41.879 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:41.879 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:41.879 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:41.879 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:07:42.138 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:07:42.138 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:07:42.138 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:07:42.138 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:42.138 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:42.138 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep 
-q -w nbd6 /proc/partitions 00:07:42.138 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:42.138 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:42.138 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:42.138 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:07:42.396 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:07:42.396 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:07:42.396 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:07:42.396 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:42.396 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:42.396 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:07:42.397 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:42.397 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:42.397 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:42.397 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:07:42.655 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:07:42.655 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:07:42.655 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:07:42.655 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:42.655 15:46:47 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:42.655 15:46:47 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:07:42.655 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:42.655 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:42.655 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:42.655 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:07:42.914 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:07:42.914 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:07:42.914 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:07:42.914 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:42.914 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:42.914 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:07:42.914 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:42.914 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:42.914 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:42.914 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:07:43.173 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:07:43.173 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:07:43.173 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:07:43.173 15:46:48 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:43.173 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:43.173 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:07:43.173 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:43.173 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:43.173 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:43.173 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:07:43.432 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:07:43.432 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:07:43.432 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:07:43.432 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:43.432 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:43.432 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:07:43.432 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:43.432 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:43.432 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:43.432 15:46:48 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:07:43.691 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:07:43.691 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 
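The teardown pattern traced repeatedly above (once per device, nbd5 through nbd15) is: call the `nbd_stop_disk` RPC for `/dev/nbdX`, then poll until the kernel drops the device from the partition table. A minimal sketch of that polling helper, reconstructed from the `nbd_common.sh@35`–`@45` trace lines; the partitions path is parameterized here purely for illustration, while the real helper always reads `/proc/partitions`:

```shell
# Sketch of waitfornbd_exit (bdev/nbd_common.sh), as reconstructed from
# the trace above: re-check the partitions listing up to 20 times and
# stop as soon as the device name is gone.
waitfornbd_exit() {
    local nbd_name=$1
    local partitions=${2:-/proc/partitions}   # parameter is illustration-only
    local i
    for ((i = 1; i <= 20; i++)); do
        if grep -q -w "$nbd_name" "$partitions"; then
            sleep 0.1          # still exported; wait and re-check
        else
            break              # device entry disappeared
        fi
    done
    return 0                   # best-effort: succeed even on timeout
}
```

The best-effort `return 0` matches the trace: every iteration ends with `@41 break` followed by `@45 return 0`, so a stop that never confirms does not by itself fail the test.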
00:07:43.691 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:07:43.691 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:43.691 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:43.691 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:07:43.691 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:43.691 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:43.691 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:43.691 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:07:43.950 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:07:43.950 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:07:43.950 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:07:43.950 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:43.950 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:43.950 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:07:43.950 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:43.950 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:43.950 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:43.950 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:07:44.209 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # 
basename /dev/nbd14 00:07:44.209 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:07:44.209 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:07:44.209 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:44.209 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:44.209 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:07:44.209 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:44.209 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:44.209 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:44.209 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:07:44.468 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:07:44.468 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:07:44.468 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:07:44.468 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:44.468 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:44.468 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:07:44.468 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:44.468 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:44.468 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:44.468 15:46:49 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:44.468 15:46:49 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 
'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'Malloc0 Malloc1p0 Malloc1p1 Malloc2p0 Malloc2p1 Malloc2p2 Malloc2p3 Malloc2p4 Malloc2p5 Malloc2p6 Malloc2p7 TestPT raid0 concat0 raid1 AIO0' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1' 'Malloc2p0' 'Malloc2p1' 'Malloc2p2' 'Malloc2p3' 'Malloc2p4' 'Malloc2p5' 'Malloc2p6' 'Malloc2p7' 'TestPT' 'raid0' 'concat0' 'raid1' 'AIO0') 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 
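The two parallel arrays declared above drive the start loop that follows: index `i` walks `nbd_list`, pairing each `/dev/nbdX` with the bdev at the same index through the `nbd_start_disk` RPC, then waiting for the device to become usable. A self-contained sketch of that pairing logic; the `rpc.py` invocation and the `waitfornbd` readiness check are replaced by a stub and a comment so the control flow can run anywhere:

```shell
# Pairing logic of nbd_start_disks (sketch). The real loop invokes
# scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk <bdev> <dev>;
# the stub below stands in for that call.
nbd_start_disk_stub() {
    echo "start $1 on $2"      # placeholder for the RPC call
}

nbd_start_disks_sketch() {
    local bdev_list=('Malloc0' 'Malloc1p0' 'Malloc1p1')    # truncated list
    local nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10')
    local i
    for ((i = 0; i < ${#nbd_list[@]}; i++)); do
        nbd_start_disk_stub "${bdev_list[$i]}" "${nbd_list[$i]}"
        # real script then runs: waitfornbd "$(basename "${nbd_list[$i]}")"
    done
}
```

Note that the device names sort lexically (`nbd10`, `nbd11`, ... before `nbd2`), which is why `Malloc1p1` lands on `/dev/nbd10` rather than `/dev/nbd2` in the trace.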
00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:44.727 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc0 /dev/nbd0 00:07:44.986 /dev/nbd0 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:44.986 1+0 records in 00:07:44.986 1+0 records out 00:07:44.986 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000188355 s, 21.7 MB/s 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:44.986 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p0 /dev/nbd1 00:07:45.245 /dev/nbd1 00:07:45.245 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:07:45.245 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:07:45.245 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:07:45.245 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:45.245 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:45.245 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:45.245 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:07:45.505 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:45.505 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:45.505 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:45.505 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:45.505 1+0 records in 00:07:45.505 1+0 records out 00:07:45.505 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00019481 s, 21.0 MB/s 
00:07:45.505 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.505 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:45.505 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.505 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:45.505 15:46:50 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:45.505 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:45.505 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:45.505 15:46:50 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc1p1 /dev/nbd10 00:07:45.505 /dev/nbd10 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd10 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd10 /proc/partitions 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:45.764 1+0 records in 00:07:45.764 1+0 records out 00:07:45.764 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245027 s, 16.7 MB/s 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:45.764 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p0 /dev/nbd11 00:07:46.023 /dev/nbd11 00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd11 00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 
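After each device appears in `/proc/partitions`, the `waitfornbd` trace (`autotest_common.sh@884`–`@888`) shows a read probe: `dd` one 4 KiB block off the device with `iflag=direct` into a scratch file (`test/bdev/nbdtest`), `stat` the copy, require a non-zero size, then remove it. A sketch of that probe; the source device and `dd` input flags are parameters here so the logic can be exercised without a real `/dev/nbdX` (an illustration-only change, since the real helper hardcodes `iflag=direct` and the nbdtest path):

```shell
# Read-probe portion of waitfornbd (sketch, reconstructed from the
# trace): copy one 4 KiB block with O_DIRECT, then confirm via stat
# that the scratch file is non-empty before deleting it.
nbd_read_probe() {
    local dev=$1
    local scratch=$2
    local iflag=${3-iflag=direct}   # pass "" to skip O_DIRECT (for testing)
    local size
    dd if="$dev" of="$scratch" bs=4096 count=1 $iflag 2> /dev/null || return 1
    size=$(stat -c %s "$scratch")
    rm -f "$scratch"
    [ "$size" != 0 ]               # non-empty copy means the device is readable
}
```

The `1+0 records in / 1+0 records out / 4096 bytes ... copied` lines throughout this log are the stderr of exactly this `dd` call, one probe per started device.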
00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd11 /proc/partitions 00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:46.023 1+0 records in 00:07:46.023 1+0 records out 00:07:46.023 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000236065 s, 17.4 MB/s 00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:46.023 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p1 /dev/nbd12 00:07:46.283 /dev/nbd12 00:07:46.283 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd12 00:07:46.283 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd12 00:07:46.283 15:46:51 
blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd12 00:07:46.283 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:46.283 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:46.283 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:46.283 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd12 /proc/partitions 00:07:46.283 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:46.283 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:46.283 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:46.283 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd12 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:46.283 1+0 records in 00:07:46.283 1+0 records out 00:07:46.283 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000212514 s, 19.3 MB/s 00:07:46.283 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:46.283 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:46.283 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:46.283 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:46.283 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:46.283 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:46.283 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:46.283 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p2 /dev/nbd13 00:07:46.542 /dev/nbd13 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd13 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd13 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd13 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd13 /proc/partitions 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd13 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:46.542 1+0 records in 00:07:46.542 1+0 records out 00:07:46.542 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000226549 s, 18.1 MB/s 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 
'!=' 0 ']' 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:46.542 15:46:51 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p3 /dev/nbd14 00:07:46.805 /dev/nbd14 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd14 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd14 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd14 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd14 /proc/partitions 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd14 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:46.805 1+0 records in 00:07:46.805 1+0 records out 00:07:46.805 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000219319 s, 18.7 MB/s 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:46.805 15:46:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p4 /dev/nbd15 00:07:47.064 /dev/nbd15 00:07:47.064 15:46:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd15 00:07:47.064 15:46:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd15 00:07:47.064 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd15 00:07:47.064 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:47.064 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:47.064 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:47.065 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd15 /proc/partitions 00:07:47.065 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:47.065 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:47.065 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:47.065 15:46:52 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@884 -- # dd if=/dev/nbd15 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:47.065 1+0 records in 00:07:47.065 1+0 records out 00:07:47.065 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000278579 s, 14.7 MB/s 00:07:47.065 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:47.065 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:47.065 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:47.065 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:47.065 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:47.065 15:46:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:47.065 15:46:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:47.065 15:46:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p5 /dev/nbd2 00:07:47.324 /dev/nbd2 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd2 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd2 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd2 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd2 
/proc/partitions 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:47.324 1+0 records in 00:07:47.324 1+0 records out 00:07:47.324 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000262956 s, 15.6 MB/s 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:47.324 15:46:52 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p6 /dev/nbd3 00:07:47.583 /dev/nbd3 00:07:47.583 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd3 00:07:47.583 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd3 00:07:47.583 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd3 00:07:47.583 15:46:53 
blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:47.583 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:47.583 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:47.583 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd3 /proc/partitions 00:07:47.583 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:47.583 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:47.583 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:47.583 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:47.583 1+0 records in 00:07:47.583 1+0 records out 00:07:47.583 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000307303 s, 13.3 MB/s 00:07:47.583 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:47.583 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:47.583 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:47.583 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:47.583 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:47.583 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:47.583 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:47.583 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk Malloc2p7 
/dev/nbd4 00:07:47.842 /dev/nbd4 00:07:47.842 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd4 00:07:48.101 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd4 00:07:48.101 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd4 00:07:48.101 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:48.101 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:48.101 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:48.101 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd4 /proc/partitions 00:07:48.101 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:48.101 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:48.101 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:48.101 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd4 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:48.101 1+0 records in 00:07:48.101 1+0 records out 00:07:48.101 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000264511 s, 15.5 MB/s 00:07:48.101 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:48.101 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:48.101 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:48.101 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:48.101 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:48.101 
15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:48.101 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:48.101 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk TestPT /dev/nbd5 00:07:48.360 /dev/nbd5 00:07:48.360 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd5 00:07:48.360 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd5 00:07:48.360 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd5 00:07:48.360 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:48.360 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:48.360 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:48.360 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd5 /proc/partitions 00:07:48.360 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:48.360 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:48.360 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:48.361 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd5 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:48.361 1+0 records in 00:07:48.361 1+0 records out 00:07:48.361 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000356093 s, 11.5 MB/s 00:07:48.361 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:48.361 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:48.361 15:46:53 
blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:48.361 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:48.361 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:48.361 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:48.361 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:48.361 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid0 /dev/nbd6 00:07:48.620 /dev/nbd6 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd6 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd6 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd6 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd6 /proc/partitions 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd6 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:48.620 1+0 records in 00:07:48.620 1+0 records out 
00:07:48.620 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000352124 s, 11.6 MB/s 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:48.620 15:46:53 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk concat0 /dev/nbd7 00:07:48.879 /dev/nbd7 00:07:48.879 15:46:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd7 00:07:48.879 15:46:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd7 00:07:48.879 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd7 00:07:48.879 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:48.879 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:48.879 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:48.879 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd7 /proc/partitions 00:07:48.879 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:48.879 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 
00:07:48.879 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:48.879 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd7 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:48.879 1+0 records in 00:07:48.879 1+0 records out 00:07:48.879 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.0003155 s, 13.0 MB/s 00:07:48.880 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:48.880 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:48.880 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:48.880 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:48.880 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:48.880 15:46:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:48.880 15:46:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:48.880 15:46:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk raid1 /dev/nbd8 00:07:49.139 /dev/nbd8 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd8 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd8 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd8 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd8 /proc/partitions 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd8 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:49.139 1+0 records in 00:07:49.139 1+0 records out 00:07:49.139 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000320726 s, 12.8 MB/s 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:49.139 15:46:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk AIO0 /dev/nbd9 00:07:49.398 /dev/nbd9 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd9 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd9 
00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd9 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd9 /proc/partitions 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd9 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:07:49.398 1+0 records in 00:07:49.398 1+0 records out 00:07:49.398 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000359313 s, 11.4 MB/s 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 16 )) 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:49.398 15:46:54 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:49.657 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd0", 00:07:49.657 "bdev_name": "Malloc0" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd1", 00:07:49.657 "bdev_name": "Malloc1p0" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd10", 00:07:49.657 "bdev_name": "Malloc1p1" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd11", 00:07:49.657 "bdev_name": "Malloc2p0" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd12", 00:07:49.657 "bdev_name": "Malloc2p1" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd13", 00:07:49.657 "bdev_name": "Malloc2p2" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd14", 00:07:49.657 "bdev_name": "Malloc2p3" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd15", 00:07:49.657 "bdev_name": "Malloc2p4" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd2", 00:07:49.657 "bdev_name": "Malloc2p5" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd3", 00:07:49.657 "bdev_name": "Malloc2p6" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd4", 00:07:49.657 "bdev_name": "Malloc2p7" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd5", 00:07:49.657 "bdev_name": "TestPT" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd6", 00:07:49.657 "bdev_name": "raid0" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd7", 00:07:49.657 "bdev_name": "concat0" 00:07:49.657 }, 00:07:49.657 { 
00:07:49.657 "nbd_device": "/dev/nbd8", 00:07:49.657 "bdev_name": "raid1" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd9", 00:07:49.657 "bdev_name": "AIO0" 00:07:49.657 } 00:07:49.657 ]' 00:07:49.657 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd0", 00:07:49.657 "bdev_name": "Malloc0" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd1", 00:07:49.657 "bdev_name": "Malloc1p0" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd10", 00:07:49.657 "bdev_name": "Malloc1p1" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd11", 00:07:49.657 "bdev_name": "Malloc2p0" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd12", 00:07:49.657 "bdev_name": "Malloc2p1" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd13", 00:07:49.657 "bdev_name": "Malloc2p2" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd14", 00:07:49.657 "bdev_name": "Malloc2p3" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd15", 00:07:49.657 "bdev_name": "Malloc2p4" 00:07:49.657 }, 00:07:49.657 { 00:07:49.657 "nbd_device": "/dev/nbd2", 00:07:49.657 "bdev_name": "Malloc2p5" 00:07:49.657 }, 00:07:49.658 { 00:07:49.658 "nbd_device": "/dev/nbd3", 00:07:49.658 "bdev_name": "Malloc2p6" 00:07:49.658 }, 00:07:49.658 { 00:07:49.658 "nbd_device": "/dev/nbd4", 00:07:49.658 "bdev_name": "Malloc2p7" 00:07:49.658 }, 00:07:49.658 { 00:07:49.658 "nbd_device": "/dev/nbd5", 00:07:49.658 "bdev_name": "TestPT" 00:07:49.658 }, 00:07:49.658 { 00:07:49.658 "nbd_device": "/dev/nbd6", 00:07:49.658 "bdev_name": "raid0" 00:07:49.658 }, 00:07:49.658 { 00:07:49.658 "nbd_device": "/dev/nbd7", 00:07:49.658 "bdev_name": "concat0" 00:07:49.658 }, 00:07:49.658 { 00:07:49.658 "nbd_device": "/dev/nbd8", 00:07:49.658 "bdev_name": "raid1" 00:07:49.658 }, 00:07:49.658 { 00:07:49.658 "nbd_device": "/dev/nbd9", 00:07:49.658 
"bdev_name": "AIO0" 00:07:49.658 } 00:07:49.658 ]' 00:07:49.658 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:49.658 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:07:49.658 /dev/nbd1 00:07:49.658 /dev/nbd10 00:07:49.658 /dev/nbd11 00:07:49.658 /dev/nbd12 00:07:49.658 /dev/nbd13 00:07:49.658 /dev/nbd14 00:07:49.658 /dev/nbd15 00:07:49.658 /dev/nbd2 00:07:49.658 /dev/nbd3 00:07:49.658 /dev/nbd4 00:07:49.658 /dev/nbd5 00:07:49.658 /dev/nbd6 00:07:49.658 /dev/nbd7 00:07:49.658 /dev/nbd8 00:07:49.658 /dev/nbd9' 00:07:49.658 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:07:49.658 /dev/nbd1 00:07:49.658 /dev/nbd10 00:07:49.658 /dev/nbd11 00:07:49.658 /dev/nbd12 00:07:49.658 /dev/nbd13 00:07:49.658 /dev/nbd14 00:07:49.658 /dev/nbd15 00:07:49.658 /dev/nbd2 00:07:49.658 /dev/nbd3 00:07:49.658 /dev/nbd4 00:07:49.658 /dev/nbd5 00:07:49.658 /dev/nbd6 00:07:49.658 /dev/nbd7 00:07:49.658 /dev/nbd8 00:07:49.658 /dev/nbd9' 00:07:49.658 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:49.658 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=16 00:07:49.658 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 16 00:07:49.658 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=16 00:07:49.658 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 16 -ne 16 ']' 00:07:49.658 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' write 00:07:49.658 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' 
'/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:49.658 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:49.658 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:07:49.658 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:07:49.658 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:07:49.658 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:07:49.658 256+0 records in 00:07:49.658 256+0 records out 00:07:49.658 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00988607 s, 106 MB/s 00:07:49.658 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:49.658 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:07:49.917 256+0 records in 00:07:49.917 256+0 records out 00:07:49.917 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0631522 s, 16.6 MB/s 00:07:49.917 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:49.917 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:07:49.917 256+0 records in 00:07:49.917 256+0 records out 00:07:49.917 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0652144 s, 16.1 MB/s 00:07:49.917 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:49.917 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 
of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:07:49.917 256+0 records in 00:07:49.917 256+0 records out 00:07:49.917 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0652305 s, 16.1 MB/s 00:07:49.917 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:49.917 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:07:49.917 256+0 records in 00:07:49.917 256+0 records out 00:07:49.917 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0652375 s, 16.1 MB/s 00:07:49.917 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:49.917 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd12 bs=4096 count=256 oflag=direct 00:07:50.177 256+0 records in 00:07:50.177 256+0 records out 00:07:50.177 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0652468 s, 16.1 MB/s 00:07:50.177 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:50.177 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd13 bs=4096 count=256 oflag=direct 00:07:50.177 256+0 records in 00:07:50.177 256+0 records out 00:07:50.177 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0647463 s, 16.2 MB/s 00:07:50.177 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:50.177 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd14 bs=4096 count=256 oflag=direct 00:07:50.177 256+0 records in 00:07:50.177 256+0 records out 00:07:50.177 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0652703 s, 16.1 MB/s 00:07:50.177 15:46:55 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:50.177 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd15 bs=4096 count=256 oflag=direct 00:07:50.436 256+0 records in 00:07:50.436 256+0 records out 00:07:50.436 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0657414 s, 16.0 MB/s 00:07:50.436 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:50.436 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd2 bs=4096 count=256 oflag=direct 00:07:50.436 256+0 records in 00:07:50.436 256+0 records out 00:07:50.436 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0648172 s, 16.2 MB/s 00:07:50.436 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:50.436 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd3 bs=4096 count=256 oflag=direct 00:07:50.436 256+0 records in 00:07:50.436 256+0 records out 00:07:50.436 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0647475 s, 16.2 MB/s 00:07:50.436 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:50.436 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd4 bs=4096 count=256 oflag=direct 00:07:50.436 256+0 records in 00:07:50.436 256+0 records out 00:07:50.436 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0648786 s, 16.2 MB/s 00:07:50.436 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:50.436 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd5 bs=4096 count=256 
oflag=direct 00:07:50.695 256+0 records in 00:07:50.695 256+0 records out 00:07:50.695 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0648797 s, 16.2 MB/s 00:07:50.695 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:50.695 15:46:55 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd6 bs=4096 count=256 oflag=direct 00:07:50.695 256+0 records in 00:07:50.695 256+0 records out 00:07:50.695 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0641525 s, 16.3 MB/s 00:07:50.695 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:50.695 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd7 bs=4096 count=256 oflag=direct 00:07:50.695 256+0 records in 00:07:50.695 256+0 records out 00:07:50.695 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0643258 s, 16.3 MB/s 00:07:50.695 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:50.695 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd8 bs=4096 count=256 oflag=direct 00:07:50.695 256+0 records in 00:07:50.695 256+0 records out 00:07:50.695 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0672944 s, 15.6 MB/s 00:07:50.695 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:07:50.695 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd9 bs=4096 count=256 oflag=direct 00:07:50.954 256+0 records in 00:07:50.954 256+0 records out 00:07:50.954 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0639945 s, 16.4 MB/s 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@101 -- # 
nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' verify 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:50.954 15:46:56 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd12 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd13 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd14 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd15 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd2 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd3 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd4 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd5 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd6 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd7 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd8 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd9 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:50.954 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:51.213 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:51.213 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:51.213 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:51.213 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:51.213 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:51.213 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:51.213 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:51.213 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:51.213 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:51.213 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:07:51.472 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:07:51.472 15:46:56 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:07:51.472 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:07:51.472 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:51.472 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:51.472 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:07:51.472 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:51.472 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:51.472 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:51.472 15:46:56 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:07:51.731 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:07:51.731 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:07:51.731 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:07:51.731 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:51.731 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:51.731 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:07:51.731 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:51.731 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:51.731 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:51.731 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:07:51.989 15:46:57 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:07:51.989 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:07:51.989 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:07:51.989 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:51.989 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:51.989 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:07:51.989 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:51.989 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:51.989 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:51.989 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd12 00:07:52.248 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd12 00:07:52.248 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd12 00:07:52.248 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd12 00:07:52.248 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:52.248 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:52.248 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd12 /proc/partitions 00:07:52.248 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:52.248 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:52.248 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:52.248 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd13 00:07:52.507 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd13 00:07:52.507 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd13 00:07:52.507 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd13 00:07:52.507 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:52.507 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:52.507 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd13 /proc/partitions 00:07:52.507 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:52.507 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:52.507 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:52.508 15:46:57 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd14 00:07:52.831 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd14 00:07:52.831 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd14 00:07:52.831 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd14 00:07:52.831 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:52.831 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:52.831 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd14 /proc/partitions 00:07:52.831 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:52.831 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:52.831 15:46:58 blockdev_general.bdev_nbd -- 
bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:52.831 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd15 00:07:53.090 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd15 00:07:53.090 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd15 00:07:53.090 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd15 00:07:53.090 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:53.090 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:53.090 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd15 /proc/partitions 00:07:53.090 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:53.090 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:53.090 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:53.090 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:07:53.349 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:07:53.349 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:07:53.349 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:07:53.350 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:53.350 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:53.350 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:07:53.350 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:53.350 15:46:58 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:53.350 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:53.350 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:07:53.609 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:07:53.609 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:07:53.609 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:07:53.609 15:46:58 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:53.609 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:53.609 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:07:53.609 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:53.609 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:53.609 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:53.609 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd4 00:07:53.868 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd4 00:07:53.868 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd4 00:07:53.868 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd4 00:07:53.868 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:53.868 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:53.868 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd4 /proc/partitions 
00:07:53.868 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:53.868 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:53.868 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:53.868 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd5 00:07:54.126 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd5 00:07:54.126 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd5 00:07:54.126 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd5 00:07:54.126 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:54.126 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:54.126 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd5 /proc/partitions 00:07:54.126 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:54.126 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:54.126 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:54.126 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd6 00:07:54.385 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd6 00:07:54.385 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd6 00:07:54.385 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd6 00:07:54.385 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:54.385 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 
00:07:54.385 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd6 /proc/partitions 00:07:54.385 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:54.385 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:54.385 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:54.385 15:46:59 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd7 00:07:54.644 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd7 00:07:54.644 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd7 00:07:54.644 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd7 00:07:54.644 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:54.644 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:54.644 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd7 /proc/partitions 00:07:54.644 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:54.644 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:54.644 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:54.644 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd8 00:07:54.903 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd8 00:07:54.903 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd8 00:07:54.903 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd8 00:07:54.903 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- 
# (( i = 1 )) 00:07:54.903 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:54.903 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd8 /proc/partitions 00:07:54.903 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:54.903 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:54.903 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:54.903 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd9 00:07:55.471 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd9 00:07:55.471 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd9 00:07:55.471 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd9 00:07:55.471 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:55.471 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:55.471 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd9 /proc/partitions 00:07:55.471 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:55.471 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:55.471 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:07:55.471 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:55.471 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:07:55.471 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:07:55.471 15:47:00 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:07:55.471 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:07:55.731 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:07:55.731 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:07:55.731 15:47:00 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:07:55.731 15:47:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:07:55.731 15:47:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:07:55.731 15:47:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:07:55.731 15:47:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:07:55.731 15:47:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:07:55.731 15:47:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:07:55.731 15:47:01 blockdev_general.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11 /dev/nbd12 /dev/nbd13 /dev/nbd14 /dev/nbd15 /dev/nbd2 /dev/nbd3 /dev/nbd4 /dev/nbd5 /dev/nbd6 /dev/nbd7 /dev/nbd8 /dev/nbd9' 00:07:55.731 15:47:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:55.731 15:47:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:07:55.731 15:47:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:07:55.731 15:47:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:07:55.731 15:47:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:07:55.989 malloc_lvol_verify 00:07:55.989 15:47:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:07:55.989 393f7654-4b60-4d41-9f4f-d930baaf0f1d 00:07:56.248 15:47:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:07:56.248 3e8c3b88-851b-4418-a2d9-aac751929b3d 00:07:56.506 15:47:01 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:07:56.506 /dev/nbd0 00:07:56.767 15:47:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:07:56.767 mke2fs 1.46.5 (30-Dec-2021) 00:07:56.767 Discarding device blocks: 0/4096 done 00:07:56.767 Creating filesystem with 4096 1k blocks and 1024 inodes 00:07:56.767 00:07:56.767 Allocating group tables: 0/1 done 00:07:56.767 Writing inode tables: 0/1 done 00:07:56.767 Creating journal (1024 blocks): done 00:07:56.767 Writing superblocks and filesystem accounting information: 0/1 done 00:07:56.767 00:07:56.767 15:47:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:07:56.767 15:47:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:07:56.767 15:47:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:07:56.767 15:47:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:07:56.767 15:47:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:07:56.767 15:47:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:07:56.767 15:47:02 
blockdev_general.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:07:56.767 15:47:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:07:57.025 15:47:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:07:57.025 15:47:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:07:57.025 15:47:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:07:57.025 15:47:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:07:57.025 15:47:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:07:57.025 15:47:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:07:57.025 15:47:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:07:57.025 15:47:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:07:57.025 15:47:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:07:57.025 15:47:02 blockdev_general.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:07:57.025 15:47:02 blockdev_general.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2608949 00:07:57.025 15:47:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@949 -- # '[' -z 2608949 ']' 00:07:57.025 15:47:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@953 -- # kill -0 2608949 00:07:57.025 15:47:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # uname 00:07:57.025 15:47:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:07:57.026 15:47:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2608949 00:07:57.026 15:47:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:07:57.026 15:47:02 blockdev_general.bdev_nbd -- 
common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:07:57.026 15:47:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2608949' 00:07:57.026 killing process with pid 2608949 00:07:57.026 15:47:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@968 -- # kill 2608949 00:07:57.026 15:47:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@973 -- # wait 2608949 00:07:57.284 15:47:02 blockdev_general.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:07:57.284 00:07:57.284 real 0m22.793s 00:07:57.284 user 0m32.368s 00:07:57.284 sys 0m10.199s 00:07:57.284 15:47:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@1125 -- # xtrace_disable 00:07:57.284 15:47:02 blockdev_general.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:07:57.284 ************************************ 00:07:57.284 END TEST bdev_nbd 00:07:57.284 ************************************ 00:07:57.284 15:47:02 blockdev_general -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:07:57.284 15:47:02 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = nvme ']' 00:07:57.284 15:47:02 blockdev_general -- bdev/blockdev.sh@764 -- # '[' bdev = gpt ']' 00:07:57.284 15:47:02 blockdev_general -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:07:57.284 15:47:02 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:07:57.284 15:47:02 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:57.284 15:47:02 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:07:57.284 ************************************ 00:07:57.284 START TEST bdev_fio 00:07:57.284 ************************************ 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1124 -- # fio_test_suite '' 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@335 -- # 
pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:07:57.284 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=verify 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type=AIO 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z verify ']' 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1300 -- # cat 
00:07:57.284 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1312 -- # '[' verify == verify ']' 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1313 -- # cat 00:07:57.284 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1322 -- # '[' AIO == AIO ']' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # /usr/src/fio/fio --version 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1323 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1324 -- # echo serialize_overlap=1 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc0]' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc0 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p0]' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p0 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc1p1]' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc1p1 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p0]' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p0 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:57.285 15:47:02 
blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p1]' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p1 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p2]' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p2 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p3]' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p3 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p4]' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p4 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p5]' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p5 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p6]' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=Malloc2p6 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_Malloc2p7]' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo 
filename=Malloc2p7 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_TestPT]' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=TestPT 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid0]' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid0 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_concat0]' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=concat0 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_raid1]' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=raid1 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_AIO0]' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=AIO0 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k 
--runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:07:57.285 15:47:02 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:07:57.543 ************************************ 00:07:57.543 START TEST bdev_fio_rw_verify 00:07:57.543 ************************************ 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1338 -- # local sanitizers 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # shift 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # local asan_lib= 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libasan 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:07:57.543 15:47:02 
blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:07:57.543 15:47:02 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:07:57.800 job_Malloc0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.800 job_Malloc1p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.800 job_Malloc1p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.800 job_Malloc2p0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.800 job_Malloc2p1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.800 job_Malloc2p2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.800 job_Malloc2p3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.800 job_Malloc2p4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.800 job_Malloc2p5: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.800 job_Malloc2p6: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, 
(T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.800 job_Malloc2p7: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.800 job_TestPT: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.801 job_raid0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.801 job_concat0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.801 job_raid1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.801 job_AIO0: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:07:57.801 fio-3.35 00:07:57.801 Starting 16 threads 00:08:10.004 00:08:10.004 job_Malloc0: (groupid=0, jobs=16): err= 0: pid=2613536: Mon Jun 10 15:47:13 2024 00:08:10.004 read: IOPS=85.5k, BW=334MiB/s (350MB/s)(3341MiB/10001msec) 00:08:10.004 slat (nsec): min=1922, max=346131, avg=36522.48, stdev=18214.90 00:08:10.004 clat (usec): min=14, max=1263, avg=308.43, stdev=160.35 00:08:10.004 lat (usec): min=25, max=1318, avg=344.95, stdev=171.77 00:08:10.004 clat percentiles (usec): 00:08:10.004 | 50.000th=[ 293], 99.000th=[ 701], 99.900th=[ 775], 99.990th=[ 1074], 00:08:10.004 | 99.999th=[ 1205] 00:08:10.004 write: IOPS=135k, BW=527MiB/s (552MB/s)(5191MiB/9856msec); 0 zone resets 00:08:10.004 slat (usec): min=7, max=3134, avg=50.90, stdev=19.60 00:08:10.004 clat (usec): min=13, max=1585, avg=363.91, stdev=184.00 00:08:10.004 lat (usec): min=37, max=3483, avg=414.82, stdev=195.81 00:08:10.004 clat percentiles (usec): 00:08:10.004 | 50.000th=[ 338], 99.000th=[ 857], 99.900th=[ 1156], 99.990th=[ 1237], 00:08:10.004 | 99.999th=[ 1303] 00:08:10.004 bw ( KiB/s): min=430546, max=710953, per=98.70%, avg=532300.05, stdev=4896.70, samples=304 00:08:10.004 iops : 
min=107635, max=177735, avg=133073.68, stdev=1224.16, samples=304 00:08:10.004 lat (usec) : 20=0.01%, 50=0.42%, 100=4.88%, 250=29.33%, 500=47.38% 00:08:10.004 lat (usec) : 750=15.83%, 1000=1.88% 00:08:10.004 lat (msec) : 2=0.27% 00:08:10.004 cpu : usr=99.27%, sys=0.34%, ctx=591, majf=0, minf=1627 00:08:10.004 IO depths : 1=12.3%, 2=24.6%, 4=50.4%, 8=12.6%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:10.004 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:10.004 complete : 0=0.0%, 4=89.1%, 8=10.9%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:10.004 issued rwts: total=855361,1328881,0,0 short=0,0,0,0 dropped=0,0,0,0 00:08:10.004 latency : target=0, window=0, percentile=100.00%, depth=8 00:08:10.004 00:08:10.004 Run status group 0 (all jobs): 00:08:10.004 READ: bw=334MiB/s (350MB/s), 334MiB/s-334MiB/s (350MB/s-350MB/s), io=3341MiB (3504MB), run=10001-10001msec 00:08:10.004 WRITE: bw=527MiB/s (552MB/s), 527MiB/s-527MiB/s (552MB/s-552MB/s), io=5191MiB (5443MB), run=9856-9856msec 00:08:10.004 00:08:10.004 real 0m11.414s 00:08:10.004 user 2m48.743s 00:08:10.004 sys 0m1.646s 00:08:10.004 15:47:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:10.004 15:47:14 blockdev_general.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:08:10.004 ************************************ 00:08:10.004 END TEST bdev_fio_rw_verify 00:08:10.004 ************************************ 00:08:10.004 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:08:10.004 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:10.004 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:08:10.004 15:47:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1279 -- # local 
config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:10.004 15:47:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=trim 00:08:10.004 15:47:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type= 00:08:10.004 15:47:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:08:10.004 15:47:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:08:10.004 15:47:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:08:10.004 15:47:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z trim ']' 00:08:10.004 15:47:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:08:10.004 15:47:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:10.004 15:47:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:08:10.004 15:47:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1312 -- # '[' trim == verify ']' 00:08:10.004 15:47:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1327 -- # '[' trim == trim ']' 00:08:10.004 15:47:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1328 -- # echo rw=trimwrite 00:08:10.004 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:08:10.005 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "ef39c1dd-4aae-4a37-bf6a-30c01da83ea7"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ef39c1dd-4aae-4a37-bf6a-30c01da83ea7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' 
"r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "2c706d87-de1b-59dc-8209-c4e04762af52"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "2c706d87-de1b-59dc-8209-c4e04762af52",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "52ba6ad6-c80d-5115-bce1-3544c2fe835a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "52ba6ad6-c80d-5115-bce1-3544c2fe835a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": 
{' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "e0723f33-2174-5066-b830-959a10522ba9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "e0723f33-2174-5066-b830-959a10522ba9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "9eefeea6-1e23-54d2-b872-0ee3937d11e2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "9eefeea6-1e23-54d2-b872-0ee3937d11e2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "fab681ea-e899-525d-a19e-458983bbac92"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "fab681ea-e899-525d-a19e-458983bbac92",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": 
false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "0e5efa97-7aea-5f4a-89be-9cafbbad6af7"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0e5efa97-7aea-5f4a-89be-9cafbbad6af7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "33b14985-c68a-58e6-a058-db93c606f53b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "33b14985-c68a-58e6-a058-db93c606f53b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "1e0fce14-dda7-5cfd-b8a8-5dc71c46a2d1"' ' ],' ' 
"product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1e0fce14-dda7-5cfd-b8a8-5dc71c46a2d1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "8ad6f11d-21ea-5c39-94cd-165ca4f74e7a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "8ad6f11d-21ea-5c39-94cd-165ca4f74e7a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "7fe7b45d-2696-5735-aabd-d0e1ff0a19ae"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7fe7b45d-2696-5735-aabd-d0e1ff0a19ae",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' "043436c2-d82e-5478-b73f-b6d730ff0d4a"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "043436c2-d82e-5478-b73f-b6d730ff0d4a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "917a38d6-a1af-4820-91f7-ac54a5b92219"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "917a38d6-a1af-4820-91f7-ac54a5b92219",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' 
' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "917a38d6-a1af-4820-91f7-ac54a5b92219",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "4f4ef910-7534-4851-af9f-f1e1887f1ded",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "513b55f0-736e-433a-aba1-8ff25a06c835",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "d60d76a3-deb8-4c63-b6f7-c810b8b0f488"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "d60d76a3-deb8-4c63-b6f7-c810b8b0f488",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "d60d76a3-deb8-4c63-b6f7-c810b8b0f488",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 
2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "0af73327-78cc-47df-9f47-d4318dbd77b5",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": "10421489-1baf-4333-9437-07f396cb9815",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "fc2a8214-740f-46f4-94c1-4feb26f228bc"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "fc2a8214-740f-46f4-94c1-4feb26f228bc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "fc2a8214-740f-46f4-94c1-4feb26f228bc",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "c7f05316-48e3-443a-a21d-cd4404f89ae3",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "e533cf7b-ba4d-47d6-96bc-86089a81efbf",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' 
"0ea3f1f9-8469-464a-aea4-a557bece1444"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "0ea3f1f9-8469-464a-aea4-a557bece1444",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:10.006 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n Malloc0 00:08:10.006 Malloc1p0 00:08:10.006 Malloc1p1 00:08:10.006 Malloc2p0 00:08:10.006 Malloc2p1 00:08:10.006 Malloc2p2 00:08:10.006 Malloc2p3 00:08:10.006 Malloc2p4 00:08:10.006 Malloc2p5 00:08:10.006 Malloc2p6 00:08:10.006 Malloc2p7 00:08:10.006 TestPT 00:08:10.006 raid0 00:08:10.006 concat0 ]] 00:08:10.006 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "Malloc0",' ' "aliases": [' ' "ef39c1dd-4aae-4a37-bf6a-30c01da83ea7"' ' ],' ' "product_name": "Malloc disk",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ef39c1dd-4aae-4a37-bf6a-30c01da83ea7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 20000,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {}' '}' '{' ' "name": "Malloc1p0",' ' "aliases": [' ' "2c706d87-de1b-59dc-8209-c4e04762af52"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "2c706d87-de1b-59dc-8209-c4e04762af52",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc1p1",' ' "aliases": [' ' "52ba6ad6-c80d-5115-bce1-3544c2fe835a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "52ba6ad6-c80d-5115-bce1-3544c2fe835a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc1",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p0",' ' "aliases": [' ' "e0723f33-2174-5066-b830-959a10522ba9"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' 
"uuid": "e0723f33-2174-5066-b830-959a10522ba9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 0' ' }' ' }' '}' '{' ' "name": "Malloc2p1",' ' "aliases": [' ' "9eefeea6-1e23-54d2-b872-0ee3937d11e2"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "9eefeea6-1e23-54d2-b872-0ee3937d11e2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 8192' ' }' ' }' '}' '{' ' "name": "Malloc2p2",' ' "aliases": [' ' "fab681ea-e899-525d-a19e-458983bbac92"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "fab681ea-e899-525d-a19e-458983bbac92",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": 
false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 16384' ' }' ' }' '}' '{' ' "name": "Malloc2p3",' ' "aliases": [' ' "0e5efa97-7aea-5f4a-89be-9cafbbad6af7"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "0e5efa97-7aea-5f4a-89be-9cafbbad6af7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 24576' ' }' ' }' '}' '{' ' "name": "Malloc2p4",' ' "aliases": [' ' "33b14985-c68a-58e6-a058-db93c606f53b"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "33b14985-c68a-58e6-a058-db93c606f53b",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 32768' ' }' ' }' '}' '{' ' "name": "Malloc2p5",' ' "aliases": [' ' "1e0fce14-dda7-5cfd-b8a8-5dc71c46a2d1"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "1e0fce14-dda7-5cfd-b8a8-5dc71c46a2d1",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' 
},' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 40960' ' }' ' }' '}' '{' ' "name": "Malloc2p6",' ' "aliases": [' ' "8ad6f11d-21ea-5c39-94cd-165ca4f74e7a"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "8ad6f11d-21ea-5c39-94cd-165ca4f74e7a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 49152' ' }' ' }' '}' '{' ' "name": "Malloc2p7",' ' "aliases": [' ' "7fe7b45d-2696-5735-aabd-d0e1ff0a19ae"' ' ],' ' "product_name": "Split Disk",' ' "block_size": 512,' ' "num_blocks": 8192,' ' "uuid": "7fe7b45d-2696-5735-aabd-d0e1ff0a19ae",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "split": {' ' "base_bdev": "Malloc2",' ' "offset_blocks": 57344' ' }' ' }' '}' '{' ' "name": "TestPT",' ' "aliases": [' ' 
"043436c2-d82e-5478-b73f-b6d730ff0d4a"' ' ],' ' "product_name": "passthru",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "043436c2-d82e-5478-b73f-b6d730ff0d4a",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": true,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "passthru": {' ' "name": "TestPT",' ' "base_bdev_name": "Malloc3"' ' }' ' }' '}' '{' ' "name": "raid0",' ' "aliases": [' ' "917a38d6-a1af-4820-91f7-ac54a5b92219"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "917a38d6-a1af-4820-91f7-ac54a5b92219",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "917a38d6-a1af-4820-91f7-ac54a5b92219",' ' "strip_size_kb": 64,' ' "state": 
"online",' ' "raid_level": "raid0",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc4",' ' "uuid": "4f4ef910-7534-4851-af9f-f1e1887f1ded",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc5",' ' "uuid": "513b55f0-736e-433a-aba1-8ff25a06c835",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "concat0",' ' "aliases": [' ' "d60d76a3-deb8-4c63-b6f7-c810b8b0f488"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 131072,' ' "uuid": "d60d76a3-deb8-4c63-b6f7-c810b8b0f488",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "d60d76a3-deb8-4c63-b6f7-c810b8b0f488",' ' "strip_size_kb": 64,' ' "state": "online",' ' "raid_level": "concat",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc6",' ' "uuid": "0af73327-78cc-47df-9f47-d4318dbd77b5",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc7",' ' "uuid": 
"10421489-1baf-4333-9437-07f396cb9815",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "raid1",' ' "aliases": [' ' "fc2a8214-740f-46f4-94c1-4feb26f228bc"' ' ],' ' "product_name": "Raid Volume",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "fc2a8214-740f-46f4-94c1-4feb26f228bc",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": false,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "raid": {' ' "uuid": "fc2a8214-740f-46f4-94c1-4feb26f228bc",' ' "strip_size_kb": 0,' ' "state": "online",' ' "raid_level": "raid1",' ' "superblock": false,' ' "num_base_bdevs": 2,' ' "num_base_bdevs_discovered": 2,' ' "num_base_bdevs_operational": 2,' ' "base_bdevs_list": [' ' {' ' "name": "Malloc8",' ' "uuid": "c7f05316-48e3-443a-a21d-cd4404f89ae3",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' },' ' {' ' "name": "Malloc9",' ' "uuid": "e533cf7b-ba4d-47d6-96bc-86089a81efbf",' ' "is_configured": true,' ' "data_offset": 0,' ' "data_size": 65536' ' }' ' ]' ' }' ' }' '}' '{' ' "name": "AIO0",' ' "aliases": [' ' "0ea3f1f9-8469-464a-aea4-a557bece1444"' ' ],' ' "product_name": "AIO disk",' ' "block_size": 2048,' ' "num_blocks": 5000,' ' "uuid": "0ea3f1f9-8469-464a-aea4-a557bece1444",' ' "assigned_rate_limits": {' ' 
"rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": false,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "driver_specific": {' ' "aio": {' ' "filename": "/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile",' ' "block_size_override": true,' ' "readonly": false,' ' "fallocate": false' ' }' ' }' '}' 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc0]' 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc0 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p0]' 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p0 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc1p1]' 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc1p1 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 
-- # echo '[job_Malloc2p0]' 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p0 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p1]' 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p1 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p2]' 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p2 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p3]' 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p3 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p4]' 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p4 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p5]' 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo 
filename=Malloc2p5 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p6]' 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p6 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_Malloc2p7]' 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=Malloc2p7 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_TestPT]' 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=TestPT 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_raid0]' 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=raid0 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_concat0]' 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=concat0 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev 
--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:10.007 15:47:14 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:08:10.007 ************************************ 00:08:10.007 START TEST bdev_fio_trim 00:08:10.007 ************************************ 00:08:10.007 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:10.007 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:10.007 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:08:10.007 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:08:10.007 15:47:14 
blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local sanitizers 00:08:10.007 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:10.007 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # shift 00:08:10.007 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # local asan_lib= 00:08:10.007 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:08:10.007 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:10.007 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libasan 00:08:10.007 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:08:10.008 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:08:10.008 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:08:10.008 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:08:10.008 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:08:10.008 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:08:10.008 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:08:10.008 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:08:10.008 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:08:10.008 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:08:10.008 15:47:14 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:08:10.008 job_Malloc0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:10.008 job_Malloc1p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:10.008 job_Malloc1p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:10.008 job_Malloc2p0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:10.008 job_Malloc2p1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:10.008 job_Malloc2p2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:10.008 job_Malloc2p3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:10.008 job_Malloc2p4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:10.008 job_Malloc2p5: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:10.008 job_Malloc2p6: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 
00:08:10.008 job_Malloc2p7: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:10.008 job_TestPT: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:10.008 job_raid0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:10.008 job_concat0: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:08:10.008 fio-3.35 00:08:10.008 Starting 14 threads 00:08:22.213 00:08:22.213 job_Malloc0: (groupid=0, jobs=14): err= 0: pid=2615708: Mon Jun 10 15:47:25 2024 00:08:22.213 write: IOPS=125k, BW=490MiB/s (513MB/s)(4896MiB/10001msec); 0 zone resets 00:08:22.213 slat (usec): min=7, max=2767, avg=39.41, stdev=11.41 00:08:22.213 clat (usec): min=34, max=3046, avg=278.58, stdev=99.28 00:08:22.213 lat (usec): min=48, max=3091, avg=317.99, stdev=103.96 00:08:22.213 clat percentiles (usec): 00:08:22.213 | 50.000th=[ 269], 99.000th=[ 529], 99.900th=[ 586], 99.990th=[ 627], 00:08:22.213 | 99.999th=[ 955] 00:08:22.213 bw ( KiB/s): min=435872, max=650017, per=100.00%, avg=502920.47, stdev=3643.46, samples=266 00:08:22.213 iops : min=108968, max=162501, avg=125729.95, stdev=910.83, samples=266 00:08:22.213 trim: IOPS=125k, BW=490MiB/s (513MB/s)(4896MiB/10001msec); 0 zone resets 00:08:22.213 slat (usec): min=4, max=335, avg=26.70, stdev= 7.40 00:08:22.213 clat (usec): min=42, max=3092, avg=318.18, stdev=103.97 00:08:22.213 lat (usec): min=55, max=3108, avg=344.88, stdev=107.36 00:08:22.213 clat percentiles (usec): 00:08:22.213 | 50.000th=[ 310], 99.000th=[ 578], 99.900th=[ 644], 99.990th=[ 693], 00:08:22.213 | 99.999th=[ 1045] 00:08:22.213 bw ( KiB/s): min=435872, max=650017, per=100.00%, avg=502920.47, stdev=3643.46, samples=266 00:08:22.213 iops : min=108968, max=162501, avg=125729.95, stdev=910.83, samples=266 00:08:22.213 lat (usec) : 50=0.03%, 100=0.75%, 
250=35.43%, 500=60.03%, 750=3.77% 00:08:22.213 lat (usec) : 1000=0.01% 00:08:22.213 lat (msec) : 2=0.01%, 4=0.01% 00:08:22.213 cpu : usr=99.61%, sys=0.00%, ctx=593, majf=0, minf=713 00:08:22.213 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:08:22.213 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:22.213 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:08:22.213 issued rwts: total=0,1253430,1253433,0 short=0,0,0,0 dropped=0,0,0,0 00:08:22.213 latency : target=0, window=0, percentile=100.00%, depth=8 00:08:22.213 00:08:22.213 Run status group 0 (all jobs): 00:08:22.213 WRITE: bw=490MiB/s (513MB/s), 490MiB/s-490MiB/s (513MB/s-513MB/s), io=4896MiB (5134MB), run=10001-10001msec 00:08:22.213 TRIM: bw=490MiB/s (513MB/s), 490MiB/s-490MiB/s (513MB/s-513MB/s), io=4896MiB (5134MB), run=10001-10001msec 00:08:22.213 00:08:22.213 real 0m11.729s 00:08:22.213 user 2m30.049s 00:08:22.213 sys 0m0.690s 00:08:22.213 15:47:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:22.213 15:47:26 blockdev_general.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:08:22.213 ************************************ 00:08:22.213 END TEST bdev_fio_trim 00:08:22.213 ************************************ 00:08:22.213 15:47:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:08:22.213 15:47:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:08:22.213 15:47:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:08:22.213 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:08:22.213 15:47:26 blockdev_general.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:08:22.213 00:08:22.213 real 0m23.482s 00:08:22.213 user 5m18.992s 00:08:22.213 sys 0m2.498s 00:08:22.213 15:47:26 blockdev_general.bdev_fio -- 
common/autotest_common.sh@1125 -- # xtrace_disable 00:08:22.213 15:47:26 blockdev_general.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:08:22.213 ************************************ 00:08:22.213 END TEST bdev_fio 00:08:22.213 ************************************ 00:08:22.213 15:47:26 blockdev_general -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:08:22.213 15:47:26 blockdev_general -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:08:22.213 15:47:26 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:08:22.213 15:47:26 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:22.213 15:47:26 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:22.213 ************************************ 00:08:22.213 START TEST bdev_verify 00:08:22.213 ************************************ 00:08:22.213 15:47:26 blockdev_general.bdev_verify -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:08:22.213 [2024-06-10 15:47:26.297493] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:08:22.213 [2024-06-10 15:47:26.297548] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2617554 ] 00:08:22.213 [2024-06-10 15:47:26.396321] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:22.213 [2024-06-10 15:47:26.489057] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:08:22.213 [2024-06-10 15:47:26.489062] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:22.214 [2024-06-10 15:47:26.630091] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:22.214 [2024-06-10 15:47:26.630137] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:22.214 [2024-06-10 15:47:26.630148] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:22.214 [2024-06-10 15:47:26.638096] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:22.214 [2024-06-10 15:47:26.638121] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:22.214 [2024-06-10 15:47:26.646111] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:22.214 [2024-06-10 15:47:26.646133] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:22.214 [2024-06-10 15:47:26.717873] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:22.214 [2024-06-10 15:47:26.717920] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:22.214 [2024-06-10 15:47:26.717935] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x11ac470 00:08:22.214 [2024-06-10 15:47:26.717945] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:22.214 [2024-06-10 
15:47:26.719569] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:22.214 [2024-06-10 15:47:26.719601] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:22.214 Running I/O for 5 seconds... 00:08:27.488 00:08:27.488 Latency(us) 00:08:27.488 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:27.488 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x0 length 0x1000 00:08:27.488 Malloc0 : 5.18 1088.12 4.25 0.00 0.00 117428.80 643.66 238675.87 00:08:27.488 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x1000 length 0x1000 00:08:27.488 Malloc0 : 5.21 1080.72 4.22 0.00 0.00 118225.38 639.76 387473.80 00:08:27.488 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x0 length 0x800 00:08:27.488 Malloc1p0 : 5.23 562.94 2.20 0.00 0.00 226309.00 3760.52 229688.08 00:08:27.488 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x800 length 0x800 00:08:27.488 Malloc1p0 : 5.21 564.68 2.21 0.00 0.00 225635.15 3822.93 215707.06 00:08:27.488 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x0 length 0x800 00:08:27.488 Malloc1p1 : 5.23 562.54 2.20 0.00 0.00 225822.04 3729.31 223696.21 00:08:27.488 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x800 length 0x800 00:08:27.488 Malloc1p1 : 5.22 564.43 2.20 0.00 0.00 225080.60 3713.71 211712.49 00:08:27.488 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x0 length 0x200 00:08:27.488 Malloc2p0 : 5.24 562.16 2.20 0.00 0.00 225343.65 3760.52 219701.64 
00:08:27.488 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x200 length 0x200 00:08:27.488 Malloc2p0 : 5.22 564.19 2.20 0.00 0.00 224541.51 3729.31 206719.27 00:08:27.488 Job: Malloc2p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x0 length 0x200 00:08:27.488 Malloc2p1 : 5.24 561.79 2.19 0.00 0.00 224874.59 3744.91 214708.42 00:08:27.488 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x200 length 0x200 00:08:27.488 Malloc2p1 : 5.22 563.92 2.20 0.00 0.00 224020.27 3776.12 201726.05 00:08:27.488 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x0 length 0x200 00:08:27.488 Malloc2p2 : 5.24 561.50 2.19 0.00 0.00 224326.41 3760.52 210713.84 00:08:27.488 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x200 length 0x200 00:08:27.488 Malloc2p2 : 5.22 563.66 2.20 0.00 0.00 223457.13 3760.52 196732.83 00:08:27.488 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x0 length 0x200 00:08:27.488 Malloc2p3 : 5.25 560.88 2.19 0.00 0.00 223941.38 3698.10 206719.27 00:08:27.488 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x200 length 0x200 00:08:27.488 Malloc2p3 : 5.23 563.39 2.20 0.00 0.00 222936.35 3744.91 193736.90 00:08:27.488 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x0 length 0x200 00:08:27.488 Malloc2p4 : 5.25 560.35 2.19 0.00 0.00 223548.99 3791.73 202724.69 00:08:27.488 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x200 length 0x200 
00:08:27.488 Malloc2p4 : 5.23 563.13 2.20 0.00 0.00 222429.54 3744.91 189742.32 00:08:27.488 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x0 length 0x200 00:08:27.488 Malloc2p5 : 5.26 560.10 2.19 0.00 0.00 223034.58 3744.91 200727.41 00:08:27.488 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x200 length 0x200 00:08:27.488 Malloc2p5 : 5.23 562.87 2.20 0.00 0.00 221924.75 3729.31 185747.75 00:08:27.488 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x0 length 0x200 00:08:27.488 Malloc2p6 : 5.26 559.87 2.19 0.00 0.00 222528.74 3651.29 198730.12 00:08:27.488 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x200 length 0x200 00:08:27.488 Malloc2p6 : 5.23 562.46 2.20 0.00 0.00 221493.18 3635.69 183750.46 00:08:27.488 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x0 length 0x200 00:08:27.488 Malloc2p7 : 5.26 559.63 2.19 0.00 0.00 221995.55 3588.88 198730.12 00:08:27.488 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x200 length 0x200 00:08:27.488 Malloc2p7 : 5.24 562.08 2.20 0.00 0.00 221033.86 3620.08 179755.89 00:08:27.488 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x0 length 0x1000 00:08:27.488 TestPT : 5.28 557.58 2.18 0.00 0.00 222087.13 15728.64 199728.76 00:08:27.488 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:27.488 Verification LBA range: start 0x1000 length 0x1000 00:08:27.488 TestPT : 5.26 537.72 2.10 0.00 0.00 229407.81 17101.78 271631.12 00:08:27.488 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 
00:08:27.489 Verification LBA range: start 0x0 length 0x2000 00:08:27.489 raid0 : 5.27 559.13 2.18 0.00 0.00 220688.02 3604.48 178757.24 00:08:27.489 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:27.489 Verification LBA range: start 0x2000 length 0x2000 00:08:27.489 raid0 : 5.24 561.51 2.19 0.00 0.00 219739.77 3635.69 161780.30 00:08:27.489 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:27.489 Verification LBA range: start 0x0 length 0x2000 00:08:27.489 concat0 : 5.27 558.68 2.18 0.00 0.00 220270.35 3682.50 175761.31 00:08:27.489 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:27.489 Verification LBA range: start 0x2000 length 0x2000 00:08:27.489 concat0 : 5.25 560.90 2.19 0.00 0.00 219430.62 3666.90 159783.01 00:08:27.489 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:27.489 Verification LBA range: start 0x0 length 0x1000 00:08:27.489 raid1 : 5.27 558.41 2.18 0.00 0.00 219722.46 4275.44 170768.09 00:08:27.489 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:27.489 Verification LBA range: start 0x1000 length 0x1000 00:08:27.489 raid1 : 5.27 582.47 2.28 0.00 0.00 210710.71 2949.12 165774.87 00:08:27.489 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:08:27.489 Verification LBA range: start 0x0 length 0x4e2 00:08:27.489 AIO0 : 5.27 558.20 2.18 0.00 0.00 219141.32 1458.96 170768.09 00:08:27.489 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:08:27.489 Verification LBA range: start 0x4e2 length 0x4e2 00:08:27.489 AIO0 : 5.28 581.90 2.27 0.00 0.00 210284.50 1458.96 173764.02 00:08:27.489 =================================================================================================================== 00:08:27.489 Total : 19031.89 74.34 0.00 0.00 210365.96 639.76 387473.80 00:08:27.489 00:08:27.489 real 0m6.414s 00:08:27.489 user 0m12.003s 00:08:27.489 sys 0m0.331s 
00:08:27.489 15:47:32 blockdev_general.bdev_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:27.489 15:47:32 blockdev_general.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:08:27.489 ************************************ 00:08:27.489 END TEST bdev_verify 00:08:27.489 ************************************ 00:08:27.489 15:47:32 blockdev_general -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:08:27.489 15:47:32 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:08:27.489 15:47:32 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:27.489 15:47:32 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:27.489 ************************************ 00:08:27.489 START TEST bdev_verify_big_io 00:08:27.489 ************************************ 00:08:27.489 15:47:32 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:08:27.489 [2024-06-10 15:47:32.784720] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:08:27.489 [2024-06-10 15:47:32.784778] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2618533 ] 00:08:27.489 [2024-06-10 15:47:32.882278] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:08:27.489 [2024-06-10 15:47:32.977832] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:08:27.489 [2024-06-10 15:47:32.977838] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:27.749 [2024-06-10 15:47:33.120701] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:27.749 [2024-06-10 15:47:33.120751] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:27.749 [2024-06-10 15:47:33.120762] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:27.749 [2024-06-10 15:47:33.128708] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:27.749 [2024-06-10 15:47:33.128731] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:27.749 [2024-06-10 15:47:33.136721] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:27.749 [2024-06-10 15:47:33.136743] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:27.749 [2024-06-10 15:47:33.209072] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:27.749 [2024-06-10 15:47:33.209120] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:27.749 [2024-06-10 15:47:33.209135] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1b67470 00:08:27.749 [2024-06-10 15:47:33.209145] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:27.749 [2024-06-10 
15:47:33.210725] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:27.749 [2024-06-10 15:47:33.210751] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:28.008 [2024-06-10 15:47:33.377084] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:08:28.009 [2024-06-10 15:47:33.378199] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p0 simultaneously (32). Queue depth is limited to 32 00:08:28.009 [2024-06-10 15:47:33.379812] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:08:28.009 [2024-06-10 15:47:33.380897] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p1 simultaneously (32). Queue depth is limited to 32 00:08:28.009 [2024-06-10 15:47:33.382541] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). Queue depth is limited to 32 00:08:28.009 [2024-06-10 15:47:33.383621] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p2 simultaneously (32). 
Queue depth is limited to 32 00:08:28.009 [2024-06-10 15:47:33.385117] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:08:28.009 [2024-06-10 15:47:33.386411] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p3 simultaneously (32). Queue depth is limited to 32 00:08:28.009 [2024-06-10 15:47:33.387248] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:08:28.009 [2024-06-10 15:47:33.388539] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p4 simultaneously (32). Queue depth is limited to 32 00:08:28.009 [2024-06-10 15:47:33.389352] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:08:28.009 [2024-06-10 15:47:33.390654] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p5 simultaneously (32). Queue depth is limited to 32 00:08:28.009 [2024-06-10 15:47:33.391466] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). 
Queue depth is limited to 32 00:08:28.009 [2024-06-10 15:47:33.392752] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p6 simultaneously (32). Queue depth is limited to 32 00:08:28.009 [2024-06-10 15:47:33.393573] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:08:28.009 [2024-06-10 15:47:33.394869] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev Malloc2p7 simultaneously (32). Queue depth is limited to 32 00:08:28.009 [2024-06-10 15:47:33.415797] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:08:28.009 [2024-06-10 15:47:33.417552] bdevperf.c:1818:bdevperf_construct_job: *WARNING*: Due to constraints of verify job, queue depth (-q, 128) can't exceed the number of IO requests which can be submitted to the bdev AIO0 simultaneously (78). Queue depth is limited to 78 00:08:28.009 Running I/O for 5 seconds... 
00:08:36.202 00:08:36.202 Latency(us) 00:08:36.202 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:36.202 Job: Malloc0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x0 length 0x100 00:08:36.202 Malloc0 : 5.99 128.31 8.02 0.00 0.00 975821.52 959.63 2588484.75 00:08:36.202 Job: Malloc0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x100 length 0x100 00:08:36.202 Malloc0 : 6.10 125.89 7.87 0.00 0.00 995056.10 932.33 2604463.06 00:08:36.202 Job: Malloc1p0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x0 length 0x80 00:08:36.202 Malloc1p0 : 6.49 49.30 3.08 0.00 0.00 2375546.95 2933.52 3754900.72 00:08:36.202 Job: Malloc1p0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x80 length 0x80 00:08:36.202 Malloc1p0 : 6.46 60.69 3.79 0.00 0.00 1942910.07 3198.78 3115768.69 00:08:36.202 Job: Malloc1p1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x0 length 0x80 00:08:36.202 Malloc1p1 : 6.95 32.24 2.02 0.00 0.00 3396263.10 1575.98 5624361.94 00:08:36.202 Job: Malloc1p1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x80 length 0x80 00:08:36.202 Malloc1p1 : 6.97 32.15 2.01 0.00 0.00 3428659.18 1544.78 5688275.14 00:08:36.202 Job: Malloc2p0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x0 length 0x20 00:08:36.202 Malloc2p0 : 6.41 22.48 1.41 0.00 0.00 1248914.90 655.36 2173048.93 00:08:36.202 Job: Malloc2p0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x20 length 0x20 00:08:36.202 Malloc2p0 : 6.38 20.06 1.25 0.00 0.00 1375959.24 655.36 2220983.83 00:08:36.202 Job: Malloc2p1 (Core Mask 0x1, 
workload: verify, depth: 32, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x0 length 0x20 00:08:36.202 Malloc2p1 : 6.41 22.48 1.40 0.00 0.00 1236879.32 647.56 2141092.33 00:08:36.202 Job: Malloc2p1 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x20 length 0x20 00:08:36.202 Malloc2p1 : 6.38 20.05 1.25 0.00 0.00 1361486.34 655.36 2189027.23 00:08:36.202 Job: Malloc2p2 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x0 length 0x20 00:08:36.202 Malloc2p2 : 6.41 22.47 1.40 0.00 0.00 1225031.88 643.66 2109135.73 00:08:36.202 Job: Malloc2p2 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x20 length 0x20 00:08:36.202 Malloc2p2 : 6.46 22.29 1.39 0.00 0.00 1237198.95 659.26 2157070.63 00:08:36.202 Job: Malloc2p3 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x0 length 0x20 00:08:36.202 Malloc2p3 : 6.41 22.47 1.40 0.00 0.00 1213188.00 667.06 2093157.42 00:08:36.202 Job: Malloc2p3 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x20 length 0x20 00:08:36.202 Malloc2p3 : 6.46 22.28 1.39 0.00 0.00 1224357.05 663.16 2125114.03 00:08:36.202 Job: Malloc2p4 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x0 length 0x20 00:08:36.202 Malloc2p4 : 6.41 22.46 1.40 0.00 0.00 1200486.93 651.46 2061200.82 00:08:36.202 Job: Malloc2p4 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x20 length 0x20 00:08:36.202 Malloc2p4 : 6.46 22.28 1.39 0.00 0.00 1211985.61 663.16 2093157.42 00:08:36.202 Job: Malloc2p5 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x0 length 0x20 00:08:36.202 Malloc2p5 : 6.41 22.46 1.40 0.00 0.00 1188308.91 
651.46 2021255.07 00:08:36.202 Job: Malloc2p5 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x20 length 0x20 00:08:36.202 Malloc2p5 : 6.47 22.27 1.39 0.00 0.00 1200632.48 655.36 2077179.12 00:08:36.202 Job: Malloc2p6 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x0 length 0x20 00:08:36.202 Malloc2p6 : 6.41 22.45 1.40 0.00 0.00 1176756.97 850.41 1989298.47 00:08:36.202 Job: Malloc2p6 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x20 length 0x20 00:08:36.202 Malloc2p6 : 6.47 22.27 1.39 0.00 0.00 1188216.59 651.46 2037233.37 00:08:36.202 Job: Malloc2p7 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x0 length 0x20 00:08:36.202 Malloc2p7 : 6.42 22.45 1.40 0.00 0.00 1164397.70 651.46 1965331.02 00:08:36.202 Job: Malloc2p7 (Core Mask 0x2, workload: verify, depth: 32, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x20 length 0x20 00:08:36.202 Malloc2p7 : 6.47 22.26 1.39 0.00 0.00 1176129.06 651.46 2005276.77 00:08:36.202 Job: TestPT (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x0 length 0x100 00:08:36.202 TestPT : 6.99 32.34 2.02 0.00 0.00 3055571.67 115343.36 3930662.03 00:08:36.202 Job: TestPT (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x100 length 0x100 00:08:36.202 TestPT : 7.03 31.84 1.99 0.00 0.00 3110808.90 97367.77 3962618.64 00:08:36.202 Job: raid0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x0 length 0x200 00:08:36.202 raid0 : 7.12 38.22 2.39 0.00 0.00 2517799.40 1661.81 4761533.68 00:08:36.202 Job: raid0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x200 length 0x200 00:08:36.202 
raid0 : 6.87 37.28 2.33 0.00 0.00 2571589.32 1646.20 4825446.89 00:08:36.202 Job: concat0 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x0 length 0x200 00:08:36.202 concat0 : 6.99 51.22 3.20 0.00 0.00 1864303.44 1638.40 4569794.07 00:08:36.202 Job: concat0 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x200 length 0x200 00:08:36.202 concat0 : 6.97 46.62 2.91 0.00 0.00 2033893.23 1669.61 4601750.67 00:08:36.202 Job: raid1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x0 length 0x100 00:08:36.202 raid1 : 6.99 58.49 3.66 0.00 0.00 1573370.63 2309.36 4346097.86 00:08:36.202 Job: raid1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x100 length 0x100 00:08:36.202 raid1 : 7.04 58.12 3.63 0.00 0.00 1580582.64 2278.16 4378054.46 00:08:36.202 Job: AIO0 (Core Mask 0x1, workload: verify, depth: 78, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x0 length 0x4e 00:08:36.202 AIO0 : 7.12 55.63 3.48 0.00 0.00 983096.59 643.66 3099790.38 00:08:36.202 Job: AIO0 (Core Mask 0x2, workload: verify, depth: 78, IO size: 65536) 00:08:36.202 Verification LBA range: start 0x4e length 0x4e 00:08:36.202 AIO0 : 7.13 59.74 3.73 0.00 0.00 915432.20 807.50 3211638.49 00:08:36.202 =================================================================================================================== 00:08:36.202 Total : 1251.55 78.22 0.00 0.00 1629623.30 643.66 5688275.14 00:08:36.202 00:08:36.202 real 0m8.296s 00:08:36.202 user 0m15.762s 00:08:36.202 sys 0m0.344s 00:08:36.202 15:47:41 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:36.202 15:47:41 blockdev_general.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:08:36.202 ************************************ 00:08:36.202 END TEST 
bdev_verify_big_io 00:08:36.202 ************************************ 00:08:36.202 15:47:41 blockdev_general -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:36.202 15:47:41 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:08:36.202 15:47:41 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:36.202 15:47:41 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:36.202 ************************************ 00:08:36.202 START TEST bdev_write_zeroes 00:08:36.202 ************************************ 00:08:36.202 15:47:41 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:36.202 [2024-06-10 15:47:41.153831] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:08:36.202 [2024-06-10 15:47:41.153885] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2619962 ] 00:08:36.202 [2024-06-10 15:47:41.253526] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:36.202 [2024-06-10 15:47:41.345459] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:36.203 [2024-06-10 15:47:41.494132] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:36.203 [2024-06-10 15:47:41.494182] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:08:36.203 [2024-06-10 15:47:41.494194] vbdev_passthru.c: 735:bdev_passthru_create_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:08:36.203 [2024-06-10 15:47:41.502143] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:36.203 [2024-06-10 15:47:41.502168] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:08:36.203 [2024-06-10 15:47:41.510154] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:36.203 [2024-06-10 15:47:41.510176] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:08:36.203 [2024-06-10 15:47:41.581947] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on Malloc3 00:08:36.203 [2024-06-10 15:47:41.582003] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:08:36.203 [2024-06-10 15:47:41.582019] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x15fa150 00:08:36.203 [2024-06-10 15:47:41.582029] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:08:36.203 [2024-06-10 15:47:41.583514] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:08:36.203 
[2024-06-10 15:47:41.583541] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: TestPT 00:08:36.462 Running I/O for 1 seconds... 00:08:37.399 00:08:37.399 Latency(us) 00:08:37.399 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:08:37.399 Job: Malloc0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:37.399 Malloc0 : 1.05 4643.33 18.14 0.00 0.00 27551.56 690.47 46187.28 00:08:37.399 Job: Malloc1p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:37.399 Malloc1p0 : 1.05 4636.15 18.11 0.00 0.00 27542.58 967.44 45188.63 00:08:37.399 Job: Malloc1p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:37.399 Malloc1p1 : 1.05 4629.04 18.08 0.00 0.00 27521.58 959.63 44189.99 00:08:37.399 Job: Malloc2p0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:37.399 Malloc2p0 : 1.05 4621.90 18.05 0.00 0.00 27498.99 959.63 43191.34 00:08:37.399 Job: Malloc2p1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:37.399 Malloc2p1 : 1.05 4614.86 18.03 0.00 0.00 27480.31 963.54 42192.70 00:08:37.399 Job: Malloc2p2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:37.399 Malloc2p2 : 1.06 4607.77 18.00 0.00 0.00 27462.12 955.73 41443.72 00:08:37.399 Job: Malloc2p3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:37.399 Malloc2p3 : 1.06 4600.71 17.97 0.00 0.00 27434.30 959.63 40445.07 00:08:37.399 Job: Malloc2p4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:37.399 Malloc2p4 : 1.06 4593.68 17.94 0.00 0.00 27408.34 955.73 39196.77 00:08:37.399 Job: Malloc2p5 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:37.399 Malloc2p5 : 1.06 4586.70 17.92 0.00 0.00 27384.74 959.63 38198.13 00:08:37.399 Job: Malloc2p6 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:37.399 Malloc2p6 : 1.06 4579.69 17.89 0.00 0.00 27357.78 
959.63 37199.48 00:08:37.399 Job: Malloc2p7 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:37.399 Malloc2p7 : 1.06 4572.76 17.86 0.00 0.00 27337.48 955.73 36200.84 00:08:37.399 Job: TestPT (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:37.400 TestPT : 1.07 4565.82 17.84 0.00 0.00 27312.04 1006.45 35202.19 00:08:37.400 Job: raid0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:37.400 raid0 : 1.07 4557.80 17.80 0.00 0.00 27274.13 1763.23 33454.57 00:08:37.400 Job: concat0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:37.400 concat0 : 1.07 4549.94 17.77 0.00 0.00 27209.89 1755.43 31706.94 00:08:37.400 Job: raid1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:37.400 raid1 : 1.07 4539.93 17.73 0.00 0.00 27141.90 2777.48 28835.84 00:08:37.400 Job: AIO0 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:08:37.400 AIO0 : 1.07 4534.01 17.71 0.00 0.00 27041.75 998.64 28086.86 00:08:37.400 =================================================================================================================== 00:08:37.400 Total : 73434.10 286.85 0.00 0.00 27372.47 690.47 46187.28 00:08:37.966 00:08:37.966 real 0m2.102s 00:08:37.966 user 0m1.775s 00:08:37.966 sys 0m0.280s 00:08:37.966 15:47:43 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:37.966 15:47:43 blockdev_general.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:08:37.966 ************************************ 00:08:37.966 END TEST bdev_write_zeroes 00:08:37.966 ************************************ 00:08:37.966 15:47:43 blockdev_general -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:37.966 15:47:43 
blockdev_general -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:08:37.966 15:47:43 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:37.966 15:47:43 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:37.966 ************************************ 00:08:37.966 START TEST bdev_json_nonenclosed 00:08:37.966 ************************************ 00:08:37.966 15:47:43 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:37.966 [2024-06-10 15:47:43.328433] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:08:37.966 [2024-06-10 15:47:43.328486] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2620340 ] 00:08:37.966 [2024-06-10 15:47:43.427004] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.225 [2024-06-10 15:47:43.517478] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.225 [2024-06-10 15:47:43.517541] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:08:38.225 [2024-06-10 15:47:43.517558] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:08:38.225 [2024-06-10 15:47:43.517567] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:38.225 00:08:38.225 real 0m0.338s 00:08:38.225 user 0m0.228s 00:08:38.225 sys 0m0.108s 00:08:38.225 15:47:43 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:38.225 15:47:43 blockdev_general.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:08:38.225 ************************************ 00:08:38.225 END TEST bdev_json_nonenclosed 00:08:38.225 ************************************ 00:08:38.225 15:47:43 blockdev_general -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:38.225 15:47:43 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:08:38.225 15:47:43 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:38.225 15:47:43 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:38.225 ************************************ 00:08:38.225 START TEST bdev_json_nonarray 00:08:38.225 ************************************ 00:08:38.225 15:47:43 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:08:38.484 [2024-06-10 15:47:43.739083] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:08:38.484 [2024-06-10 15:47:43.739137] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2620363 ] 00:08:38.484 [2024-06-10 15:47:43.836433] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:38.484 [2024-06-10 15:47:43.927343] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:08:38.484 [2024-06-10 15:47:43.927416] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:08:38.484 [2024-06-10 15:47:43.927433] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:08:38.484 [2024-06-10 15:47:43.927442] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:08:38.743 00:08:38.743 real 0m0.339s 00:08:38.743 user 0m0.218s 00:08:38.743 sys 0m0.118s 00:08:38.743 15:47:44 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:38.743 15:47:44 blockdev_general.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:08:38.743 ************************************ 00:08:38.743 END TEST bdev_json_nonarray 00:08:38.743 ************************************ 00:08:38.743 15:47:44 blockdev_general -- bdev/blockdev.sh@787 -- # [[ bdev == bdev ]] 00:08:38.743 15:47:44 blockdev_general -- bdev/blockdev.sh@788 -- # run_test bdev_qos qos_test_suite '' 00:08:38.743 15:47:44 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:08:38.743 15:47:44 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:38.743 15:47:44 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:08:38.743 ************************************ 00:08:38.743 START TEST bdev_qos 00:08:38.743 ************************************ 00:08:38.743 15:47:44 blockdev_general.bdev_qos -- common/autotest_common.sh@1124 
-- # qos_test_suite '' 00:08:38.743 15:47:44 blockdev_general.bdev_qos -- bdev/blockdev.sh@446 -- # QOS_PID=2620596 00:08:38.743 15:47:44 blockdev_general.bdev_qos -- bdev/blockdev.sh@447 -- # echo 'Process qos testing pid: 2620596' 00:08:38.743 Process qos testing pid: 2620596 00:08:38.743 15:47:44 blockdev_general.bdev_qos -- bdev/blockdev.sh@445 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 256 -o 4096 -w randread -t 60 '' 00:08:38.743 15:47:44 blockdev_general.bdev_qos -- bdev/blockdev.sh@448 -- # trap 'cleanup; killprocess $QOS_PID; exit 1' SIGINT SIGTERM EXIT 00:08:38.743 15:47:44 blockdev_general.bdev_qos -- bdev/blockdev.sh@449 -- # waitforlisten 2620596 00:08:38.743 15:47:44 blockdev_general.bdev_qos -- common/autotest_common.sh@830 -- # '[' -z 2620596 ']' 00:08:38.743 15:47:44 blockdev_general.bdev_qos -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:08:38.743 15:47:44 blockdev_general.bdev_qos -- common/autotest_common.sh@835 -- # local max_retries=100 00:08:38.743 15:47:44 blockdev_general.bdev_qos -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:08:38.743 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:08:38.743 15:47:44 blockdev_general.bdev_qos -- common/autotest_common.sh@839 -- # xtrace_disable 00:08:38.743 15:47:44 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:38.743 [2024-06-10 15:47:44.149032] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:08:38.743 [2024-06-10 15:47:44.149104] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2620596 ] 00:08:38.743 [2024-06-10 15:47:44.241449] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:08:39.002 [2024-06-10 15:47:44.334289] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:08:39.938 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:08:39.938 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@863 -- # return 0 00:08:39.938 15:47:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@451 -- # rpc_cmd bdev_malloc_create -b Malloc_0 128 512 00:08:39.938 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:39.938 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:39.938 Malloc_0 00:08:39.938 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:39.938 15:47:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@452 -- # waitforbdev Malloc_0 00:08:39.938 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_name=Malloc_0 00:08:39.938 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:08:39.938 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local i 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- 
common/autotest_common.sh@10 -- # set +x 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Malloc_0 -t 2000 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:39.939 [ 00:08:39.939 { 00:08:39.939 "name": "Malloc_0", 00:08:39.939 "aliases": [ 00:08:39.939 "5bf77978-25e1-4d7e-97a1-0a3e665f907a" 00:08:39.939 ], 00:08:39.939 "product_name": "Malloc disk", 00:08:39.939 "block_size": 512, 00:08:39.939 "num_blocks": 262144, 00:08:39.939 "uuid": "5bf77978-25e1-4d7e-97a1-0a3e665f907a", 00:08:39.939 "assigned_rate_limits": { 00:08:39.939 "rw_ios_per_sec": 0, 00:08:39.939 "rw_mbytes_per_sec": 0, 00:08:39.939 "r_mbytes_per_sec": 0, 00:08:39.939 "w_mbytes_per_sec": 0 00:08:39.939 }, 00:08:39.939 "claimed": false, 00:08:39.939 "zoned": false, 00:08:39.939 "supported_io_types": { 00:08:39.939 "read": true, 00:08:39.939 "write": true, 00:08:39.939 "unmap": true, 00:08:39.939 "write_zeroes": true, 00:08:39.939 "flush": true, 00:08:39.939 "reset": true, 00:08:39.939 "compare": false, 00:08:39.939 "compare_and_write": false, 00:08:39.939 "abort": true, 00:08:39.939 "nvme_admin": false, 00:08:39.939 "nvme_io": false 00:08:39.939 }, 00:08:39.939 "memory_domains": [ 00:08:39.939 { 00:08:39.939 "dma_device_id": "system", 00:08:39.939 "dma_device_type": 1 00:08:39.939 }, 00:08:39.939 { 00:08:39.939 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:08:39.939 "dma_device_type": 2 00:08:39.939 } 00:08:39.939 ], 00:08:39.939 "driver_specific": {} 00:08:39.939 } 00:08:39.939 ] 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # return 0 00:08:39.939 15:47:45 
blockdev_general.bdev_qos -- bdev/blockdev.sh@453 -- # rpc_cmd bdev_null_create Null_1 128 512 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:39.939 Null_1 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@454 -- # waitforbdev Null_1 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@898 -- # local bdev_name=Null_1 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@900 -- # local i 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Null_1 -t 2000 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:39.939 [ 00:08:39.939 { 00:08:39.939 "name": "Null_1", 00:08:39.939 "aliases": [ 00:08:39.939 "5c51640e-143b-4b2a-af1e-83e7d169cbd7" 00:08:39.939 ], 00:08:39.939 "product_name": "Null disk", 00:08:39.939 "block_size": 512, 00:08:39.939 "num_blocks": 
262144, 00:08:39.939 "uuid": "5c51640e-143b-4b2a-af1e-83e7d169cbd7", 00:08:39.939 "assigned_rate_limits": { 00:08:39.939 "rw_ios_per_sec": 0, 00:08:39.939 "rw_mbytes_per_sec": 0, 00:08:39.939 "r_mbytes_per_sec": 0, 00:08:39.939 "w_mbytes_per_sec": 0 00:08:39.939 }, 00:08:39.939 "claimed": false, 00:08:39.939 "zoned": false, 00:08:39.939 "supported_io_types": { 00:08:39.939 "read": true, 00:08:39.939 "write": true, 00:08:39.939 "unmap": false, 00:08:39.939 "write_zeroes": true, 00:08:39.939 "flush": false, 00:08:39.939 "reset": true, 00:08:39.939 "compare": false, 00:08:39.939 "compare_and_write": false, 00:08:39.939 "abort": true, 00:08:39.939 "nvme_admin": false, 00:08:39.939 "nvme_io": false 00:08:39.939 }, 00:08:39.939 "driver_specific": {} 00:08:39.939 } 00:08:39.939 ] 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- common/autotest_common.sh@906 -- # return 0 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@457 -- # qos_function_test 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@410 -- # local qos_lower_iops_limit=1000 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@456 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@411 -- # local qos_lower_bw_limit=2 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@412 -- # local io_result=0 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@413 -- # local iops_limit=0 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@414 -- # local bw_limit=0 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # get_io_result IOPS Malloc_0 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- 
bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:08:39.939 15:47:45 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:08:39.939 Running I/O for 60 seconds... 00:08:45.214 15:47:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 58009.51 232038.03 0.00 0.00 233472.00 0.00 0.00 ' 00:08:45.214 15:47:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:08:45.214 15:47:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:08:45.214 15:47:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@380 -- # iostat_result=58009.51 00:08:45.214 15:47:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 58009 00:08:45.214 15:47:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@416 -- # io_result=58009 00:08:45.214 15:47:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@418 -- # iops_limit=14000 00:08:45.214 15:47:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@419 -- # '[' 14000 -gt 1000 ']' 00:08:45.214 15:47:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@422 -- # rpc_cmd bdev_set_qos_limit --rw_ios_per_sec 14000 Malloc_0 00:08:45.214 15:47:50 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:45.214 15:47:50 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:45.214 15:47:50 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:45.214 15:47:50 blockdev_general.bdev_qos -- bdev/blockdev.sh@423 -- # run_test bdev_qos_iops run_qos_test 14000 IOPS Malloc_0 00:08:45.214 15:47:50 blockdev_general.bdev_qos -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 
']' 00:08:45.214 15:47:50 blockdev_general.bdev_qos -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:45.214 15:47:50 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:45.214 ************************************ 00:08:45.214 START TEST bdev_qos_iops 00:08:45.214 ************************************ 00:08:45.214 15:47:50 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1124 -- # run_qos_test 14000 IOPS Malloc_0 00:08:45.214 15:47:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@389 -- # local qos_limit=14000 00:08:45.214 15:47:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@390 -- # local qos_result=0 00:08:45.214 15:47:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # get_io_result IOPS Malloc_0 00:08:45.214 15:47:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@375 -- # local limit_type=IOPS 00:08:45.214 15:47:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:08:45.214 15:47:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:45.214 15:47:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:45.214 15:47:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:08:45.214 15:47:50 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # tail -1 00:08:50.487 15:47:55 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 14002.19 56008.75 0.00 0.00 56784.00 0.00 0.00 ' 00:08:50.487 15:47:55 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@379 -- # '[' IOPS = IOPS ']' 00:08:50.487 15:47:55 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # awk '{print $2}' 00:08:50.487 15:47:55 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@380 -- # 
iostat_result=14002.19 00:08:50.487 15:47:55 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@385 -- # echo 14002 00:08:50.487 15:47:55 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@392 -- # qos_result=14002 00:08:50.487 15:47:55 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@393 -- # '[' IOPS = BANDWIDTH ']' 00:08:50.487 15:47:55 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@396 -- # lower_limit=12600 00:08:50.487 15:47:55 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@397 -- # upper_limit=15400 00:08:50.487 15:47:55 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 14002 -lt 12600 ']' 00:08:50.487 15:47:55 blockdev_general.bdev_qos.bdev_qos_iops -- bdev/blockdev.sh@400 -- # '[' 14002 -gt 15400 ']' 00:08:50.487 00:08:50.487 real 0m5.240s 00:08:50.487 user 0m0.117s 00:08:50.487 sys 0m0.042s 00:08:50.487 15:47:55 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@1125 -- # xtrace_disable 00:08:50.487 15:47:55 blockdev_general.bdev_qos.bdev_qos_iops -- common/autotest_common.sh@10 -- # set +x 00:08:50.487 ************************************ 00:08:50.487 END TEST bdev_qos_iops 00:08:50.487 ************************************ 00:08:50.487 15:47:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # get_io_result BANDWIDTH Null_1 00:08:50.487 15:47:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:08:50.487 15:47:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:08:50.487 15:47:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:50.487 15:47:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:50.487 15:47:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # grep Null_1 00:08:50.487 15:47:55 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # tail -1 00:08:55.761 
15:48:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 18141.08 72564.32 0.00 0.00 73728.00 0.00 0.00 ' 00:08:55.761 15:48:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:08:55.761 15:48:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:08:55.761 15:48:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:08:55.761 15:48:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@382 -- # iostat_result=73728.00 00:08:55.761 15:48:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@385 -- # echo 73728 00:08:55.761 15:48:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@427 -- # bw_limit=73728 00:08:55.761 15:48:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@428 -- # bw_limit=7 00:08:55.761 15:48:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@429 -- # '[' 7 -lt 2 ']' 00:08:55.761 15:48:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@432 -- # rpc_cmd bdev_set_qos_limit --rw_mbytes_per_sec 7 Null_1 00:08:55.761 15:48:00 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:08:55.761 15:48:00 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:55.761 15:48:00 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:08:55.761 15:48:00 blockdev_general.bdev_qos -- bdev/blockdev.sh@433 -- # run_test bdev_qos_bw run_qos_test 7 BANDWIDTH Null_1 00:08:55.761 15:48:00 blockdev_general.bdev_qos -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:08:55.761 15:48:00 blockdev_general.bdev_qos -- common/autotest_common.sh@1106 -- # xtrace_disable 00:08:55.761 15:48:00 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:08:55.761 ************************************ 00:08:55.761 START TEST bdev_qos_bw 00:08:55.761 ************************************ 00:08:55.761 15:48:00 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1124 -- # run_qos_test 
7 BANDWIDTH Null_1 00:08:55.761 15:48:00 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@389 -- # local qos_limit=7 00:08:55.761 15:48:00 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:08:55.761 15:48:00 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Null_1 00:08:55.761 15:48:00 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:08:55.761 15:48:00 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Null_1 00:08:55.761 15:48:00 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:08:55.761 15:48:00 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:08:55.761 15:48:00 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # grep Null_1 00:08:55.761 15:48:00 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # tail -1 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@378 -- # iostat_result='Null_1 1792.51 7170.05 0.00 0.00 7356.00 0.00 0.00 ' 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@382 -- # iostat_result=7356.00 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@385 -- # echo 7356 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@392 -- # qos_result=7356 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:01.034 
15:48:06 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@394 -- # qos_limit=7168 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@396 -- # lower_limit=6451 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@397 -- # upper_limit=7884 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 7356 -lt 6451 ']' 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_bw -- bdev/blockdev.sh@400 -- # '[' 7356 -gt 7884 ']' 00:09:01.034 00:09:01.034 real 0m5.292s 00:09:01.034 user 0m0.117s 00:09:01.034 sys 0m0.043s 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_bw -- common/autotest_common.sh@10 -- # set +x 00:09:01.034 ************************************ 00:09:01.034 END TEST bdev_qos_bw 00:09:01.034 ************************************ 00:09:01.034 15:48:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@436 -- # rpc_cmd bdev_set_qos_limit --r_mbytes_per_sec 2 Malloc_0 00:09:01.034 15:48:06 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:01.034 15:48:06 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:01.034 15:48:06 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:01.034 15:48:06 blockdev_general.bdev_qos -- bdev/blockdev.sh@437 -- # run_test bdev_qos_ro_bw run_qos_test 2 BANDWIDTH Malloc_0 00:09:01.034 15:48:06 blockdev_general.bdev_qos -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:09:01.034 15:48:06 blockdev_general.bdev_qos -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:01.034 15:48:06 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:01.034 ************************************ 00:09:01.034 START TEST bdev_qos_ro_bw 00:09:01.034 ************************************ 00:09:01.034 15:48:06 
blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1124 -- # run_qos_test 2 BANDWIDTH Malloc_0 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@389 -- # local qos_limit=2 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@390 -- # local qos_result=0 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # get_io_result BANDWIDTH Malloc_0 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@375 -- # local limit_type=BANDWIDTH 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@376 -- # local qos_dev=Malloc_0 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@377 -- # local iostat_result 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/iostat.py -d -i 1 -t 5 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # grep Malloc_0 00:09:01.034 15:48:06 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # tail -1 00:09:06.347 15:48:11 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@378 -- # iostat_result='Malloc_0 512.16 2048.64 0.00 0.00 2056.00 0.00 0.00 ' 00:09:06.347 15:48:11 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@379 -- # '[' BANDWIDTH = IOPS ']' 00:09:06.347 15:48:11 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@381 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:06.347 15:48:11 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # awk '{print $6}' 00:09:06.347 15:48:11 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@382 -- # iostat_result=2056.00 00:09:06.347 15:48:11 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@385 -- # echo 2056 00:09:06.347 15:48:11 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@392 -- # 
qos_result=2056 00:09:06.347 15:48:11 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@393 -- # '[' BANDWIDTH = BANDWIDTH ']' 00:09:06.347 15:48:11 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@394 -- # qos_limit=2048 00:09:06.347 15:48:11 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@396 -- # lower_limit=1843 00:09:06.347 15:48:11 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@397 -- # upper_limit=2252 00:09:06.347 15:48:11 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2056 -lt 1843 ']' 00:09:06.347 15:48:11 blockdev_general.bdev_qos.bdev_qos_ro_bw -- bdev/blockdev.sh@400 -- # '[' 2056 -gt 2252 ']' 00:09:06.347 00:09:06.347 real 0m5.185s 00:09:06.347 user 0m0.122s 00:09:06.347 sys 0m0.037s 00:09:06.347 15:48:11 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:06.347 15:48:11 blockdev_general.bdev_qos.bdev_qos_ro_bw -- common/autotest_common.sh@10 -- # set +x 00:09:06.347 ************************************ 00:09:06.347 END TEST bdev_qos_ro_bw 00:09:06.347 ************************************ 00:09:06.347 15:48:11 blockdev_general.bdev_qos -- bdev/blockdev.sh@459 -- # rpc_cmd bdev_malloc_delete Malloc_0 00:09:06.347 15:48:11 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:06.347 15:48:11 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:06.915 15:48:12 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:06.915 15:48:12 blockdev_general.bdev_qos -- bdev/blockdev.sh@460 -- # rpc_cmd bdev_null_delete Null_1 00:09:06.915 15:48:12 blockdev_general.bdev_qos -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:06.915 15:48:12 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:06.915 00:09:06.915 Latency(us) 00:09:06.915 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:06.915 Job: 
Malloc_0 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:06.915 Malloc_0 : 26.75 19417.03 75.85 0.00 0.00 13058.70 2137.72 503316.48 00:09:06.915 Job: Null_1 (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:06.915 Null_1 : 26.91 18763.35 73.29 0.00 0.00 13602.55 850.41 163777.58 00:09:06.915 =================================================================================================================== 00:09:06.915 Total : 38180.38 149.14 0.00 0.00 13326.81 850.41 503316.48 00:09:06.915 0 00:09:06.915 15:48:12 blockdev_general.bdev_qos -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:06.915 15:48:12 blockdev_general.bdev_qos -- bdev/blockdev.sh@461 -- # killprocess 2620596 00:09:06.915 15:48:12 blockdev_general.bdev_qos -- common/autotest_common.sh@949 -- # '[' -z 2620596 ']' 00:09:06.915 15:48:12 blockdev_general.bdev_qos -- common/autotest_common.sh@953 -- # kill -0 2620596 00:09:06.915 15:48:12 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # uname 00:09:06.915 15:48:12 blockdev_general.bdev_qos -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:06.915 15:48:12 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2620596 00:09:06.915 15:48:12 blockdev_general.bdev_qos -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:09:06.915 15:48:12 blockdev_general.bdev_qos -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:09:06.915 15:48:12 blockdev_general.bdev_qos -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2620596' 00:09:06.915 killing process with pid 2620596 00:09:06.915 15:48:12 blockdev_general.bdev_qos -- common/autotest_common.sh@968 -- # kill 2620596 00:09:06.915 Received shutdown signal, test time was about 26.966694 seconds 00:09:06.915 00:09:06.915 Latency(us) 00:09:06.915 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:06.915 
=================================================================================================================== 00:09:06.915 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:06.915 15:48:12 blockdev_general.bdev_qos -- common/autotest_common.sh@973 -- # wait 2620596 00:09:07.175 15:48:12 blockdev_general.bdev_qos -- bdev/blockdev.sh@462 -- # trap - SIGINT SIGTERM EXIT 00:09:07.175 00:09:07.175 real 0m28.460s 00:09:07.175 user 0m29.307s 00:09:07.175 sys 0m0.700s 00:09:07.175 15:48:12 blockdev_general.bdev_qos -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:07.175 15:48:12 blockdev_general.bdev_qos -- common/autotest_common.sh@10 -- # set +x 00:09:07.175 ************************************ 00:09:07.175 END TEST bdev_qos 00:09:07.175 ************************************ 00:09:07.175 15:48:12 blockdev_general -- bdev/blockdev.sh@789 -- # run_test bdev_qd_sampling qd_sampling_test_suite '' 00:09:07.175 15:48:12 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:09:07.175 15:48:12 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:07.175 15:48:12 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:07.175 ************************************ 00:09:07.175 START TEST bdev_qd_sampling 00:09:07.175 ************************************ 00:09:07.175 15:48:12 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1124 -- # qd_sampling_test_suite '' 00:09:07.175 15:48:12 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@538 -- # QD_DEV=Malloc_QD 00:09:07.175 15:48:12 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@541 -- # QD_PID=2625760 00:09:07.175 15:48:12 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@542 -- # echo 'Process bdev QD sampling period testing pid: 2625760' 00:09:07.175 Process bdev QD sampling period testing pid: 2625760 00:09:07.175 15:48:12 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@540 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 5 -C '' 00:09:07.175 15:48:12 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@543 -- # trap 'cleanup; killprocess $QD_PID; exit 1' SIGINT SIGTERM EXIT 00:09:07.175 15:48:12 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@544 -- # waitforlisten 2625760 00:09:07.175 15:48:12 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@830 -- # '[' -z 2625760 ']' 00:09:07.175 15:48:12 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:07.175 15:48:12 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@835 -- # local max_retries=100 00:09:07.175 15:48:12 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:07.175 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:07.175 15:48:12 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:07.175 15:48:12 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:07.175 [2024-06-10 15:48:12.681126] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:09:07.175 [2024-06-10 15:48:12.681183] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2625760 ] 00:09:07.434 [2024-06-10 15:48:12.780394] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:07.434 [2024-06-10 15:48:12.876840] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:09:07.434 [2024-06-10 15:48:12.876847] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@863 -- # return 0 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@546 -- # rpc_cmd bdev_malloc_create -b Malloc_QD 128 512 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:08.368 Malloc_QD 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@547 -- # waitforbdev Malloc_QD 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@898 -- # local bdev_name=Malloc_QD 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@900 -- # local i 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- 
common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Malloc_QD -t 2000 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:08.368 15:48:13 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:08.368 [ 00:09:08.368 { 00:09:08.368 "name": "Malloc_QD", 00:09:08.368 "aliases": [ 00:09:08.368 "373126b6-87db-4323-a1ae-3e61bf0f6ad8" 00:09:08.368 ], 00:09:08.368 "product_name": "Malloc disk", 00:09:08.368 "block_size": 512, 00:09:08.368 "num_blocks": 262144, 00:09:08.368 "uuid": "373126b6-87db-4323-a1ae-3e61bf0f6ad8", 00:09:08.368 "assigned_rate_limits": { 00:09:08.368 "rw_ios_per_sec": 0, 00:09:08.368 "rw_mbytes_per_sec": 0, 00:09:08.368 "r_mbytes_per_sec": 0, 00:09:08.368 "w_mbytes_per_sec": 0 00:09:08.368 }, 00:09:08.368 "claimed": false, 00:09:08.368 "zoned": false, 00:09:08.368 "supported_io_types": { 00:09:08.368 "read": true, 00:09:08.368 "write": true, 00:09:08.368 "unmap": true, 00:09:08.368 "write_zeroes": true, 00:09:08.368 "flush": true, 00:09:08.368 "reset": true, 00:09:08.368 "compare": false, 00:09:08.368 "compare_and_write": false, 00:09:08.368 "abort": true, 00:09:08.368 "nvme_admin": false, 00:09:08.368 "nvme_io": false 00:09:08.368 }, 00:09:08.368 "memory_domains": [ 00:09:08.368 { 00:09:08.368 "dma_device_id": "system", 00:09:08.368 "dma_device_type": 1 00:09:08.369 }, 00:09:08.369 { 00:09:08.369 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:08.369 "dma_device_type": 2 00:09:08.369 } 00:09:08.369 ], 00:09:08.369 
"driver_specific": {} 00:09:08.369 } 00:09:08.369 ] 00:09:08.369 15:48:13 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:08.369 15:48:13 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@906 -- # return 0 00:09:08.369 15:48:13 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@550 -- # sleep 2 00:09:08.369 15:48:13 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@549 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:08.369 Running I/O for 5 seconds... 00:09:10.271 15:48:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@551 -- # qd_sampling_function_test Malloc_QD 00:09:10.271 15:48:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@519 -- # local bdev_name=Malloc_QD 00:09:10.271 15:48:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@520 -- # local sampling_period=10 00:09:10.271 15:48:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@521 -- # local iostats 00:09:10.271 15:48:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@523 -- # rpc_cmd bdev_set_qd_sampling_period Malloc_QD 10 00:09:10.271 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:10.271 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:10.271 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:10.271 15:48:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # rpc_cmd bdev_get_iostat -b Malloc_QD 00:09:10.271 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:10.271 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:10.271 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:10.271 15:48:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@525 -- # 
iostats='{ 00:09:10.271 "tick_rate": 2100000000, 00:09:10.271 "ticks": 3856026600176502, 00:09:10.271 "bdevs": [ 00:09:10.271 { 00:09:10.271 "name": "Malloc_QD", 00:09:10.271 "bytes_read": 720417280, 00:09:10.271 "num_read_ops": 175876, 00:09:10.271 "bytes_written": 0, 00:09:10.271 "num_write_ops": 0, 00:09:10.271 "bytes_unmapped": 0, 00:09:10.271 "num_unmap_ops": 0, 00:09:10.271 "bytes_copied": 0, 00:09:10.271 "num_copy_ops": 0, 00:09:10.271 "read_latency_ticks": 2046887232398, 00:09:10.271 "max_read_latency_ticks": 13687374, 00:09:10.271 "min_read_latency_ticks": 228714, 00:09:10.271 "write_latency_ticks": 0, 00:09:10.271 "max_write_latency_ticks": 0, 00:09:10.271 "min_write_latency_ticks": 0, 00:09:10.271 "unmap_latency_ticks": 0, 00:09:10.271 "max_unmap_latency_ticks": 0, 00:09:10.271 "min_unmap_latency_ticks": 0, 00:09:10.271 "copy_latency_ticks": 0, 00:09:10.271 "max_copy_latency_ticks": 0, 00:09:10.271 "min_copy_latency_ticks": 0, 00:09:10.271 "io_error": {}, 00:09:10.271 "queue_depth_polling_period": 10, 00:09:10.271 "queue_depth": 512, 00:09:10.271 "io_time": 20, 00:09:10.271 "weighted_io_time": 10240 00:09:10.271 } 00:09:10.271 ] 00:09:10.271 }' 00:09:10.271 15:48:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # jq -r '.bdevs[0].queue_depth_polling_period' 00:09:10.530 15:48:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@527 -- # qd_sampling_period=10 00:09:10.530 15:48:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 == null ']' 00:09:10.530 15:48:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@529 -- # '[' 10 -ne 10 ']' 00:09:10.530 15:48:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@553 -- # rpc_cmd bdev_malloc_delete Malloc_QD 00:09:10.530 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:10.530 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:10.530 00:09:10.530 Latency(us) 00:09:10.530 
Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:10.530 Job: Malloc_QD (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:09:10.530 Malloc_QD : 1.98 45816.40 178.97 0.00 0.00 5572.99 1560.38 5991.86 00:09:10.530 Job: Malloc_QD (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:10.530 Malloc_QD : 1.99 46422.62 181.34 0.00 0.00 5500.93 1006.45 6522.39 00:09:10.530 =================================================================================================================== 00:09:10.530 Total : 92239.03 360.31 0.00 0.00 5536.71 1006.45 6522.39 00:09:10.530 0 00:09:10.530 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:10.530 15:48:15 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@554 -- # killprocess 2625760 00:09:10.530 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@949 -- # '[' -z 2625760 ']' 00:09:10.530 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@953 -- # kill -0 2625760 00:09:10.530 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # uname 00:09:10.530 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:10.530 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2625760 00:09:10.530 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:09:10.530 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:09:10.530 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2625760' 00:09:10.530 killing process with pid 2625760 00:09:10.530 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@968 -- # kill 2625760 00:09:10.530 Received shutdown signal, test time was about 2.055588 
seconds 00:09:10.530 00:09:10.530 Latency(us) 00:09:10.530 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:10.530 =================================================================================================================== 00:09:10.530 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:10.530 15:48:15 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@973 -- # wait 2625760 00:09:10.789 15:48:16 blockdev_general.bdev_qd_sampling -- bdev/blockdev.sh@555 -- # trap - SIGINT SIGTERM EXIT 00:09:10.789 00:09:10.789 real 0m3.448s 00:09:10.789 user 0m6.901s 00:09:10.789 sys 0m0.353s 00:09:10.789 15:48:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:10.789 15:48:16 blockdev_general.bdev_qd_sampling -- common/autotest_common.sh@10 -- # set +x 00:09:10.789 ************************************ 00:09:10.789 END TEST bdev_qd_sampling 00:09:10.789 ************************************ 00:09:10.789 15:48:16 blockdev_general -- bdev/blockdev.sh@790 -- # run_test bdev_error error_test_suite '' 00:09:10.789 15:48:16 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:09:10.789 15:48:16 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:10.789 15:48:16 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:10.789 ************************************ 00:09:10.789 START TEST bdev_error 00:09:10.789 ************************************ 00:09:10.789 15:48:16 blockdev_general.bdev_error -- common/autotest_common.sh@1124 -- # error_test_suite '' 00:09:10.789 15:48:16 blockdev_general.bdev_error -- bdev/blockdev.sh@466 -- # DEV_1=Dev_1 00:09:10.789 15:48:16 blockdev_general.bdev_error -- bdev/blockdev.sh@467 -- # DEV_2=Dev_2 00:09:10.789 15:48:16 blockdev_general.bdev_error -- bdev/blockdev.sh@468 -- # ERR_DEV=EE_Dev_1 00:09:10.789 15:48:16 blockdev_general.bdev_error -- bdev/blockdev.sh@472 -- # ERR_PID=2626452 00:09:10.789 15:48:16 
blockdev_general.bdev_error -- bdev/blockdev.sh@473 -- # echo 'Process error testing pid: 2626452' 00:09:10.789 Process error testing pid: 2626452 00:09:10.789 15:48:16 blockdev_general.bdev_error -- bdev/blockdev.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 -f '' 00:09:10.789 15:48:16 blockdev_general.bdev_error -- bdev/blockdev.sh@474 -- # waitforlisten 2626452 00:09:10.789 15:48:16 blockdev_general.bdev_error -- common/autotest_common.sh@830 -- # '[' -z 2626452 ']' 00:09:10.789 15:48:16 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:10.789 15:48:16 blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local max_retries=100 00:09:10.789 15:48:16 blockdev_general.bdev_error -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:10.789 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:10.789 15:48:16 blockdev_general.bdev_error -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:10.789 15:48:16 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:10.789 [2024-06-10 15:48:16.194904] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:09:10.789 [2024-06-10 15:48:16.194970] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2626452 ] 00:09:10.789 [2024-06-10 15:48:16.288398] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:11.048 [2024-06-10 15:48:16.380616] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@863 -- # return 0 00:09:11.985 15:48:17 blockdev_general.bdev_error -- bdev/blockdev.sh@476 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:11.985 Dev_1 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:11.985 15:48:17 blockdev_general.bdev_error -- bdev/blockdev.sh@477 -- # waitforbdev Dev_1 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_1 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:11.985 15:48:17 
blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:11.985 [ 00:09:11.985 { 00:09:11.985 "name": "Dev_1", 00:09:11.985 "aliases": [ 00:09:11.985 "773ca674-4e2b-4a61-a460-a32952f2f186" 00:09:11.985 ], 00:09:11.985 "product_name": "Malloc disk", 00:09:11.985 "block_size": 512, 00:09:11.985 "num_blocks": 262144, 00:09:11.985 "uuid": "773ca674-4e2b-4a61-a460-a32952f2f186", 00:09:11.985 "assigned_rate_limits": { 00:09:11.985 "rw_ios_per_sec": 0, 00:09:11.985 "rw_mbytes_per_sec": 0, 00:09:11.985 "r_mbytes_per_sec": 0, 00:09:11.985 "w_mbytes_per_sec": 0 00:09:11.985 }, 00:09:11.985 "claimed": false, 00:09:11.985 "zoned": false, 00:09:11.985 "supported_io_types": { 00:09:11.985 "read": true, 00:09:11.985 "write": true, 00:09:11.985 "unmap": true, 00:09:11.985 "write_zeroes": true, 00:09:11.985 "flush": true, 00:09:11.985 "reset": true, 00:09:11.985 "compare": false, 00:09:11.985 "compare_and_write": false, 00:09:11.985 "abort": true, 00:09:11.985 "nvme_admin": false, 00:09:11.985 "nvme_io": false 00:09:11.985 }, 00:09:11.985 "memory_domains": [ 00:09:11.985 { 00:09:11.985 "dma_device_id": "system", 00:09:11.985 "dma_device_type": 1 00:09:11.985 }, 00:09:11.985 { 00:09:11.985 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:11.985 "dma_device_type": 2 00:09:11.985 } 00:09:11.985 ], 00:09:11.985 "driver_specific": {} 00:09:11.985 } 00:09:11.985 ] 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # 
return 0 00:09:11.985 15:48:17 blockdev_general.bdev_error -- bdev/blockdev.sh@478 -- # rpc_cmd bdev_error_create Dev_1 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:11.985 true 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:11.985 15:48:17 blockdev_general.bdev_error -- bdev/blockdev.sh@479 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:11.985 Dev_2 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:11.985 15:48:17 blockdev_general.bdev_error -- bdev/blockdev.sh@480 -- # waitforbdev Dev_2 00:09:11.985 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_2 00:09:11.986 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:09:11.986 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:09:11.986 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:09:11.986 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:09:11.986 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:09:11.986 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:11.986 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:11.986 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:11.986 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # 
rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:09:11.986 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:11.986 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:11.986 [ 00:09:11.986 { 00:09:11.986 "name": "Dev_2", 00:09:11.986 "aliases": [ 00:09:11.986 "8ed4917b-3a94-4a97-acf1-6cb1cf32f332" 00:09:11.986 ], 00:09:11.986 "product_name": "Malloc disk", 00:09:11.986 "block_size": 512, 00:09:11.986 "num_blocks": 262144, 00:09:11.986 "uuid": "8ed4917b-3a94-4a97-acf1-6cb1cf32f332", 00:09:11.986 "assigned_rate_limits": { 00:09:11.986 "rw_ios_per_sec": 0, 00:09:11.986 "rw_mbytes_per_sec": 0, 00:09:11.986 "r_mbytes_per_sec": 0, 00:09:11.986 "w_mbytes_per_sec": 0 00:09:11.986 }, 00:09:11.986 "claimed": false, 00:09:11.986 "zoned": false, 00:09:11.986 "supported_io_types": { 00:09:11.986 "read": true, 00:09:11.986 "write": true, 00:09:11.986 "unmap": true, 00:09:11.986 "write_zeroes": true, 00:09:11.986 "flush": true, 00:09:11.986 "reset": true, 00:09:11.986 "compare": false, 00:09:11.986 "compare_and_write": false, 00:09:11.986 "abort": true, 00:09:11.986 "nvme_admin": false, 00:09:11.986 "nvme_io": false 00:09:11.986 }, 00:09:11.986 "memory_domains": [ 00:09:11.986 { 00:09:11.986 "dma_device_id": "system", 00:09:11.986 "dma_device_type": 1 00:09:11.986 }, 00:09:11.986 { 00:09:11.986 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:11.986 "dma_device_type": 2 00:09:11.986 } 00:09:11.986 ], 00:09:11.986 "driver_specific": {} 00:09:11.986 } 00:09:11.986 ] 00:09:11.986 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:11.986 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # return 0 00:09:11.986 15:48:17 blockdev_general.bdev_error -- bdev/blockdev.sh@481 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:09:11.986 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:11.986 
15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:11.986 15:48:17 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:11.986 15:48:17 blockdev_general.bdev_error -- bdev/blockdev.sh@484 -- # sleep 1 00:09:11.986 15:48:17 blockdev_general.bdev_error -- bdev/blockdev.sh@483 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:09:11.986 Running I/O for 5 seconds... 00:09:12.922 15:48:18 blockdev_general.bdev_error -- bdev/blockdev.sh@487 -- # kill -0 2626452 00:09:12.922 15:48:18 blockdev_general.bdev_error -- bdev/blockdev.sh@488 -- # echo 'Process is existed as continue on error is set. Pid: 2626452' 00:09:12.922 Process is existed as continue on error is set. Pid: 2626452 00:09:12.922 15:48:18 blockdev_general.bdev_error -- bdev/blockdev.sh@495 -- # rpc_cmd bdev_error_delete EE_Dev_1 00:09:12.922 15:48:18 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:12.922 15:48:18 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:12.922 15:48:18 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:12.922 15:48:18 blockdev_general.bdev_error -- bdev/blockdev.sh@496 -- # rpc_cmd bdev_malloc_delete Dev_1 00:09:12.922 15:48:18 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:12.922 15:48:18 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:12.922 15:48:18 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:12.922 15:48:18 blockdev_general.bdev_error -- bdev/blockdev.sh@497 -- # sleep 5 00:09:12.922 Timeout while waiting for response: 00:09:12.922 00:09:12.922 00:09:17.113 00:09:17.113 Latency(us) 00:09:17.113 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:17.113 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 
4096) 00:09:17.113 EE_Dev_1 : 0.89 34692.97 135.52 5.60 0.00 457.25 141.41 717.78 00:09:17.113 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:17.113 Dev_2 : 5.00 75954.96 296.70 0.00 0.00 206.86 76.07 19972.88 00:09:17.113 =================================================================================================================== 00:09:17.113 Total : 110647.92 432.22 5.60 0.00 225.74 76.07 19972.88 00:09:18.049 15:48:23 blockdev_general.bdev_error -- bdev/blockdev.sh@499 -- # killprocess 2626452 00:09:18.049 15:48:23 blockdev_general.bdev_error -- common/autotest_common.sh@949 -- # '[' -z 2626452 ']' 00:09:18.049 15:48:23 blockdev_general.bdev_error -- common/autotest_common.sh@953 -- # kill -0 2626452 00:09:18.049 15:48:23 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # uname 00:09:18.049 15:48:23 blockdev_general.bdev_error -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:18.049 15:48:23 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2626452 00:09:18.049 15:48:23 blockdev_general.bdev_error -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:09:18.049 15:48:23 blockdev_general.bdev_error -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:09:18.049 15:48:23 blockdev_general.bdev_error -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2626452' 00:09:18.049 killing process with pid 2626452 00:09:18.050 15:48:23 blockdev_general.bdev_error -- common/autotest_common.sh@968 -- # kill 2626452 00:09:18.050 Received shutdown signal, test time was about 5.000000 seconds 00:09:18.050 00:09:18.050 Latency(us) 00:09:18.050 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:18.050 =================================================================================================================== 00:09:18.050 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:18.050 15:48:23 
blockdev_general.bdev_error -- common/autotest_common.sh@973 -- # wait 2626452 00:09:18.309 15:48:23 blockdev_general.bdev_error -- bdev/blockdev.sh@503 -- # ERR_PID=2627602 00:09:18.309 15:48:23 blockdev_general.bdev_error -- bdev/blockdev.sh@504 -- # echo 'Process error testing pid: 2627602' 00:09:18.309 Process error testing pid: 2627602 00:09:18.309 15:48:23 blockdev_general.bdev_error -- bdev/blockdev.sh@502 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 16 -o 4096 -w randread -t 5 '' 00:09:18.309 15:48:23 blockdev_general.bdev_error -- bdev/blockdev.sh@505 -- # waitforlisten 2627602 00:09:18.309 15:48:23 blockdev_general.bdev_error -- common/autotest_common.sh@830 -- # '[' -z 2627602 ']' 00:09:18.309 15:48:23 blockdev_general.bdev_error -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:18.309 15:48:23 blockdev_general.bdev_error -- common/autotest_common.sh@835 -- # local max_retries=100 00:09:18.309 15:48:23 blockdev_general.bdev_error -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:18.309 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:18.309 15:48:23 blockdev_general.bdev_error -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:18.309 15:48:23 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:18.309 [2024-06-10 15:48:23.672579] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:09:18.309 [2024-06-10 15:48:23.672638] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2627602 ] 00:09:18.309 [2024-06-10 15:48:23.766306] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:18.568 [2024-06-10 15:48:23.856590] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:09:19.135 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:19.135 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@863 -- # return 0 00:09:19.135 15:48:24 blockdev_general.bdev_error -- bdev/blockdev.sh@507 -- # rpc_cmd bdev_malloc_create -b Dev_1 128 512 00:09:19.135 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:19.135 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:19.395 Dev_1 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:19.395 15:48:24 blockdev_general.bdev_error -- bdev/blockdev.sh@508 -- # waitforbdev Dev_1 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_1 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:19.395 15:48:24 
blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Dev_1 -t 2000 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:19.395 [ 00:09:19.395 { 00:09:19.395 "name": "Dev_1", 00:09:19.395 "aliases": [ 00:09:19.395 "2cfc89e4-0293-449c-885f-7f739495a250" 00:09:19.395 ], 00:09:19.395 "product_name": "Malloc disk", 00:09:19.395 "block_size": 512, 00:09:19.395 "num_blocks": 262144, 00:09:19.395 "uuid": "2cfc89e4-0293-449c-885f-7f739495a250", 00:09:19.395 "assigned_rate_limits": { 00:09:19.395 "rw_ios_per_sec": 0, 00:09:19.395 "rw_mbytes_per_sec": 0, 00:09:19.395 "r_mbytes_per_sec": 0, 00:09:19.395 "w_mbytes_per_sec": 0 00:09:19.395 }, 00:09:19.395 "claimed": false, 00:09:19.395 "zoned": false, 00:09:19.395 "supported_io_types": { 00:09:19.395 "read": true, 00:09:19.395 "write": true, 00:09:19.395 "unmap": true, 00:09:19.395 "write_zeroes": true, 00:09:19.395 "flush": true, 00:09:19.395 "reset": true, 00:09:19.395 "compare": false, 00:09:19.395 "compare_and_write": false, 00:09:19.395 "abort": true, 00:09:19.395 "nvme_admin": false, 00:09:19.395 "nvme_io": false 00:09:19.395 }, 00:09:19.395 "memory_domains": [ 00:09:19.395 { 00:09:19.395 "dma_device_id": "system", 00:09:19.395 "dma_device_type": 1 00:09:19.395 }, 00:09:19.395 { 00:09:19.395 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:19.395 "dma_device_type": 2 00:09:19.395 } 00:09:19.395 ], 00:09:19.395 "driver_specific": {} 00:09:19.395 } 00:09:19.395 ] 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # 
return 0 00:09:19.395 15:48:24 blockdev_general.bdev_error -- bdev/blockdev.sh@509 -- # rpc_cmd bdev_error_create Dev_1 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:19.395 true 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:19.395 15:48:24 blockdev_general.bdev_error -- bdev/blockdev.sh@510 -- # rpc_cmd bdev_malloc_create -b Dev_2 128 512 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:19.395 Dev_2 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:19.395 15:48:24 blockdev_general.bdev_error -- bdev/blockdev.sh@511 -- # waitforbdev Dev_2 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@898 -- # local bdev_name=Dev_2 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@900 -- # local i 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@905 -- # 
rpc_cmd bdev_get_bdevs -b Dev_2 -t 2000 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:19.395 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:19.395 [ 00:09:19.395 { 00:09:19.395 "name": "Dev_2", 00:09:19.395 "aliases": [ 00:09:19.395 "445469cf-8d67-418e-8d7a-c0561c52e57d" 00:09:19.395 ], 00:09:19.395 "product_name": "Malloc disk", 00:09:19.395 "block_size": 512, 00:09:19.395 "num_blocks": 262144, 00:09:19.395 "uuid": "445469cf-8d67-418e-8d7a-c0561c52e57d", 00:09:19.395 "assigned_rate_limits": { 00:09:19.395 "rw_ios_per_sec": 0, 00:09:19.395 "rw_mbytes_per_sec": 0, 00:09:19.395 "r_mbytes_per_sec": 0, 00:09:19.395 "w_mbytes_per_sec": 0 00:09:19.395 }, 00:09:19.395 "claimed": false, 00:09:19.395 "zoned": false, 00:09:19.395 "supported_io_types": { 00:09:19.395 "read": true, 00:09:19.395 "write": true, 00:09:19.395 "unmap": true, 00:09:19.395 "write_zeroes": true, 00:09:19.395 "flush": true, 00:09:19.395 "reset": true, 00:09:19.395 "compare": false, 00:09:19.395 "compare_and_write": false, 00:09:19.395 "abort": true, 00:09:19.395 "nvme_admin": false, 00:09:19.395 "nvme_io": false 00:09:19.395 }, 00:09:19.396 "memory_domains": [ 00:09:19.396 { 00:09:19.396 "dma_device_id": "system", 00:09:19.396 "dma_device_type": 1 00:09:19.396 }, 00:09:19.396 { 00:09:19.396 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:19.396 "dma_device_type": 2 00:09:19.396 } 00:09:19.396 ], 00:09:19.396 "driver_specific": {} 00:09:19.396 } 00:09:19.396 ] 00:09:19.396 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:19.396 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@906 -- # return 0 00:09:19.396 15:48:24 blockdev_general.bdev_error -- bdev/blockdev.sh@512 -- # rpc_cmd bdev_error_inject_error EE_Dev_1 all failure -n 5 00:09:19.396 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:19.396 
15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:19.396 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:19.396 15:48:24 blockdev_general.bdev_error -- bdev/blockdev.sh@515 -- # NOT wait 2627602 00:09:19.396 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@649 -- # local es=0 00:09:19.396 15:48:24 blockdev_general.bdev_error -- bdev/blockdev.sh@514 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -t 1 perform_tests 00:09:19.396 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@651 -- # valid_exec_arg wait 2627602 00:09:19.396 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@637 -- # local arg=wait 00:09:19.396 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:09:19.396 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@641 -- # type -t wait 00:09:19.396 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:09:19.396 15:48:24 blockdev_general.bdev_error -- common/autotest_common.sh@652 -- # wait 2627602 00:09:19.396 Running I/O for 5 seconds... 
00:09:19.396 task offset: 215464 on job bdev=EE_Dev_1 fails 00:09:19.396 00:09:19.396 Latency(us) 00:09:19.396 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:19.396 Job: EE_Dev_1 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:19.396 Job: EE_Dev_1 ended in about 0.00 seconds with error 00:09:19.396 EE_Dev_1 : 0.00 27363.18 106.89 6218.91 0.00 398.20 140.43 706.07 00:09:19.396 Job: Dev_2 (Core Mask 0x2, workload: randread, depth: 16, IO size: 4096) 00:09:19.396 Dev_2 : 0.00 16806.72 65.65 0.00 0.00 711.04 134.58 1326.32 00:09:19.396 =================================================================================================================== 00:09:19.396 Total : 44169.91 172.54 6218.91 0.00 567.88 134.58 1326.32 00:09:19.396 [2024-06-10 15:48:24.882015] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:09:19.396 request: 00:09:19.396 { 00:09:19.396 "method": "perform_tests", 00:09:19.396 "req_id": 1 00:09:19.396 } 00:09:19.396 Got JSON-RPC error response 00:09:19.396 response: 00:09:19.396 { 00:09:19.396 "code": -32603, 00:09:19.396 "message": "bdevperf failed with error Operation not permitted" 00:09:19.396 } 00:09:19.655 15:48:25 blockdev_general.bdev_error -- common/autotest_common.sh@652 -- # es=255 00:09:19.655 15:48:25 blockdev_general.bdev_error -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:09:19.655 15:48:25 blockdev_general.bdev_error -- common/autotest_common.sh@661 -- # es=127 00:09:19.655 15:48:25 blockdev_general.bdev_error -- common/autotest_common.sh@662 -- # case "$es" in 00:09:19.655 15:48:25 blockdev_general.bdev_error -- common/autotest_common.sh@669 -- # es=1 00:09:19.655 15:48:25 blockdev_general.bdev_error -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:09:19.655 00:09:19.655 real 0m9.007s 00:09:19.655 user 0m9.605s 00:09:19.655 sys 0m0.693s 00:09:19.655 15:48:25 blockdev_general.bdev_error -- common/autotest_common.sh@1125 -- # xtrace_disable 
00:09:19.655 15:48:25 blockdev_general.bdev_error -- common/autotest_common.sh@10 -- # set +x 00:09:19.655 ************************************ 00:09:19.655 END TEST bdev_error 00:09:19.655 ************************************ 00:09:19.914 15:48:25 blockdev_general -- bdev/blockdev.sh@791 -- # run_test bdev_stat stat_test_suite '' 00:09:19.914 15:48:25 blockdev_general -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:09:19.914 15:48:25 blockdev_general -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:19.914 15:48:25 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:19.914 ************************************ 00:09:19.914 START TEST bdev_stat 00:09:19.914 ************************************ 00:09:19.914 15:48:25 blockdev_general.bdev_stat -- common/autotest_common.sh@1124 -- # stat_test_suite '' 00:09:19.914 15:48:25 blockdev_general.bdev_stat -- bdev/blockdev.sh@592 -- # STAT_DEV=Malloc_STAT 00:09:19.914 15:48:25 blockdev_general.bdev_stat -- bdev/blockdev.sh@596 -- # STAT_PID=2627942 00:09:19.914 15:48:25 blockdev_general.bdev_stat -- bdev/blockdev.sh@597 -- # echo 'Process Bdev IO statistics testing pid: 2627942' 00:09:19.914 Process Bdev IO statistics testing pid: 2627942 00:09:19.914 15:48:25 blockdev_general.bdev_stat -- bdev/blockdev.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x3 -q 256 -o 4096 -w randread -t 10 -C '' 00:09:19.914 15:48:25 blockdev_general.bdev_stat -- bdev/blockdev.sh@598 -- # trap 'cleanup; killprocess $STAT_PID; exit 1' SIGINT SIGTERM EXIT 00:09:19.914 15:48:25 blockdev_general.bdev_stat -- bdev/blockdev.sh@599 -- # waitforlisten 2627942 00:09:19.914 15:48:25 blockdev_general.bdev_stat -- common/autotest_common.sh@830 -- # '[' -z 2627942 ']' 00:09:19.914 15:48:25 blockdev_general.bdev_stat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:09:19.914 15:48:25 blockdev_general.bdev_stat -- common/autotest_common.sh@835 -- # local 
max_retries=100 00:09:19.914 15:48:25 blockdev_general.bdev_stat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:09:19.914 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:09:19.914 15:48:25 blockdev_general.bdev_stat -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:19.914 15:48:25 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:19.914 [2024-06-10 15:48:25.274249] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:09:19.914 [2024-06-10 15:48:25.274303] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2627942 ] 00:09:19.914 [2024-06-10 15:48:25.373727] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:09:20.173 [2024-06-10 15:48:25.469974] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:09:20.173 [2024-06-10 15:48:25.469983] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:09:20.741 15:48:26 blockdev_general.bdev_stat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:20.741 15:48:26 blockdev_general.bdev_stat -- common/autotest_common.sh@863 -- # return 0 00:09:20.741 15:48:26 blockdev_general.bdev_stat -- bdev/blockdev.sh@601 -- # rpc_cmd bdev_malloc_create -b Malloc_STAT 128 512 00:09:20.741 15:48:26 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:20.741 15:48:26 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:21.000 Malloc_STAT 00:09:21.000 15:48:26 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:21.000 15:48:26 blockdev_general.bdev_stat -- bdev/blockdev.sh@602 -- # waitforbdev Malloc_STAT 00:09:21.000 15:48:26 blockdev_general.bdev_stat -- 
common/autotest_common.sh@898 -- # local bdev_name=Malloc_STAT 00:09:21.000 15:48:26 blockdev_general.bdev_stat -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:09:21.000 15:48:26 blockdev_general.bdev_stat -- common/autotest_common.sh@900 -- # local i 00:09:21.000 15:48:26 blockdev_general.bdev_stat -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:09:21.000 15:48:26 blockdev_general.bdev_stat -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:09:21.000 15:48:26 blockdev_general.bdev_stat -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:09:21.000 15:48:26 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:21.000 15:48:26 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:21.000 15:48:26 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:21.000 15:48:26 blockdev_general.bdev_stat -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b Malloc_STAT -t 2000 00:09:21.000 15:48:26 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:21.000 15:48:26 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:21.000 [ 00:09:21.000 { 00:09:21.000 "name": "Malloc_STAT", 00:09:21.000 "aliases": [ 00:09:21.000 "7814798a-5e5f-4dbb-99c9-f2a7fd950459" 00:09:21.000 ], 00:09:21.000 "product_name": "Malloc disk", 00:09:21.000 "block_size": 512, 00:09:21.000 "num_blocks": 262144, 00:09:21.000 "uuid": "7814798a-5e5f-4dbb-99c9-f2a7fd950459", 00:09:21.000 "assigned_rate_limits": { 00:09:21.000 "rw_ios_per_sec": 0, 00:09:21.000 "rw_mbytes_per_sec": 0, 00:09:21.000 "r_mbytes_per_sec": 0, 00:09:21.000 "w_mbytes_per_sec": 0 00:09:21.000 }, 00:09:21.000 "claimed": false, 00:09:21.000 "zoned": false, 00:09:21.000 "supported_io_types": { 00:09:21.000 "read": true, 00:09:21.000 "write": true, 00:09:21.000 "unmap": true, 00:09:21.000 "write_zeroes": true, 00:09:21.000 "flush": true, 00:09:21.000 
"reset": true, 00:09:21.000 "compare": false, 00:09:21.000 "compare_and_write": false, 00:09:21.000 "abort": true, 00:09:21.000 "nvme_admin": false, 00:09:21.000 "nvme_io": false 00:09:21.000 }, 00:09:21.000 "memory_domains": [ 00:09:21.000 { 00:09:21.000 "dma_device_id": "system", 00:09:21.000 "dma_device_type": 1 00:09:21.000 }, 00:09:21.000 { 00:09:21.000 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:21.000 "dma_device_type": 2 00:09:21.000 } 00:09:21.000 ], 00:09:21.000 "driver_specific": {} 00:09:21.000 } 00:09:21.000 ] 00:09:21.000 15:48:26 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:21.000 15:48:26 blockdev_general.bdev_stat -- common/autotest_common.sh@906 -- # return 0 00:09:21.000 15:48:26 blockdev_general.bdev_stat -- bdev/blockdev.sh@605 -- # sleep 2 00:09:21.000 15:48:26 blockdev_general.bdev_stat -- bdev/blockdev.sh@604 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:09:21.000 Running I/O for 10 seconds... 
00:09:22.905 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@606 -- # stat_function_test Malloc_STAT 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@559 -- # local bdev_name=Malloc_STAT 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@560 -- # local iostats 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@561 -- # local io_count1 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@562 -- # local io_count2 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@563 -- # local iostats_per_channel 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@564 -- # local io_count_per_channel1 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@565 -- # local io_count_per_channel2 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@566 -- # local io_count_per_channel_all=0 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@568 -- # iostats='{ 00:09:22.905 "tick_rate": 2100000000, 00:09:22.905 "ticks": 3856052959026512, 00:09:22.905 "bdevs": [ 00:09:22.905 { 00:09:22.905 "name": "Malloc_STAT", 00:09:22.905 "bytes_read": 712028672, 00:09:22.905 "num_read_ops": 173828, 00:09:22.905 "bytes_written": 0, 00:09:22.905 "num_write_ops": 0, 00:09:22.905 "bytes_unmapped": 0, 00:09:22.905 "num_unmap_ops": 0, 00:09:22.905 "bytes_copied": 0, 00:09:22.905 "num_copy_ops": 0, 00:09:22.905 "read_latency_ticks": 2032924586164, 00:09:22.905 "max_read_latency_ticks": 13623692, 00:09:22.905 "min_read_latency_ticks": 229912, 
00:09:22.905 "write_latency_ticks": 0, 00:09:22.905 "max_write_latency_ticks": 0, 00:09:22.905 "min_write_latency_ticks": 0, 00:09:22.905 "unmap_latency_ticks": 0, 00:09:22.905 "max_unmap_latency_ticks": 0, 00:09:22.905 "min_unmap_latency_ticks": 0, 00:09:22.905 "copy_latency_ticks": 0, 00:09:22.905 "max_copy_latency_ticks": 0, 00:09:22.905 "min_copy_latency_ticks": 0, 00:09:22.905 "io_error": {} 00:09:22.905 } 00:09:22.905 ] 00:09:22.905 }' 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # jq -r '.bdevs[0].num_read_ops' 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@569 -- # io_count1=173828 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT -c 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@571 -- # iostats_per_channel='{ 00:09:22.905 "tick_rate": 2100000000, 00:09:22.905 "ticks": 3856053102064808, 00:09:22.905 "name": "Malloc_STAT", 00:09:22.905 "channels": [ 00:09:22.905 { 00:09:22.905 "thread_id": 2, 00:09:22.905 "bytes_read": 365953024, 00:09:22.905 "num_read_ops": 89344, 00:09:22.905 "bytes_written": 0, 00:09:22.905 "num_write_ops": 0, 00:09:22.905 "bytes_unmapped": 0, 00:09:22.905 "num_unmap_ops": 0, 00:09:22.905 "bytes_copied": 0, 00:09:22.905 "num_copy_ops": 0, 00:09:22.905 "read_latency_ticks": 1051713819430, 00:09:22.905 "max_read_latency_ticks": 12857762, 00:09:22.905 "min_read_latency_ticks": 7441148, 00:09:22.905 "write_latency_ticks": 0, 00:09:22.905 "max_write_latency_ticks": 0, 00:09:22.905 "min_write_latency_ticks": 0, 00:09:22.905 "unmap_latency_ticks": 0, 00:09:22.905 "max_unmap_latency_ticks": 0, 00:09:22.905 
"min_unmap_latency_ticks": 0, 00:09:22.905 "copy_latency_ticks": 0, 00:09:22.905 "max_copy_latency_ticks": 0, 00:09:22.905 "min_copy_latency_ticks": 0 00:09:22.905 }, 00:09:22.905 { 00:09:22.905 "thread_id": 3, 00:09:22.905 "bytes_read": 371195904, 00:09:22.905 "num_read_ops": 90624, 00:09:22.905 "bytes_written": 0, 00:09:22.905 "num_write_ops": 0, 00:09:22.905 "bytes_unmapped": 0, 00:09:22.905 "num_unmap_ops": 0, 00:09:22.905 "bytes_copied": 0, 00:09:22.905 "num_copy_ops": 0, 00:09:22.905 "read_latency_ticks": 1053321580670, 00:09:22.905 "max_read_latency_ticks": 13623692, 00:09:22.905 "min_read_latency_ticks": 7468336, 00:09:22.905 "write_latency_ticks": 0, 00:09:22.905 "max_write_latency_ticks": 0, 00:09:22.905 "min_write_latency_ticks": 0, 00:09:22.905 "unmap_latency_ticks": 0, 00:09:22.905 "max_unmap_latency_ticks": 0, 00:09:22.905 "min_unmap_latency_ticks": 0, 00:09:22.905 "copy_latency_ticks": 0, 00:09:22.905 "max_copy_latency_ticks": 0, 00:09:22.905 "min_copy_latency_ticks": 0 00:09:22.905 } 00:09:22.905 ] 00:09:22.905 }' 00:09:22.905 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # jq -r '.channels[0].num_read_ops' 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@572 -- # io_count_per_channel1=89344 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@573 -- # io_count_per_channel_all=89344 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # jq -r '.channels[1].num_read_ops' 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@574 -- # io_count_per_channel2=90624 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@575 -- # io_count_per_channel_all=179968 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # rpc_cmd bdev_get_iostat -b Malloc_STAT 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- 
common/autotest_common.sh@10 -- # set +x 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@577 -- # iostats='{ 00:09:23.165 "tick_rate": 2100000000, 00:09:23.165 "ticks": 3856053339376496, 00:09:23.165 "bdevs": [ 00:09:23.165 { 00:09:23.165 "name": "Malloc_STAT", 00:09:23.165 "bytes_read": 779137536, 00:09:23.165 "num_read_ops": 190212, 00:09:23.165 "bytes_written": 0, 00:09:23.165 "num_write_ops": 0, 00:09:23.165 "bytes_unmapped": 0, 00:09:23.165 "num_unmap_ops": 0, 00:09:23.165 "bytes_copied": 0, 00:09:23.165 "num_copy_ops": 0, 00:09:23.165 "read_latency_ticks": 2225165796306, 00:09:23.165 "max_read_latency_ticks": 13623692, 00:09:23.165 "min_read_latency_ticks": 229912, 00:09:23.165 "write_latency_ticks": 0, 00:09:23.165 "max_write_latency_ticks": 0, 00:09:23.165 "min_write_latency_ticks": 0, 00:09:23.165 "unmap_latency_ticks": 0, 00:09:23.165 "max_unmap_latency_ticks": 0, 00:09:23.165 "min_unmap_latency_ticks": 0, 00:09:23.165 "copy_latency_ticks": 0, 00:09:23.165 "max_copy_latency_ticks": 0, 00:09:23.165 "min_copy_latency_ticks": 0, 00:09:23.165 "io_error": {} 00:09:23.165 } 00:09:23.165 ] 00:09:23.165 }' 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # jq -r '.bdevs[0].num_read_ops' 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@578 -- # io_count2=190212 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 179968 -lt 173828 ']' 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@583 -- # '[' 179968 -gt 190212 ']' 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@608 -- # rpc_cmd bdev_malloc_delete Malloc_STAT 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@560 -- # xtrace_disable 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:23.165 00:09:23.165 
Latency(us) 00:09:23.165 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:23.165 Job: Malloc_STAT (Core Mask 0x1, workload: randread, depth: 256, IO size: 4096) 00:09:23.165 Malloc_STAT : 2.15 45580.10 178.05 0.00 0.00 5602.63 1505.77 6147.90 00:09:23.165 Job: Malloc_STAT (Core Mask 0x2, workload: randread, depth: 256, IO size: 4096) 00:09:23.165 Malloc_STAT : 2.15 46240.22 180.63 0.00 0.00 5523.26 1045.46 6491.18 00:09:23.165 =================================================================================================================== 00:09:23.165 Total : 91820.32 358.67 0.00 0.00 5562.64 1045.46 6491.18 00:09:23.165 0 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@609 -- # killprocess 2627942 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@949 -- # '[' -z 2627942 ']' 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@953 -- # kill -0 2627942 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # uname 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2627942 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2627942' 00:09:23.165 killing process with pid 2627942 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@968 -- # kill 2627942 00:09:23.165 Received shutdown signal, test time was about 2.223730 seconds 00:09:23.165 00:09:23.165 Latency(us) 
00:09:23.165 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:09:23.165 =================================================================================================================== 00:09:23.165 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:09:23.165 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@973 -- # wait 2627942 00:09:23.424 15:48:28 blockdev_general.bdev_stat -- bdev/blockdev.sh@610 -- # trap - SIGINT SIGTERM EXIT 00:09:23.424 00:09:23.424 real 0m3.622s 00:09:23.424 user 0m7.394s 00:09:23.424 sys 0m0.365s 00:09:23.424 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:23.424 15:48:28 blockdev_general.bdev_stat -- common/autotest_common.sh@10 -- # set +x 00:09:23.424 ************************************ 00:09:23.424 END TEST bdev_stat 00:09:23.424 ************************************ 00:09:23.424 15:48:28 blockdev_general -- bdev/blockdev.sh@794 -- # [[ bdev == gpt ]] 00:09:23.424 15:48:28 blockdev_general -- bdev/blockdev.sh@798 -- # [[ bdev == crypto_sw ]] 00:09:23.424 15:48:28 blockdev_general -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:09:23.424 15:48:28 blockdev_general -- bdev/blockdev.sh@811 -- # cleanup 00:09:23.424 15:48:28 blockdev_general -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:09:23.424 15:48:28 blockdev_general -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:09:23.424 15:48:28 blockdev_general -- bdev/blockdev.sh@26 -- # [[ bdev == rbd ]] 00:09:23.424 15:48:28 blockdev_general -- bdev/blockdev.sh@30 -- # [[ bdev == daos ]] 00:09:23.424 15:48:28 blockdev_general -- bdev/blockdev.sh@34 -- # [[ bdev = \g\p\t ]] 00:09:23.424 15:48:28 blockdev_general -- bdev/blockdev.sh@40 -- # [[ bdev == xnvme ]] 00:09:23.424 00:09:23.424 real 1m54.097s 00:09:23.424 user 7m22.297s 00:09:23.424 sys 0m17.914s 00:09:23.424 15:48:28 
blockdev_general -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:23.424 15:48:28 blockdev_general -- common/autotest_common.sh@10 -- # set +x 00:09:23.424 ************************************ 00:09:23.424 END TEST blockdev_general 00:09:23.424 ************************************ 00:09:23.424 15:48:28 -- spdk/autotest.sh@190 -- # run_test bdev_raid /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:09:23.424 15:48:28 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:09:23.424 15:48:28 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:23.424 15:48:28 -- common/autotest_common.sh@10 -- # set +x 00:09:23.683 ************************************ 00:09:23.683 START TEST bdev_raid 00:09:23.683 ************************************ 00:09:23.683 15:48:28 bdev_raid -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh 00:09:23.683 * Looking for test storage... 00:09:23.683 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:09:23.683 15:48:29 bdev_raid -- bdev/bdev_raid.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:09:23.683 15:48:29 bdev_raid -- bdev/nbd_common.sh@6 -- # set -e 00:09:23.683 15:48:29 bdev_raid -- bdev/bdev_raid.sh@15 -- # rpc_py='/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock' 00:09:23.683 15:48:29 bdev_raid -- bdev/bdev_raid.sh@851 -- # mkdir -p /raidtest 00:09:23.683 15:48:29 bdev_raid -- bdev/bdev_raid.sh@852 -- # trap 'cleanup; exit 1' EXIT 00:09:23.683 15:48:29 bdev_raid -- bdev/bdev_raid.sh@854 -- # base_blocklen=512 00:09:23.683 15:48:29 bdev_raid -- bdev/bdev_raid.sh@856 -- # uname -s 00:09:23.683 15:48:29 bdev_raid -- bdev/bdev_raid.sh@856 -- # '[' Linux = Linux ']' 00:09:23.683 15:48:29 bdev_raid -- bdev/bdev_raid.sh@856 -- # modprobe -n nbd 00:09:23.683 15:48:29 bdev_raid -- bdev/bdev_raid.sh@857 -- # has_nbd=true 
00:09:23.683 15:48:29 bdev_raid -- bdev/bdev_raid.sh@858 -- # modprobe nbd 00:09:23.683 15:48:29 bdev_raid -- bdev/bdev_raid.sh@859 -- # run_test raid_function_test_raid0 raid_function_test raid0 00:09:23.683 15:48:29 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:09:23.683 15:48:29 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:23.683 15:48:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:23.683 ************************************ 00:09:23.683 START TEST raid_function_test_raid0 00:09:23.683 ************************************ 00:09:23.683 15:48:29 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1124 -- # raid_function_test raid0 00:09:23.683 15:48:29 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@80 -- # local raid_level=raid0 00:09:23.683 15:48:29 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:09:23.683 15:48:29 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:09:23.683 15:48:29 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@85 -- # raid_pid=2628621 00:09:23.683 15:48:29 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 2628621' 00:09:23.683 Process raid pid: 2628621 00:09:23.683 15:48:29 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:23.683 15:48:29 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@87 -- # waitforlisten 2628621 /var/tmp/spdk-raid.sock 00:09:23.683 15:48:29 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@830 -- # '[' -z 2628621 ']' 00:09:23.683 15:48:29 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:23.683 15:48:29 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@835 -- # local max_retries=100 
00:09:23.683 15:48:29 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:23.683 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:23.683 15:48:29 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:23.683 15:48:29 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:09:23.683 [2024-06-10 15:48:29.158404] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:09:23.683 [2024-06-10 15:48:29.158458] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:23.942 [2024-06-10 15:48:29.251181] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:23.942 [2024-06-10 15:48:29.343596] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:09:23.942 [2024-06-10 15:48:29.414433] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:23.942 [2024-06-10 15:48:29.414459] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:24.880 15:48:30 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:24.880 15:48:30 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@863 -- # return 0 00:09:24.880 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev raid0 00:09:24.880 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@66 -- # local raid_level=raid0 00:09:24.880 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:24.880 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@69 
-- # cat 00:09:24.880 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:09:24.880 [2024-06-10 15:48:30.383752] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:09:24.880 [2024-06-10 15:48:30.385241] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:09:24.880 [2024-06-10 15:48:30.385301] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1f32340 00:09:24.881 [2024-06-10 15:48:30.385310] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:24.881 [2024-06-10 15:48:30.385511] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f48dc0 00:09:24.881 [2024-06-10 15:48:30.385634] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1f32340 00:09:24.881 [2024-06-10 15:48:30.385642] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x1f32340 00:09:24.881 [2024-06-10 15:48:30.385748] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:24.881 Base_1 00:09:24.881 Base_2 00:09:25.139 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:25.139 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:09:25.139 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:09:25.398 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:09:25.398 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:09:25.398 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@96 -- # 
nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:09:25.398 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:25.398 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:09:25.398 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:25.398 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:09:25.398 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@11 -- # local nbd_list 00:09:25.398 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@12 -- # local i 00:09:25.398 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:25.398 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:25.398 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:09:25.398 [2024-06-10 15:48:30.905157] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1f29fe0 00:09:25.657 /dev/nbd0 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@868 -- # local i 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@871 -- # grep -q -w nbd0 
/proc/partitions 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@872 -- # break 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:25.657 1+0 records in 00:09:25.657 1+0 records out 00:09:25.657 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000259601 s, 15.8 MB/s 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@885 -- # size=4096 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@888 -- # return 0 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:25.657 15:48:30 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:25.917 15:48:31 
bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:25.917 { 00:09:25.917 "nbd_device": "/dev/nbd0", 00:09:25.917 "bdev_name": "raid" 00:09:25.917 } 00:09:25.917 ]' 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:25.917 { 00:09:25.917 "nbd_device": "/dev/nbd0", 00:09:25.917 "bdev_name": "raid" 00:09:25.917 } 00:09:25.917 ]' 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=1 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 1 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@97 -- # count=1 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@21 -- # local blksize 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- 
bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@22 -- # blksize=512 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # unmap_blk_offs=('0' '1028' '321') 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:09:25.917 4096+0 records in 00:09:25.917 4096+0 records out 00:09:25.917 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0231385 s, 90.6 MB/s 00:09:25.917 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:09:26.176 4096+0 records in 00:09:26.176 4096+0 records out 00:09:26.176 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.245567 s, 8.5 MB/s 00:09:26.176 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:09:26.176 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:26.176 15:48:31 
bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:09:26.176 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:26.176 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:09:26.176 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:09:26.176 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:09:26.176 128+0 records in 00:09:26.176 128+0 records out 00:09:26.176 65536 bytes (66 kB, 64 KiB) copied, 0.00035751 s, 183 MB/s 00:09:26.176 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:09:26.176 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:26.176 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:26.176 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:26.176 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:26.176 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:09:26.176 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:09:26.176 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:09:26.176 2035+0 records in 00:09:26.176 2035+0 records out 00:09:26.176 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.0049428 s, 211 MB/s 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs 
/dev/nbd0 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@40 -- # unmap_len=233472 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:09:26.177 456+0 records in 00:09:26.177 456+0 records out 00:09:26.177 233472 bytes (233 kB, 228 KiB) copied, 0.00104012 s, 224 MB/s 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@54 -- # return 0 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@50 -- # local nbd_list 
00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@51 -- # local i 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:26.177 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:09:26.436 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:09:26.436 [2024-06-10 15:48:31.892636] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:26.436 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:26.436 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:26.436 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:26.436 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:26.436 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:26.436 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@41 -- # break 00:09:26.436 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@45 -- # return 0 00:09:26.436 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:26.436 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:26.436 15:48:31 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:26.727 15:48:32 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:26.727 15:48:32 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # echo '[]' 
00:09:26.727 15:48:32 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:26.727 15:48:32 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:26.727 15:48:32 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # echo '' 00:09:26.727 15:48:32 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:26.727 15:48:32 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # true 00:09:26.727 15:48:32 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@65 -- # count=0 00:09:26.727 15:48:32 bdev_raid.raid_function_test_raid0 -- bdev/nbd_common.sh@66 -- # echo 0 00:09:26.727 15:48:32 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@105 -- # count=0 00:09:26.727 15:48:32 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:09:26.727 15:48:32 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@110 -- # killprocess 2628621 00:09:26.727 15:48:32 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@949 -- # '[' -z 2628621 ']' 00:09:26.727 15:48:32 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@953 -- # kill -0 2628621 00:09:26.727 15:48:32 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # uname 00:09:26.727 15:48:32 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:26.727 15:48:32 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2628621 00:09:26.985 15:48:32 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:09:26.985 15:48:32 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:09:26.985 15:48:32 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2628621' 00:09:26.985 killing process with pid 2628621 
00:09:26.985 15:48:32 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@968 -- # kill 2628621 00:09:26.985 [2024-06-10 15:48:32.259420] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:26.985 [2024-06-10 15:48:32.259485] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:26.985 [2024-06-10 15:48:32.259529] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:26.985 [2024-06-10 15:48:32.259538] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1f32340 name raid, state offline 00:09:26.985 15:48:32 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@973 -- # wait 2628621 00:09:26.985 [2024-06-10 15:48:32.275759] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:26.985 15:48:32 bdev_raid.raid_function_test_raid0 -- bdev/bdev_raid.sh@112 -- # return 0 00:09:26.985 00:09:26.985 real 0m3.369s 00:09:26.985 user 0m4.775s 00:09:26.985 sys 0m0.946s 00:09:26.985 15:48:32 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:26.985 15:48:32 bdev_raid.raid_function_test_raid0 -- common/autotest_common.sh@10 -- # set +x 00:09:26.985 ************************************ 00:09:26.985 END TEST raid_function_test_raid0 00:09:26.985 ************************************ 00:09:27.243 15:48:32 bdev_raid -- bdev/bdev_raid.sh@860 -- # run_test raid_function_test_concat raid_function_test concat 00:09:27.243 15:48:32 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:09:27.243 15:48:32 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:27.243 15:48:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:27.243 ************************************ 00:09:27.243 START TEST raid_function_test_concat 00:09:27.243 ************************************ 00:09:27.243 15:48:32 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1124 -- 
# raid_function_test concat 00:09:27.243 15:48:32 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@80 -- # local raid_level=concat 00:09:27.243 15:48:32 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@81 -- # local nbd=/dev/nbd0 00:09:27.243 15:48:32 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@82 -- # local raid_bdev 00:09:27.243 15:48:32 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@85 -- # raid_pid=2629381 00:09:27.243 15:48:32 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@86 -- # echo 'Process raid pid: 2629381' 00:09:27.243 Process raid pid: 2629381 00:09:27.243 15:48:32 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@84 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:27.243 15:48:32 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@87 -- # waitforlisten 2629381 /var/tmp/spdk-raid.sock 00:09:27.243 15:48:32 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@830 -- # '[' -z 2629381 ']' 00:09:27.243 15:48:32 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:27.243 15:48:32 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@835 -- # local max_retries=100 00:09:27.243 15:48:32 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:27.243 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:27.243 15:48:32 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:27.243 15:48:32 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:09:27.243 [2024-06-10 15:48:32.596659] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:09:27.243 [2024-06-10 15:48:32.596711] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:27.243 [2024-06-10 15:48:32.695776] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:27.502 [2024-06-10 15:48:32.790463] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:09:27.502 [2024-06-10 15:48:32.845123] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:27.502 [2024-06-10 15:48:32.845151] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:28.070 15:48:33 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:28.070 15:48:33 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@863 -- # return 0 00:09:28.070 15:48:33 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@89 -- # configure_raid_bdev concat 00:09:28.070 15:48:33 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@66 -- # local raid_level=concat 00:09:28.070 15:48:33 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@67 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:28.070 15:48:33 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@69 -- # cat 00:09:28.070 15:48:33 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@74 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 00:09:28.328 [2024-06-10 15:48:33.821840] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:09:28.328 [2024-06-10 15:48:33.823315] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:09:28.328 [2024-06-10 15:48:33.823370] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c61340 00:09:28.328 [2024-06-10 15:48:33.823379] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:28.328 [2024-06-10 15:48:33.823577] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c77dc0 00:09:28.328 [2024-06-10 15:48:33.823698] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c61340 00:09:28.328 [2024-06-10 15:48:33.823707] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid, raid_bdev 0x1c61340 00:09:28.328 [2024-06-10 15:48:33.823811] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:28.328 Base_1 00:09:28.328 Base_2 00:09:28.587 15:48:33 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@76 -- # rm -rf /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/rpcs.txt 00:09:28.587 15:48:33 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # jq -r '.[0]["name"] | select(.)' 00:09:28.587 15:48:33 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:09:28.845 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@90 -- # raid_bdev=raid 00:09:28.845 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@91 -- # '[' raid = '' ']' 00:09:28.845 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@96 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid /dev/nbd0 00:09:28.845 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:28.845 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # bdev_list=('raid') 00:09:28.845 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@10 -- # local bdev_list 00:09:28.845 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:09:28.845 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@11 -- # 
local nbd_list 00:09:28.845 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@12 -- # local i 00:09:28.845 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:09:28.845 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:28.845 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid /dev/nbd0 00:09:28.845 [2024-06-10 15:48:34.347253] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c58fe0 00:09:29.104 /dev/nbd0 00:09:29.104 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:09:29.104 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:09:29.104 15:48:34 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:09:29.105 15:48:34 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@868 -- # local i 00:09:29.105 15:48:34 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:09:29.105 15:48:34 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:09:29.105 15:48:34 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:09:29.105 15:48:34 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@872 -- # break 00:09:29.105 15:48:34 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:09:29.105 15:48:34 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:09:29.105 15:48:34 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:09:29.105 1+0 records in 
00:09:29.105 1+0 records out 00:09:29.105 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000234723 s, 17.5 MB/s 00:09:29.105 15:48:34 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:29.105 15:48:34 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@885 -- # size=4096 00:09:29.105 15:48:34 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:09:29.105 15:48:34 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:09:29.105 15:48:34 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@888 -- # return 0 00:09:29.105 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:09:29.105 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:09:29.105 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:29.105 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:29.105 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:09:29.364 { 00:09:29.364 "nbd_device": "/dev/nbd0", 00:09:29.364 "bdev_name": "raid" 00:09:29.364 } 00:09:29.364 ]' 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[ 00:09:29.364 { 00:09:29.364 "nbd_device": "/dev/nbd0", 00:09:29.364 "bdev_name": "raid" 00:09:29.364 } 00:09:29.364 ]' 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:29.364 15:48:34 
bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name=/dev/nbd0 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo /dev/nbd0 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=1 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 1 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@97 -- # count=1 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@98 -- # '[' 1 -ne 1 ']' 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@102 -- # raid_unmap_data_verify /dev/nbd0 /var/tmp/spdk-raid.sock 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@18 -- # hash blkdiscard 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@19 -- # local nbd=/dev/nbd0 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@20 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@21 -- # local blksize 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # lsblk -o LOG-SEC /dev/nbd0 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # grep -v LOG-SEC 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # cut -d ' ' -f 5 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@22 -- # blksize=512 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@23 -- # local rw_blk_num=4096 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@24 -- # local rw_len=2097152 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # 
unmap_blk_offs=('0' '1028' '321') 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@25 -- # local unmap_blk_offs 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # unmap_blk_nums=('128' '2035' '456') 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@26 -- # local unmap_blk_nums 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@27 -- # local unmap_off 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@28 -- # local unmap_len 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@31 -- # dd if=/dev/urandom of=/raidtest/raidrandtest bs=512 count=4096 00:09:29.364 4096+0 records in 00:09:29.364 4096+0 records out 00:09:29.364 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.0251128 s, 83.5 MB/s 00:09:29.364 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@32 -- # dd if=/raidtest/raidrandtest of=/dev/nbd0 bs=512 count=4096 oflag=direct 00:09:29.623 4096+0 records in 00:09:29.623 4096+0 records out 00:09:29.623 2097152 bytes (2.1 MB, 2.0 MiB) copied, 0.220368 s, 9.5 MB/s 00:09:29.623 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@33 -- # blockdev --flushbufs /dev/nbd0 00:09:29.623 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@36 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:29.623 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i = 0 )) 00:09:29.623 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:29.623 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=0 00:09:29.623 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=65536 00:09:29.623 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=0 count=128 conv=notrunc 00:09:29.623 
128+0 records in 00:09:29.623 128+0 records out 00:09:29.623 65536 bytes (66 kB, 64 KiB) copied, 0.000367993 s, 178 MB/s 00:09:29.623 15:48:34 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 0 -l 65536 /dev/nbd0 00:09:29.623 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:29.623 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:29.623 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:29.623 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:29.623 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=526336 00:09:29.623 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # unmap_len=1041920 00:09:29.623 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=1028 count=2035 conv=notrunc 00:09:29.623 2035+0 records in 00:09:29.623 2035+0 records out 00:09:29.623 1041920 bytes (1.0 MB, 1018 KiB) copied, 0.00481673 s, 216 MB/s 00:09:29.623 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 526336 -l 1041920 /dev/nbd0 00:09:29.623 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:29.623 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:29.623 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:29.623 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:29.624 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@39 -- # unmap_off=164352 00:09:29.624 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@40 -- # 
unmap_len=233472 00:09:29.624 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@43 -- # dd if=/dev/zero of=/raidtest/raidrandtest bs=512 seek=321 count=456 conv=notrunc 00:09:29.624 456+0 records in 00:09:29.624 456+0 records out 00:09:29.624 233472 bytes (233 kB, 228 KiB) copied, 0.00114527 s, 204 MB/s 00:09:29.624 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@46 -- # blkdiscard -o 164352 -l 233472 /dev/nbd0 00:09:29.624 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@47 -- # blockdev --flushbufs /dev/nbd0 00:09:29.624 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@50 -- # cmp -b -n 2097152 /raidtest/raidrandtest /dev/nbd0 00:09:29.624 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i++ )) 00:09:29.624 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@38 -- # (( i < 3 )) 00:09:29.624 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@54 -- # return 0 00:09:29.624 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@104 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:09:29.624 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:29.624 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:09:29.624 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@50 -- # local nbd_list 00:09:29.624 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@51 -- # local i 00:09:29.624 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:09:29.624 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:09:29.883 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 
00:09:29.883 [2024-06-10 15:48:35.234758] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:29.883 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:09:29.883 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:09:29.883 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:09:29.883 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:09:29.883 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:09:29.883 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@41 -- # break 00:09:29.883 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@45 -- # return 0 00:09:29.883 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # nbd_get_count /var/tmp/spdk-raid.sock 00:09:29.883 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:09:29.883 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_get_disks 00:09:30.141 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # echo '[]' 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # echo '' 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- 
# true 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@65 -- # count=0 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- bdev/nbd_common.sh@66 -- # echo 0 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@105 -- # count=0 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@106 -- # '[' 0 -ne 0 ']' 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@110 -- # killprocess 2629381 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@949 -- # '[' -z 2629381 ']' 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@953 -- # kill -0 2629381 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # uname 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2629381 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2629381' 00:09:30.142 killing process with pid 2629381 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@968 -- # kill 2629381 00:09:30.142 [2024-06-10 15:48:35.601750] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:30.142 [2024-06-10 15:48:35.601813] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:30.142 [2024-06-10 15:48:35.601855] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 
00:09:30.142 [2024-06-10 15:48:35.601864] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c61340 name raid, state offline 00:09:30.142 15:48:35 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@973 -- # wait 2629381 00:09:30.142 [2024-06-10 15:48:35.618105] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:30.401 15:48:35 bdev_raid.raid_function_test_concat -- bdev/bdev_raid.sh@112 -- # return 0 00:09:30.401 00:09:30.401 real 0m3.274s 00:09:30.401 user 0m4.655s 00:09:30.401 sys 0m0.902s 00:09:30.401 15:48:35 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:30.401 15:48:35 bdev_raid.raid_function_test_concat -- common/autotest_common.sh@10 -- # set +x 00:09:30.401 ************************************ 00:09:30.401 END TEST raid_function_test_concat 00:09:30.401 ************************************ 00:09:30.401 15:48:35 bdev_raid -- bdev/bdev_raid.sh@863 -- # run_test raid0_resize_test raid0_resize_test 00:09:30.401 15:48:35 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:09:30.401 15:48:35 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:30.401 15:48:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:30.401 ************************************ 00:09:30.401 START TEST raid0_resize_test 00:09:30.401 ************************************ 00:09:30.401 15:48:35 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1124 -- # raid0_resize_test 00:09:30.401 15:48:35 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@347 -- # local blksize=512 00:09:30.401 15:48:35 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@348 -- # local bdev_size_mb=32 00:09:30.401 15:48:35 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@349 -- # local new_bdev_size_mb=64 00:09:30.401 15:48:35 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@350 -- # local blkcnt 00:09:30.401 15:48:35 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@351 -- # local 
raid_size_mb 00:09:30.401 15:48:35 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@352 -- # local new_raid_size_mb 00:09:30.401 15:48:35 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@355 -- # raid_pid=2629928 00:09:30.401 15:48:35 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@356 -- # echo 'Process raid pid: 2629928' 00:09:30.401 Process raid pid: 2629928 00:09:30.401 15:48:35 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@354 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:30.401 15:48:35 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@357 -- # waitforlisten 2629928 /var/tmp/spdk-raid.sock 00:09:30.401 15:48:35 bdev_raid.raid0_resize_test -- common/autotest_common.sh@830 -- # '[' -z 2629928 ']' 00:09:30.401 15:48:35 bdev_raid.raid0_resize_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:30.401 15:48:35 bdev_raid.raid0_resize_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:09:30.401 15:48:35 bdev_raid.raid0_resize_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:30.401 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:30.401 15:48:35 bdev_raid.raid0_resize_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:30.401 15:48:35 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:09:30.660 [2024-06-10 15:48:35.940709] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:09:30.660 [2024-06-10 15:48:35.940762] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:30.660 [2024-06-10 15:48:36.039556] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:30.660 [2024-06-10 15:48:36.134372] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:09:30.920 [2024-06-10 15:48:36.201030] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:30.920 [2024-06-10 15:48:36.201055] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:31.487 15:48:36 bdev_raid.raid0_resize_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:31.487 15:48:36 bdev_raid.raid0_resize_test -- common/autotest_common.sh@863 -- # return 0 00:09:31.487 15:48:36 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@359 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_1 32 512 00:09:31.746 Base_1 00:09:31.746 15:48:37 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@360 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_create Base_2 32 512 00:09:32.004 Base_2 00:09:32.004 15:48:37 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@362 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r 0 -b 'Base_1 Base_2' -n Raid 00:09:32.263 [2024-06-10 15:48:37.617358] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_1 is claimed 00:09:32.263 [2024-06-10 15:48:37.618836] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev Base_2 is claimed 00:09:32.263 [2024-06-10 15:48:37.618881] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x11012c0 00:09:32.263 [2024-06-10 15:48:37.618890] 
bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:32.263 [2024-06-10 15:48:37.619097] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12a4be0 00:09:32.263 [2024-06-10 15:48:37.619190] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x11012c0 00:09:32.263 [2024-06-10 15:48:37.619198] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Raid, raid_bdev 0x11012c0 00:09:32.263 [2024-06-10 15:48:37.619304] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:32.263 15:48:37 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@365 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_1 64 00:09:32.522 [2024-06-10 15:48:37.870023] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:09:32.522 [2024-06-10 15:48:37.870042] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_1' was resized: old size 65536, new size 131072 00:09:32.522 true 00:09:32.522 15:48:37 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:09:32.522 15:48:37 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # jq '.[].num_blocks' 00:09:32.781 [2024-06-10 15:48:38.122824] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:32.781 15:48:38 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@368 -- # blkcnt=131072 00:09:32.781 15:48:38 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@369 -- # raid_size_mb=64 00:09:32.781 15:48:38 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@370 -- # '[' 64 '!=' 64 ']' 00:09:32.781 15:48:38 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@376 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_null_resize Base_2 64 00:09:33.039 
[2024-06-10 15:48:38.375350] bdev_raid.c:2262:raid_bdev_resize_base_bdev: *DEBUG*: raid_bdev_resize_base_bdev 00:09:33.039 [2024-06-10 15:48:38.375366] bdev_raid.c:2275:raid_bdev_resize_base_bdev: *NOTICE*: base_bdev 'Base_2' was resized: old size 65536, new size 131072 00:09:33.039 [2024-06-10 15:48:38.375386] bdev_raid.c:2289:raid_bdev_resize_base_bdev: *NOTICE*: raid bdev 'Raid': block count was changed from 131072 to 262144 00:09:33.039 true 00:09:33.039 15:48:38 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Raid 00:09:33.039 15:48:38 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # jq '.[].num_blocks' 00:09:33.298 [2024-06-10 15:48:38.628156] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:33.298 15:48:38 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@379 -- # blkcnt=262144 00:09:33.298 15:48:38 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@380 -- # raid_size_mb=128 00:09:33.298 15:48:38 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@381 -- # '[' 128 '!=' 128 ']' 00:09:33.298 15:48:38 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@386 -- # killprocess 2629928 00:09:33.298 15:48:38 bdev_raid.raid0_resize_test -- common/autotest_common.sh@949 -- # '[' -z 2629928 ']' 00:09:33.298 15:48:38 bdev_raid.raid0_resize_test -- common/autotest_common.sh@953 -- # kill -0 2629928 00:09:33.298 15:48:38 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # uname 00:09:33.298 15:48:38 bdev_raid.raid0_resize_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:33.298 15:48:38 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2629928 00:09:33.298 15:48:38 bdev_raid.raid0_resize_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:09:33.298 15:48:38 bdev_raid.raid0_resize_test -- common/autotest_common.sh@959 -- # '[' 
reactor_0 = sudo ']' 00:09:33.299 15:48:38 bdev_raid.raid0_resize_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2629928' 00:09:33.299 killing process with pid 2629928 00:09:33.299 15:48:38 bdev_raid.raid0_resize_test -- common/autotest_common.sh@968 -- # kill 2629928 00:09:33.299 [2024-06-10 15:48:38.692792] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:33.299 [2024-06-10 15:48:38.692842] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:33.299 [2024-06-10 15:48:38.692881] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:09:33.299 [2024-06-10 15:48:38.692890] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x11012c0 name Raid, state offline 00:09:33.299 15:48:38 bdev_raid.raid0_resize_test -- common/autotest_common.sh@973 -- # wait 2629928 00:09:33.299 [2024-06-10 15:48:38.694130] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:33.558 15:48:38 bdev_raid.raid0_resize_test -- bdev/bdev_raid.sh@388 -- # return 0 00:09:33.558 00:09:33.558 real 0m2.993s 00:09:33.558 user 0m4.745s 00:09:33.558 sys 0m0.535s 00:09:33.558 15:48:38 bdev_raid.raid0_resize_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:33.558 15:48:38 bdev_raid.raid0_resize_test -- common/autotest_common.sh@10 -- # set +x 00:09:33.558 ************************************ 00:09:33.558 END TEST raid0_resize_test 00:09:33.558 ************************************ 00:09:33.558 15:48:38 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:09:33.558 15:48:38 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:09:33.558 15:48:38 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 2 false 00:09:33.558 15:48:38 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:09:33.558 15:48:38 bdev_raid -- common/autotest_common.sh@1106 -- 
# xtrace_disable 00:09:33.558 15:48:38 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:33.558 ************************************ 00:09:33.558 START TEST raid_state_function_test 00:09:33.558 ************************************ 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 2 false 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 
00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2630499 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2630499' 00:09:33.558 Process raid pid: 2630499 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2630499 /var/tmp/spdk-raid.sock 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 2630499 ']' 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on 
UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:33.558 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:33.558 15:48:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:33.558 [2024-06-10 15:48:39.012088] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:09:33.558 [2024-06-10 15:48:39.012142] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:09:33.818 [2024-06-10 15:48:39.110420] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:33.818 [2024-06-10 15:48:39.204912] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:09:33.818 [2024-06-10 15:48:39.263407] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:33.818 [2024-06-10 15:48:39.263438] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:34.755 15:48:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:34.755 15:48:39 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:09:34.755 15:48:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:34.755 [2024-06-10 15:48:40.195111] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:34.755 [2024-06-10 15:48:40.195153] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:34.755 [2024-06-10 15:48:40.195162] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable 
to find bdev with name: BaseBdev2 00:09:34.755 [2024-06-10 15:48:40.195171] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:34.755 15:48:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:34.755 15:48:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:34.755 15:48:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:34.755 15:48:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:34.755 15:48:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:34.755 15:48:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:34.755 15:48:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:34.755 15:48:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:34.755 15:48:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:34.755 15:48:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:34.755 15:48:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:34.755 15:48:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:35.013 15:48:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:35.013 "name": "Existed_Raid", 00:09:35.013 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:35.013 "strip_size_kb": 64, 00:09:35.013 "state": "configuring", 00:09:35.013 "raid_level": "raid0", 00:09:35.013 "superblock": 
false, 00:09:35.013 "num_base_bdevs": 2, 00:09:35.013 "num_base_bdevs_discovered": 0, 00:09:35.013 "num_base_bdevs_operational": 2, 00:09:35.013 "base_bdevs_list": [ 00:09:35.013 { 00:09:35.013 "name": "BaseBdev1", 00:09:35.013 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:35.013 "is_configured": false, 00:09:35.013 "data_offset": 0, 00:09:35.013 "data_size": 0 00:09:35.013 }, 00:09:35.013 { 00:09:35.013 "name": "BaseBdev2", 00:09:35.013 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:35.013 "is_configured": false, 00:09:35.013 "data_offset": 0, 00:09:35.013 "data_size": 0 00:09:35.013 } 00:09:35.013 ] 00:09:35.013 }' 00:09:35.013 15:48:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:35.013 15:48:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:35.949 15:48:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:35.949 [2024-06-10 15:48:41.330092] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:35.949 [2024-06-10 15:48:41.330119] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd62120 name Existed_Raid, state configuring 00:09:35.949 15:48:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:36.207 [2024-06-10 15:48:41.590796] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:09:36.207 [2024-06-10 15:48:41.590820] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:09:36.207 [2024-06-10 15:48:41.590829] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:36.207 [2024-06-10 
15:48:41.590839] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:36.207 15:48:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:09:36.466 [2024-06-10 15:48:41.852931] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:36.466 BaseBdev1 00:09:36.466 15:48:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:09:36.466 15:48:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:09:36.466 15:48:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:09:36.466 15:48:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:09:36.466 15:48:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:09:36.466 15:48:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:09:36.466 15:48:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:36.737 15:48:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:09:37.001 [ 00:09:37.001 { 00:09:37.001 "name": "BaseBdev1", 00:09:37.001 "aliases": [ 00:09:37.001 "7c018152-b318-4bb2-a356-895dbdf1d7ea" 00:09:37.001 ], 00:09:37.001 "product_name": "Malloc disk", 00:09:37.001 "block_size": 512, 00:09:37.001 "num_blocks": 65536, 00:09:37.001 "uuid": "7c018152-b318-4bb2-a356-895dbdf1d7ea", 00:09:37.001 "assigned_rate_limits": { 00:09:37.001 "rw_ios_per_sec": 0, 00:09:37.001 
"rw_mbytes_per_sec": 0, 00:09:37.001 "r_mbytes_per_sec": 0, 00:09:37.001 "w_mbytes_per_sec": 0 00:09:37.001 }, 00:09:37.001 "claimed": true, 00:09:37.001 "claim_type": "exclusive_write", 00:09:37.001 "zoned": false, 00:09:37.001 "supported_io_types": { 00:09:37.001 "read": true, 00:09:37.001 "write": true, 00:09:37.001 "unmap": true, 00:09:37.001 "write_zeroes": true, 00:09:37.001 "flush": true, 00:09:37.001 "reset": true, 00:09:37.001 "compare": false, 00:09:37.001 "compare_and_write": false, 00:09:37.001 "abort": true, 00:09:37.001 "nvme_admin": false, 00:09:37.001 "nvme_io": false 00:09:37.001 }, 00:09:37.001 "memory_domains": [ 00:09:37.001 { 00:09:37.001 "dma_device_id": "system", 00:09:37.001 "dma_device_type": 1 00:09:37.001 }, 00:09:37.001 { 00:09:37.001 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:37.001 "dma_device_type": 2 00:09:37.001 } 00:09:37.001 ], 00:09:37.001 "driver_specific": {} 00:09:37.001 } 00:09:37.001 ] 00:09:37.001 15:48:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:09:37.001 15:48:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:37.001 15:48:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:37.001 15:48:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:37.001 15:48:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:37.001 15:48:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:37.001 15:48:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:37.001 15:48:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:37.001 15:48:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:09:37.001 15:48:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:37.001 15:48:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:37.002 15:48:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:37.002 15:48:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:37.260 15:48:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:37.260 "name": "Existed_Raid", 00:09:37.260 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:37.260 "strip_size_kb": 64, 00:09:37.260 "state": "configuring", 00:09:37.260 "raid_level": "raid0", 00:09:37.260 "superblock": false, 00:09:37.260 "num_base_bdevs": 2, 00:09:37.260 "num_base_bdevs_discovered": 1, 00:09:37.260 "num_base_bdevs_operational": 2, 00:09:37.260 "base_bdevs_list": [ 00:09:37.260 { 00:09:37.260 "name": "BaseBdev1", 00:09:37.260 "uuid": "7c018152-b318-4bb2-a356-895dbdf1d7ea", 00:09:37.260 "is_configured": true, 00:09:37.260 "data_offset": 0, 00:09:37.260 "data_size": 65536 00:09:37.260 }, 00:09:37.260 { 00:09:37.260 "name": "BaseBdev2", 00:09:37.260 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:37.260 "is_configured": false, 00:09:37.260 "data_offset": 0, 00:09:37.260 "data_size": 0 00:09:37.260 } 00:09:37.260 ] 00:09:37.260 }' 00:09:37.260 15:48:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:37.260 15:48:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:37.825 15:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:09:38.083 [2024-06-10 15:48:43.509385] 
bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:09:38.083 [2024-06-10 15:48:43.509421] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd619f0 name Existed_Raid, state configuring 00:09:38.083 15:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:09:38.341 [2024-06-10 15:48:43.766099] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:09:38.341 [2024-06-10 15:48:43.767619] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:09:38.341 [2024-06-10 15:48:43.767648] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:09:38.341 15:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:09:38.341 15:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:38.341 15:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2 00:09:38.341 15:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:38.341 15:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:09:38.341 15:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:38.341 15:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:38.341 15:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:38.341 15:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:38.341 15:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # 
local num_base_bdevs 00:09:38.341 15:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:38.341 15:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:38.341 15:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:38.341 15:48:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:38.600 15:48:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:38.600 "name": "Existed_Raid", 00:09:38.600 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:38.600 "strip_size_kb": 64, 00:09:38.600 "state": "configuring", 00:09:38.600 "raid_level": "raid0", 00:09:38.600 "superblock": false, 00:09:38.600 "num_base_bdevs": 2, 00:09:38.600 "num_base_bdevs_discovered": 1, 00:09:38.600 "num_base_bdevs_operational": 2, 00:09:38.600 "base_bdevs_list": [ 00:09:38.600 { 00:09:38.600 "name": "BaseBdev1", 00:09:38.600 "uuid": "7c018152-b318-4bb2-a356-895dbdf1d7ea", 00:09:38.600 "is_configured": true, 00:09:38.600 "data_offset": 0, 00:09:38.600 "data_size": 65536 00:09:38.600 }, 00:09:38.600 { 00:09:38.600 "name": "BaseBdev2", 00:09:38.600 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:38.600 "is_configured": false, 00:09:38.600 "data_offset": 0, 00:09:38.600 "data_size": 0 00:09:38.600 } 00:09:38.600 ] 00:09:38.600 }' 00:09:38.600 15:48:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:38.600 15:48:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:39.166 15:48:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:09:39.425 [2024-06-10 
15:48:44.912378] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:09:39.425 [2024-06-10 15:48:44.912425] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xd62770 00:09:39.425 [2024-06-10 15:48:44.912432] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:09:39.425 [2024-06-10 15:48:44.912634] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd63eb0 00:09:39.425 [2024-06-10 15:48:44.912757] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xd62770 00:09:39.425 [2024-06-10 15:48:44.912766] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xd62770 00:09:39.425 [2024-06-10 15:48:44.912928] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:39.425 BaseBdev2 00:09:39.425 15:48:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:09:39.425 15:48:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:09:39.425 15:48:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:09:39.425 15:48:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:09:39.425 15:48:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:09:39.425 15:48:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:09:39.425 15:48:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:09:39.683 15:48:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:09:39.941 [ 
00:09:39.941 { 00:09:39.941 "name": "BaseBdev2", 00:09:39.941 "aliases": [ 00:09:39.941 "a69735a5-6863-4ec0-b57f-4849dce70739" 00:09:39.941 ], 00:09:39.941 "product_name": "Malloc disk", 00:09:39.941 "block_size": 512, 00:09:39.941 "num_blocks": 65536, 00:09:39.941 "uuid": "a69735a5-6863-4ec0-b57f-4849dce70739", 00:09:39.941 "assigned_rate_limits": { 00:09:39.941 "rw_ios_per_sec": 0, 00:09:39.941 "rw_mbytes_per_sec": 0, 00:09:39.941 "r_mbytes_per_sec": 0, 00:09:39.941 "w_mbytes_per_sec": 0 00:09:39.941 }, 00:09:39.941 "claimed": true, 00:09:39.941 "claim_type": "exclusive_write", 00:09:39.941 "zoned": false, 00:09:39.941 "supported_io_types": { 00:09:39.941 "read": true, 00:09:39.941 "write": true, 00:09:39.941 "unmap": true, 00:09:39.941 "write_zeroes": true, 00:09:39.941 "flush": true, 00:09:39.941 "reset": true, 00:09:39.941 "compare": false, 00:09:39.941 "compare_and_write": false, 00:09:39.941 "abort": true, 00:09:39.941 "nvme_admin": false, 00:09:39.941 "nvme_io": false 00:09:39.941 }, 00:09:39.941 "memory_domains": [ 00:09:39.941 { 00:09:39.941 "dma_device_id": "system", 00:09:39.941 "dma_device_type": 1 00:09:39.941 }, 00:09:39.941 { 00:09:39.942 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:39.942 "dma_device_type": 2 00:09:39.942 } 00:09:39.942 ], 00:09:39.942 "driver_specific": {} 00:09:39.942 } 00:09:39.942 ] 00:09:39.942 15:48:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:09:39.942 15:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:09:39.942 15:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:09:39.942 15:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2 00:09:39.942 15:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:39.942 15:48:45 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:39.942 15:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:39.942 15:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:39.942 15:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:39.942 15:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:39.942 15:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:39.942 15:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:39.942 15:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:39.942 15:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:39.942 15:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:40.200 15:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:40.200 "name": "Existed_Raid", 00:09:40.200 "uuid": "c6a2f324-2430-4cc3-b531-22544aa1fcdd", 00:09:40.200 "strip_size_kb": 64, 00:09:40.200 "state": "online", 00:09:40.200 "raid_level": "raid0", 00:09:40.200 "superblock": false, 00:09:40.200 "num_base_bdevs": 2, 00:09:40.200 "num_base_bdevs_discovered": 2, 00:09:40.200 "num_base_bdevs_operational": 2, 00:09:40.200 "base_bdevs_list": [ 00:09:40.200 { 00:09:40.200 "name": "BaseBdev1", 00:09:40.200 "uuid": "7c018152-b318-4bb2-a356-895dbdf1d7ea", 00:09:40.200 "is_configured": true, 00:09:40.200 "data_offset": 0, 00:09:40.200 "data_size": 65536 00:09:40.200 }, 00:09:40.200 { 00:09:40.200 "name": "BaseBdev2", 00:09:40.200 "uuid": "a69735a5-6863-4ec0-b57f-4849dce70739", 
00:09:40.200 "is_configured": true, 00:09:40.200 "data_offset": 0, 00:09:40.200 "data_size": 65536 00:09:40.200 } 00:09:40.200 ] 00:09:40.200 }' 00:09:40.200 15:48:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:40.200 15:48:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:41.135 15:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:09:41.135 15:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:09:41.135 15:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:41.135 15:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:09:41.135 15:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:41.135 15:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:41.135 15:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:09:41.135 15:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:41.135 [2024-06-10 15:48:46.553047] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:41.135 15:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:41.135 "name": "Existed_Raid", 00:09:41.135 "aliases": [ 00:09:41.135 "c6a2f324-2430-4cc3-b531-22544aa1fcdd" 00:09:41.135 ], 00:09:41.135 "product_name": "Raid Volume", 00:09:41.135 "block_size": 512, 00:09:41.135 "num_blocks": 131072, 00:09:41.135 "uuid": "c6a2f324-2430-4cc3-b531-22544aa1fcdd", 00:09:41.135 "assigned_rate_limits": { 00:09:41.135 "rw_ios_per_sec": 0, 00:09:41.135 "rw_mbytes_per_sec": 0, 00:09:41.135 "r_mbytes_per_sec": 0, 
00:09:41.135 "w_mbytes_per_sec": 0 00:09:41.135 }, 00:09:41.135 "claimed": false, 00:09:41.135 "zoned": false, 00:09:41.135 "supported_io_types": { 00:09:41.135 "read": true, 00:09:41.135 "write": true, 00:09:41.135 "unmap": true, 00:09:41.135 "write_zeroes": true, 00:09:41.135 "flush": true, 00:09:41.135 "reset": true, 00:09:41.135 "compare": false, 00:09:41.135 "compare_and_write": false, 00:09:41.135 "abort": false, 00:09:41.135 "nvme_admin": false, 00:09:41.135 "nvme_io": false 00:09:41.135 }, 00:09:41.135 "memory_domains": [ 00:09:41.135 { 00:09:41.135 "dma_device_id": "system", 00:09:41.135 "dma_device_type": 1 00:09:41.135 }, 00:09:41.135 { 00:09:41.135 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:41.135 "dma_device_type": 2 00:09:41.135 }, 00:09:41.135 { 00:09:41.135 "dma_device_id": "system", 00:09:41.135 "dma_device_type": 1 00:09:41.135 }, 00:09:41.135 { 00:09:41.135 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:41.135 "dma_device_type": 2 00:09:41.135 } 00:09:41.135 ], 00:09:41.135 "driver_specific": { 00:09:41.135 "raid": { 00:09:41.135 "uuid": "c6a2f324-2430-4cc3-b531-22544aa1fcdd", 00:09:41.135 "strip_size_kb": 64, 00:09:41.135 "state": "online", 00:09:41.135 "raid_level": "raid0", 00:09:41.135 "superblock": false, 00:09:41.135 "num_base_bdevs": 2, 00:09:41.135 "num_base_bdevs_discovered": 2, 00:09:41.135 "num_base_bdevs_operational": 2, 00:09:41.135 "base_bdevs_list": [ 00:09:41.135 { 00:09:41.135 "name": "BaseBdev1", 00:09:41.135 "uuid": "7c018152-b318-4bb2-a356-895dbdf1d7ea", 00:09:41.135 "is_configured": true, 00:09:41.135 "data_offset": 0, 00:09:41.135 "data_size": 65536 00:09:41.135 }, 00:09:41.135 { 00:09:41.135 "name": "BaseBdev2", 00:09:41.135 "uuid": "a69735a5-6863-4ec0-b57f-4849dce70739", 00:09:41.135 "is_configured": true, 00:09:41.135 "data_offset": 0, 00:09:41.135 "data_size": 65536 00:09:41.135 } 00:09:41.135 ] 00:09:41.135 } 00:09:41.135 } 00:09:41.135 }' 00:09:41.135 15:48:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:41.135 15:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:09:41.135 BaseBdev2' 00:09:41.135 15:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:41.135 15:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:09:41.135 15:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:41.393 15:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:41.393 "name": "BaseBdev1", 00:09:41.393 "aliases": [ 00:09:41.393 "7c018152-b318-4bb2-a356-895dbdf1d7ea" 00:09:41.393 ], 00:09:41.393 "product_name": "Malloc disk", 00:09:41.393 "block_size": 512, 00:09:41.393 "num_blocks": 65536, 00:09:41.393 "uuid": "7c018152-b318-4bb2-a356-895dbdf1d7ea", 00:09:41.393 "assigned_rate_limits": { 00:09:41.393 "rw_ios_per_sec": 0, 00:09:41.393 "rw_mbytes_per_sec": 0, 00:09:41.393 "r_mbytes_per_sec": 0, 00:09:41.393 "w_mbytes_per_sec": 0 00:09:41.393 }, 00:09:41.393 "claimed": true, 00:09:41.393 "claim_type": "exclusive_write", 00:09:41.393 "zoned": false, 00:09:41.393 "supported_io_types": { 00:09:41.393 "read": true, 00:09:41.393 "write": true, 00:09:41.393 "unmap": true, 00:09:41.393 "write_zeroes": true, 00:09:41.393 "flush": true, 00:09:41.393 "reset": true, 00:09:41.393 "compare": false, 00:09:41.393 "compare_and_write": false, 00:09:41.393 "abort": true, 00:09:41.393 "nvme_admin": false, 00:09:41.393 "nvme_io": false 00:09:41.393 }, 00:09:41.393 "memory_domains": [ 00:09:41.393 { 00:09:41.393 "dma_device_id": "system", 00:09:41.393 "dma_device_type": 1 00:09:41.393 }, 00:09:41.393 { 00:09:41.393 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:41.393 
"dma_device_type": 2 00:09:41.393 } 00:09:41.393 ], 00:09:41.393 "driver_specific": {} 00:09:41.393 }' 00:09:41.393 15:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:41.652 15:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:41.652 15:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:41.652 15:48:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:41.652 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:41.652 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:41.652 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:41.652 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:41.910 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:41.910 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:41.910 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:41.910 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:41.910 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:41.910 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:09:41.910 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:42.169 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:42.169 "name": "BaseBdev2", 00:09:42.169 "aliases": [ 00:09:42.169 "a69735a5-6863-4ec0-b57f-4849dce70739" 00:09:42.169 ], 00:09:42.169 
"product_name": "Malloc disk", 00:09:42.169 "block_size": 512, 00:09:42.169 "num_blocks": 65536, 00:09:42.169 "uuid": "a69735a5-6863-4ec0-b57f-4849dce70739", 00:09:42.169 "assigned_rate_limits": { 00:09:42.169 "rw_ios_per_sec": 0, 00:09:42.169 "rw_mbytes_per_sec": 0, 00:09:42.169 "r_mbytes_per_sec": 0, 00:09:42.169 "w_mbytes_per_sec": 0 00:09:42.169 }, 00:09:42.169 "claimed": true, 00:09:42.169 "claim_type": "exclusive_write", 00:09:42.169 "zoned": false, 00:09:42.169 "supported_io_types": { 00:09:42.169 "read": true, 00:09:42.169 "write": true, 00:09:42.169 "unmap": true, 00:09:42.169 "write_zeroes": true, 00:09:42.169 "flush": true, 00:09:42.169 "reset": true, 00:09:42.169 "compare": false, 00:09:42.169 "compare_and_write": false, 00:09:42.169 "abort": true, 00:09:42.169 "nvme_admin": false, 00:09:42.169 "nvme_io": false 00:09:42.169 }, 00:09:42.169 "memory_domains": [ 00:09:42.169 { 00:09:42.169 "dma_device_id": "system", 00:09:42.169 "dma_device_type": 1 00:09:42.169 }, 00:09:42.169 { 00:09:42.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:42.169 "dma_device_type": 2 00:09:42.169 } 00:09:42.169 ], 00:09:42.169 "driver_specific": {} 00:09:42.169 }' 00:09:42.169 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:42.169 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:42.169 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:42.169 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:42.169 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:42.427 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:42.427 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:42.427 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:09:42.427 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:42.427 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:42.427 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:42.427 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:42.427 15:48:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:09:42.686 [2024-06-10 15:48:48.101040] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:09:42.686 [2024-06-10 15:48:48.101063] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:09:42.686 [2024-06-10 15:48:48.101104] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:09:42.686 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:09:42.686 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:09:42.686 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:09:42.686 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:09:42.686 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:09:42.686 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1 00:09:42.686 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:09:42.686 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:09:42.686 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:42.686 
15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:42.686 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:09:42.686 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:42.686 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:42.686 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:42.686 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:42.686 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:42.686 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:09:42.943 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:42.943 "name": "Existed_Raid", 00:09:42.943 "uuid": "c6a2f324-2430-4cc3-b531-22544aa1fcdd", 00:09:42.943 "strip_size_kb": 64, 00:09:42.943 "state": "offline", 00:09:42.943 "raid_level": "raid0", 00:09:42.943 "superblock": false, 00:09:42.943 "num_base_bdevs": 2, 00:09:42.943 "num_base_bdevs_discovered": 1, 00:09:42.943 "num_base_bdevs_operational": 1, 00:09:42.943 "base_bdevs_list": [ 00:09:42.943 { 00:09:42.943 "name": null, 00:09:42.943 "uuid": "00000000-0000-0000-0000-000000000000", 00:09:42.943 "is_configured": false, 00:09:42.943 "data_offset": 0, 00:09:42.943 "data_size": 65536 00:09:42.943 }, 00:09:42.943 { 00:09:42.943 "name": "BaseBdev2", 00:09:42.943 "uuid": "a69735a5-6863-4ec0-b57f-4849dce70739", 00:09:42.943 "is_configured": true, 00:09:42.943 "data_offset": 0, 00:09:42.943 "data_size": 65536 00:09:42.943 } 00:09:42.943 ] 00:09:42.943 }' 00:09:42.944 15:48:48 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:42.944 15:48:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:43.547 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:09:43.547 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:43.547 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:43.547 15:48:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:09:43.806 15:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:09:43.806 15:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:43.807 15:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:44.065 [2024-06-10 15:48:49.481805] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:44.065 [2024-06-10 15:48:49.481851] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xd62770 name Existed_Raid, state offline 00:09:44.065 15:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:09:44.065 15:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:44.065 15:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:44.065 15:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:09:44.323 15:48:49 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@293 -- # raid_bdev= 00:09:44.323 15:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:09:44.323 15:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:09:44.323 15:48:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2630499 00:09:44.323 15:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 2630499 ']' 00:09:44.323 15:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 2630499 00:09:44.323 15:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:09:44.323 15:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:44.323 15:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2630499 00:09:44.323 15:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:09:44.323 15:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:09:44.323 15:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2630499' 00:09:44.323 killing process with pid 2630499 00:09:44.323 15:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 2630499 00:09:44.323 [2024-06-10 15:48:49.809877] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:44.323 15:48:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 2630499 00:09:44.323 [2024-06-10 15:48:49.810732] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:44.581 15:48:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:09:44.581 00:09:44.581 real 0m11.060s 00:09:44.581 user 0m20.158s 00:09:44.581 sys 0m1.621s 00:09:44.581 15:48:50 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:44.581 15:48:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:09:44.581 ************************************ 00:09:44.581 END TEST raid_state_function_test 00:09:44.581 ************************************ 00:09:44.581 15:48:50 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 2 true 00:09:44.581 15:48:50 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:09:44.581 15:48:50 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:44.581 15:48:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:44.581 ************************************ 00:09:44.581 START TEST raid_state_function_test_sb 00:09:44.581 ************************************ 00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 2 true 00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 
00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2
00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ ))
00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs ))
00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2')
00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs
00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid
00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size
00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg
00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg
00:09:44.581 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']'
00:09:44.582 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64
00:09:44.582 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64'
00:09:44.582 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']'
00:09:44.582 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s
00:09:44.582 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2632632
00:09:44.582 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2632632'
00:09:44.582 Process raid pid: 2632632
00:09:44.582 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2632632 /var/tmp/spdk-raid.sock
00:09:44.582 15:48:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 2632632 ']'
00:09:44.582 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid
00:09:44.582 15:48:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock
00:09:44.582 15:48:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100
00:09:44.582 15:48:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...'
00:09:44.582 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...
00:09:44.582 15:48:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable
00:09:44.582 15:48:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:09:44.840 [2024-06-10 15:48:50.135560] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization...
00:09:44.840 [2024-06-10 15:48:50.135616] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ]
00:09:44.840 [2024-06-10 15:48:50.235403] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:09:44.840 [2024-06-10 15:48:50.330261] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:09:45.099 [2024-06-10 15:48:50.392332] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:09:45.099 [2024-06-10 15:48:50.392364] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size
00:09:45.666 15:48:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 ))
00:09:45.666 15:48:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0
00:09:45.666 15:48:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:09:45.925 [2024-06-10 15:48:51.223840] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:09:45.925 [2024-06-10 15:48:51.223879] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:09:45.925 [2024-06-10 15:48:51.223889] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:09:45.925 [2024-06-10 15:48:51.223897] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:09:45.925 15:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2
00:09:45.925 15:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:09:45.925 15:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:09:45.925 15:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:09:45.925 15:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:09:45.925 15:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:09:45.925 15:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:09:45.925 15:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:09:45.925 15:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:09:45.925 15:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:09:45.925 15:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:09:45.925 15:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:09:46.184 15:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:09:46.184 "name": "Existed_Raid",
00:09:46.184 "uuid": "79fa5641-d9c0-4956-8b2e-50b69cf3fb26",
00:09:46.184 "strip_size_kb": 64,
00:09:46.184 "state": "configuring",
00:09:46.184 "raid_level": "raid0",
00:09:46.184 "superblock": true,
00:09:46.184 "num_base_bdevs": 2,
00:09:46.184 "num_base_bdevs_discovered": 0,
00:09:46.184 "num_base_bdevs_operational": 2,
00:09:46.184 "base_bdevs_list": [
00:09:46.184 {
00:09:46.184 "name": "BaseBdev1",
00:09:46.184 "uuid": "00000000-0000-0000-0000-000000000000",
00:09:46.184 "is_configured": false,
00:09:46.184 "data_offset": 0,
00:09:46.184 "data_size": 0
00:09:46.184 },
00:09:46.184 {
00:09:46.184 "name": "BaseBdev2",
00:09:46.184 "uuid": "00000000-0000-0000-0000-000000000000",
00:09:46.184 "is_configured": false,
00:09:46.184 "data_offset": 0,
00:09:46.184 "data_size": 0
00:09:46.184 }
00:09:46.184 ]
00:09:46.184 }'
00:09:46.184 15:48:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:09:46.184 15:48:51 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:09:46.749 15:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:09:47.007 [2024-06-10 15:48:52.266626] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:09:47.007 [2024-06-10 15:48:52.266654] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x122b120 name Existed_Raid, state configuring
00:09:47.007 15:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:09:47.007 [2024-06-10 15:48:52.439112] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1
00:09:47.007 [2024-06-10 15:48:52.439134] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now
00:09:47.007 [2024-06-10 15:48:52.439142] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:09:47.007 [2024-06-10 15:48:52.439150] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:09:47.007 15:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1
00:09:47.265 [2024-06-10 15:48:52.629103] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:09:47.265 BaseBdev1
00:09:47.265 15:48:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1
00:09:47.265 15:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1
00:09:47.265 15:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout=
00:09:47.265 15:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i
00:09:47.265 15:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]]
00:09:47.265 15:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000
00:09:47.265 15:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:09:47.523 15:48:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000
00:09:47.782 [
00:09:47.782 {
00:09:47.782 "name": "BaseBdev1",
00:09:47.782 "aliases": [
00:09:47.782 "af9b9ddc-6af2-4676-bc03-65f4d97bc9bd"
00:09:47.782 ],
00:09:47.782 "product_name": "Malloc disk",
00:09:47.782 "block_size": 512,
00:09:47.782 "num_blocks": 65536,
00:09:47.782 "uuid": "af9b9ddc-6af2-4676-bc03-65f4d97bc9bd",
00:09:47.782 "assigned_rate_limits": {
00:09:47.782 "rw_ios_per_sec": 0,
00:09:47.782 "rw_mbytes_per_sec": 0,
00:09:47.782 "r_mbytes_per_sec": 0,
00:09:47.782 "w_mbytes_per_sec": 0
00:09:47.782 },
00:09:47.782 "claimed": true,
00:09:47.782 "claim_type": "exclusive_write",
00:09:47.782 "zoned": false,
00:09:47.782 "supported_io_types": {
00:09:47.782 "read": true,
00:09:47.782 "write": true,
00:09:47.782 "unmap": true,
00:09:47.782 "write_zeroes": true,
00:09:47.782 "flush": true,
00:09:47.782 "reset": true,
00:09:47.782 "compare": false,
00:09:47.782 "compare_and_write": false,
00:09:47.782 "abort": true,
00:09:47.782 "nvme_admin": false,
00:09:47.782 "nvme_io": false
00:09:47.782 },
00:09:47.782 "memory_domains": [
00:09:47.782 {
00:09:47.782 "dma_device_id": "system",
00:09:47.782 "dma_device_type": 1
00:09:47.782 },
00:09:47.782 {
00:09:47.782 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:09:47.782 "dma_device_type": 2
00:09:47.782 }
00:09:47.782 ],
00:09:47.782 "driver_specific": {}
00:09:47.782 }
00:09:47.782 ]
00:09:47.782 15:48:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0
00:09:47.782 15:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2
00:09:47.782 15:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:09:47.782 15:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:09:47.782 15:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:09:47.782 15:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:09:47.782 15:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:09:47.782 15:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:09:47.782 15:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:09:47.782 15:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:09:47.782 15:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:09:47.782 15:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:09:47.782 15:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:09:48.041 15:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:09:48.041 "name": "Existed_Raid",
00:09:48.041 "uuid": "089c4612-7829-4411-ab21-2a6be472d9aa",
00:09:48.041 "strip_size_kb": 64,
00:09:48.041 "state": "configuring",
00:09:48.041 "raid_level": "raid0",
00:09:48.041 "superblock": true,
00:09:48.041 "num_base_bdevs": 2,
00:09:48.041 "num_base_bdevs_discovered": 1,
00:09:48.041 "num_base_bdevs_operational": 2,
00:09:48.041 "base_bdevs_list": [
00:09:48.041 {
00:09:48.041 "name": "BaseBdev1",
00:09:48.041 "uuid": "af9b9ddc-6af2-4676-bc03-65f4d97bc9bd",
00:09:48.041 "is_configured": true,
00:09:48.041 "data_offset": 2048,
00:09:48.041 "data_size": 63488
00:09:48.041 },
00:09:48.041 {
00:09:48.041 "name": "BaseBdev2",
00:09:48.041 "uuid": "00000000-0000-0000-0000-000000000000",
00:09:48.041 "is_configured": false,
00:09:48.041 "data_offset": 0,
00:09:48.041 "data_size": 0
00:09:48.041 }
00:09:48.041 ]
00:09:48.041 }'
00:09:48.041 15:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:09:48.041 15:48:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:09:48.607 15:48:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid
00:09:48.607 [2024-06-10 15:48:54.101034] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid
00:09:48.607 [2024-06-10 15:48:54.101070] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x122a9f0 name Existed_Raid, state configuring
00:09:48.866 15:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid
00:09:48.866 [2024-06-10 15:48:54.357749] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed
00:09:48.866 [2024-06-10 15:48:54.359273] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2
00:09:48.866 [2024-06-10 15:48:54.359303] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now
00:09:49.124 15:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 ))
00:09:49.124 15:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:09:49.124 15:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 2
00:09:49.124 15:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:09:49.124 15:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:09:49.124 15:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:09:49.124 15:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:09:49.124 15:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:09:49.124 15:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:09:49.124 15:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:09:49.124 15:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:09:49.124 15:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:09:49.124 15:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:09:49.124 15:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:09:49.382 15:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:09:49.382 "name": "Existed_Raid",
00:09:49.382 "uuid": "1827518e-5c35-490e-81d6-c0cb284cf6da",
00:09:49.382 "strip_size_kb": 64,
00:09:49.382 "state": "configuring",
00:09:49.382 "raid_level": "raid0",
00:09:49.382 "superblock": true,
00:09:49.382 "num_base_bdevs": 2,
00:09:49.382 "num_base_bdevs_discovered": 1,
00:09:49.382 "num_base_bdevs_operational": 2,
00:09:49.382 "base_bdevs_list": [
00:09:49.382 {
00:09:49.382 "name": "BaseBdev1",
00:09:49.382 "uuid": "af9b9ddc-6af2-4676-bc03-65f4d97bc9bd",
00:09:49.382 "is_configured": true,
00:09:49.382 "data_offset": 2048,
00:09:49.383 "data_size": 63488
00:09:49.383 },
00:09:49.383 {
00:09:49.383 "name": "BaseBdev2",
00:09:49.383 "uuid": "00000000-0000-0000-0000-000000000000",
00:09:49.383 "is_configured": false,
00:09:49.383 "data_offset": 0,
00:09:49.383 "data_size": 0
00:09:49.383 }
00:09:49.383 ]
00:09:49.383 }'
00:09:49.383 15:48:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:09:49.383 15:48:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:09:49.949 15:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2
00:09:50.207 [2024-06-10 15:48:55.475939] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed
00:09:50.207 [2024-06-10 15:48:55.476096] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x122b770
00:09:50.207 [2024-06-10 15:48:55.476108] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512
00:09:50.208 [2024-06-10 15:48:55.476288] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x122ceb0
00:09:50.208 [2024-06-10 15:48:55.476404] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x122b770
00:09:50.208 [2024-06-10 15:48:55.476412] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x122b770
00:09:50.208 [2024-06-10 15:48:55.476503] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:09:50.208 BaseBdev2
00:09:50.208 15:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2
00:09:50.208 15:48:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2
00:09:50.208 15:48:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout=
00:09:50.208 15:48:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i
00:09:50.208 15:48:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]]
00:09:50.208 15:48:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000
00:09:50.208 15:48:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine
00:09:50.466 15:48:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000
00:09:50.466 [
00:09:50.466 {
00:09:50.466 "name": "BaseBdev2",
00:09:50.466 "aliases": [
00:09:50.466 "4888ae60-f018-4ac5-af88-49e8247fbb07"
00:09:50.466 ],
00:09:50.466 "product_name": "Malloc disk",
00:09:50.466 "block_size": 512,
00:09:50.466 "num_blocks": 65536,
00:09:50.466 "uuid": "4888ae60-f018-4ac5-af88-49e8247fbb07",
00:09:50.466 "assigned_rate_limits": {
00:09:50.466 "rw_ios_per_sec": 0,
00:09:50.466 "rw_mbytes_per_sec": 0,
00:09:50.466 "r_mbytes_per_sec": 0,
00:09:50.466 "w_mbytes_per_sec": 0
00:09:50.466 },
00:09:50.466 "claimed": true,
00:09:50.466 "claim_type": "exclusive_write",
00:09:50.466 "zoned": false,
00:09:50.466 "supported_io_types": {
00:09:50.466 "read": true,
00:09:50.466 "write": true,
00:09:50.466 "unmap": true,
00:09:50.466 "write_zeroes": true,
00:09:50.466 "flush": true,
00:09:50.466 "reset": true,
00:09:50.466 "compare": false,
00:09:50.466 "compare_and_write": false,
00:09:50.466 "abort": true,
00:09:50.466 "nvme_admin": false,
00:09:50.466 "nvme_io": false
00:09:50.466 },
00:09:50.466 "memory_domains": [
00:09:50.466 {
00:09:50.466 "dma_device_id": "system",
00:09:50.466 "dma_device_type": 1
00:09:50.466 },
00:09:50.466 {
00:09:50.466 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:09:50.466 "dma_device_type": 2
00:09:50.466 }
00:09:50.466 ],
00:09:50.466 "driver_specific": {}
00:09:50.466 }
00:09:50.466 ]
00:09:50.466 15:48:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0
00:09:50.466 15:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ ))
00:09:50.466 15:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs ))
00:09:50.466 15:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 2
00:09:50.466 15:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:09:50.466 15:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:09:50.466 15:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:09:50.466 15:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:09:50.466 15:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2
00:09:50.466 15:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:09:50.466 15:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:09:50.466 15:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:09:50.466 15:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:09:50.466 15:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:09:50.466 15:48:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:09:50.724 15:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:09:50.724 "name": "Existed_Raid",
00:09:50.724 "uuid": "1827518e-5c35-490e-81d6-c0cb284cf6da",
00:09:50.724 "strip_size_kb": 64,
00:09:50.724 "state": "online",
00:09:50.724 "raid_level": "raid0",
00:09:50.724 "superblock": true,
00:09:50.724 "num_base_bdevs": 2,
00:09:50.724 "num_base_bdevs_discovered": 2,
00:09:50.724 "num_base_bdevs_operational": 2,
00:09:50.724 "base_bdevs_list": [
00:09:50.724 {
00:09:50.724 "name": "BaseBdev1",
00:09:50.724 "uuid": "af9b9ddc-6af2-4676-bc03-65f4d97bc9bd",
00:09:50.724 "is_configured": true,
00:09:50.724 "data_offset": 2048,
00:09:50.724 "data_size": 63488
00:09:50.724 },
00:09:50.724 {
00:09:50.724 "name": "BaseBdev2",
00:09:50.724 "uuid": "4888ae60-f018-4ac5-af88-49e8247fbb07",
00:09:50.724 "is_configured": true,
00:09:50.724 "data_offset": 2048,
00:09:50.724 "data_size": 63488
00:09:50.724 }
00:09:50.724 ]
00:09:50.724 }'
00:09:50.724 15:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:09:50.724 15:48:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x
00:09:51.291 15:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid
00:09:51.291 15:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid
00:09:51.291 15:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info
00:09:51.291 15:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info
00:09:51.291 15:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names
00:09:51.291 15:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name
00:09:51.550 15:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid
00:09:51.550 15:48:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]'
00:09:51.550 [2024-06-10 15:48:57.036389] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:09:51.809 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{
00:09:51.809 "name": "Existed_Raid",
00:09:51.809 "aliases": [
00:09:51.809 "1827518e-5c35-490e-81d6-c0cb284cf6da"
00:09:51.809 ],
00:09:51.809 "product_name": "Raid Volume",
00:09:51.809 "block_size": 512,
00:09:51.809 "num_blocks": 126976,
00:09:51.809 "uuid": "1827518e-5c35-490e-81d6-c0cb284cf6da",
00:09:51.809 "assigned_rate_limits": {
00:09:51.809 "rw_ios_per_sec": 0,
00:09:51.809 "rw_mbytes_per_sec": 0,
00:09:51.809 "r_mbytes_per_sec": 0,
00:09:51.809 "w_mbytes_per_sec": 0
00:09:51.809 },
00:09:51.809 "claimed": false,
00:09:51.809 "zoned": false,
00:09:51.809 "supported_io_types": {
00:09:51.809 "read": true,
00:09:51.809 "write": true,
00:09:51.809 "unmap": true,
00:09:51.809 "write_zeroes": true,
00:09:51.809 "flush": true,
00:09:51.809 "reset": true,
00:09:51.809 "compare": false,
00:09:51.809 "compare_and_write": false,
00:09:51.809 "abort": false,
00:09:51.809 "nvme_admin": false,
00:09:51.809 "nvme_io": false
00:09:51.809 },
00:09:51.809 "memory_domains": [
00:09:51.809 {
00:09:51.809 "dma_device_id": "system",
00:09:51.809 "dma_device_type": 1
00:09:51.809 },
00:09:51.809 {
00:09:51.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:09:51.809 "dma_device_type": 2
00:09:51.809 },
00:09:51.809 {
00:09:51.809 "dma_device_id": "system",
00:09:51.809 "dma_device_type": 1
00:09:51.809 },
00:09:51.809 {
00:09:51.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:09:51.809 "dma_device_type": 2
00:09:51.809 }
00:09:51.809 ],
00:09:51.809 "driver_specific": {
00:09:51.809 "raid": {
00:09:51.809 "uuid": "1827518e-5c35-490e-81d6-c0cb284cf6da",
00:09:51.809 "strip_size_kb": 64,
00:09:51.809 "state": "online",
00:09:51.809 "raid_level": "raid0",
00:09:51.809 "superblock": true,
00:09:51.809 "num_base_bdevs": 2,
00:09:51.809 "num_base_bdevs_discovered": 2,
00:09:51.809 "num_base_bdevs_operational": 2,
00:09:51.809 "base_bdevs_list": [
00:09:51.809 {
00:09:51.809 "name": "BaseBdev1",
00:09:51.809 "uuid": "af9b9ddc-6af2-4676-bc03-65f4d97bc9bd",
00:09:51.809 "is_configured": true,
00:09:51.809 "data_offset": 2048,
00:09:51.809 "data_size": 63488
00:09:51.809 },
00:09:51.809 {
00:09:51.809 "name": "BaseBdev2",
00:09:51.809 "uuid": "4888ae60-f018-4ac5-af88-49e8247fbb07",
00:09:51.809 "is_configured": true,
00:09:51.809 "data_offset": 2048,
00:09:51.809 "data_size": 63488
00:09:51.809 }
00:09:51.809 ]
00:09:51.809 }
00:09:51.809 }
00:09:51.809 }'
00:09:51.809 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
00:09:51.809 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1
00:09:51.809 BaseBdev2'
00:09:51.809 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:09:51.809 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1
00:09:51.809 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:09:52.067 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:09:52.067 "name": "BaseBdev1",
00:09:52.067 "aliases": [
00:09:52.067 "af9b9ddc-6af2-4676-bc03-65f4d97bc9bd"
00:09:52.067 ],
00:09:52.067 "product_name": "Malloc disk",
00:09:52.067 "block_size": 512,
00:09:52.067 "num_blocks": 65536,
00:09:52.067 "uuid": "af9b9ddc-6af2-4676-bc03-65f4d97bc9bd",
00:09:52.067 "assigned_rate_limits": {
00:09:52.067 "rw_ios_per_sec": 0,
00:09:52.067 "rw_mbytes_per_sec": 0,
00:09:52.067 "r_mbytes_per_sec": 0,
00:09:52.067 "w_mbytes_per_sec": 0
00:09:52.067 },
00:09:52.067 "claimed": true,
00:09:52.067 "claim_type": "exclusive_write",
00:09:52.067 "zoned": false,
00:09:52.067 "supported_io_types": {
00:09:52.067 "read": true,
00:09:52.067 "write": true,
00:09:52.067 "unmap": true,
00:09:52.067 "write_zeroes": true,
00:09:52.067 "flush": true,
00:09:52.067 "reset": true,
00:09:52.067 "compare": false,
00:09:52.067 "compare_and_write": false,
00:09:52.067 "abort": true,
00:09:52.067 "nvme_admin": false,
00:09:52.067 "nvme_io": false
00:09:52.067 },
00:09:52.067 "memory_domains": [
00:09:52.067 {
00:09:52.067 "dma_device_id": "system",
00:09:52.067 "dma_device_type": 1
00:09:52.067 },
00:09:52.067 {
00:09:52.067 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:09:52.067 "dma_device_type": 2
00:09:52.067 }
00:09:52.067 ],
00:09:52.067 "driver_specific": {}
00:09:52.067 }'
00:09:52.067 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:09:52.067 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:09:52.067 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:09:52.067 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:09:52.067 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:09:52.067 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:09:52.067 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:09:52.328 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:09:52.328 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:09:52.328 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:09:52.328 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:09:52.328 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:09:52.328 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:09:52.328 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:09:52.328 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2
00:09:52.588 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:09:52.588 "name": "BaseBdev2",
00:09:52.589 "aliases": [
00:09:52.589 "4888ae60-f018-4ac5-af88-49e8247fbb07"
00:09:52.589 ],
00:09:52.589 "product_name": "Malloc disk",
00:09:52.589 "block_size": 512,
00:09:52.589 "num_blocks": 65536,
00:09:52.589 "uuid": "4888ae60-f018-4ac5-af88-49e8247fbb07",
00:09:52.589 "assigned_rate_limits": {
00:09:52.589 "rw_ios_per_sec": 0,
00:09:52.589 "rw_mbytes_per_sec": 0,
00:09:52.589 "r_mbytes_per_sec": 0,
00:09:52.589 "w_mbytes_per_sec": 0
00:09:52.589 },
00:09:52.589 "claimed": true,
00:09:52.589 "claim_type": "exclusive_write",
00:09:52.589 "zoned": false,
00:09:52.589 "supported_io_types": {
00:09:52.589 "read": true,
00:09:52.589 "write": true,
00:09:52.589 "unmap": true,
00:09:52.589 "write_zeroes": true,
00:09:52.589 "flush": true,
00:09:52.589 "reset": true,
00:09:52.589 "compare": false,
00:09:52.589 "compare_and_write": false,
00:09:52.589 "abort": true,
00:09:52.589 "nvme_admin": false,
00:09:52.589 "nvme_io": false
00:09:52.589 },
00:09:52.589 "memory_domains": [
00:09:52.589 {
00:09:52.589 "dma_device_id": "system",
00:09:52.589 "dma_device_type": 1
00:09:52.589 },
00:09:52.589 {
00:09:52.589 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:09:52.589 "dma_device_type": 2
00:09:52.589 }
00:09:52.589 ],
00:09:52.589 "driver_specific": {}
00:09:52.589 }'
00:09:52.589 15:48:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:09:52.589 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:09:52.589 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:09:52.589 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:09:52.848 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:09:52.848 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:09:52.848 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:09:52.848 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:09:52.848 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:09:52.848 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:09:52.848 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:09:53.107 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:09:53.107 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1
00:09:53.107 [2024-06-10 15:48:58.600390] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1
00:09:53.107 [2024-06-10 15:48:58.600414] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:09:53.107 [2024-06-10 15:48:58.600454] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:09:53.367 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state
00:09:53.367 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0
00:09:53.367 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in
00:09:53.367 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1
00:09:53.367 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline
00:09:53.367 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 1
00:09:53.367 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid
00:09:53.367 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline
00:09:53.367 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:09:53.367 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:09:53.367 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:09:53.367 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:09:53.367 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:09:53.367 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:09:53.367 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp
00:09:53.367 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:09:53.367 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")'
00:09:53.625 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:09:53.625 "name": "Existed_Raid",
00:09:53.625 "uuid": "1827518e-5c35-490e-81d6-c0cb284cf6da",
00:09:53.625 "strip_size_kb": 64,
00:09:53.625 "state": "offline",
00:09:53.625 "raid_level": "raid0",
00:09:53.625 "superblock": true,
00:09:53.625 "num_base_bdevs": 2,
00:09:53.625 "num_base_bdevs_discovered": 1,
00:09:53.625 "num_base_bdevs_operational": 1,
00:09:53.625 "base_bdevs_list": [
00:09:53.625 {
00:09:53.625 "name": null,
00:09:53.625 "uuid": "00000000-0000-0000-0000-000000000000",
00:09:53.626 "is_configured": false,
00:09:53.626 "data_offset": 2048,
00:09:53.626 "data_size": 63488
00:09:53.626 },
00:09:53.626 {
00:09:53.626 "name": "BaseBdev2",
00:09:53.626 "uuid": "4888ae60-f018-4ac5-af88-49e8247fbb07",
00:09:53.626 "is_configured": true,
00:09:53.626 "data_offset": 2048,
00:09:53.626 "data_size": 63488
00:09:53.626 }
00:09:53.626
] 00:09:53.626 }' 00:09:53.626 15:48:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:53.626 15:48:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:54.192 15:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:09:54.192 15:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:54.192 15:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:54.192 15:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:09:54.450 15:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:09:54.450 15:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:09:54.450 15:48:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:09:54.709 [2024-06-10 15:48:59.985434] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:09:54.709 [2024-06-10 15:48:59.985482] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x122b770 name Existed_Raid, state offline 00:09:54.709 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:09:54.709 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:09:54.709 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:54.709 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | 
select(.)' 00:09:54.968 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:09:54.968 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:09:54.968 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:09:54.968 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2632632 00:09:54.968 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 2632632 ']' 00:09:54.968 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 2632632 00:09:54.968 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:09:54.968 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:09:54.968 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2632632 00:09:54.968 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:09:54.968 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:09:54.968 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2632632' 00:09:54.968 killing process with pid 2632632 00:09:54.968 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 2632632 00:09:54.968 [2024-06-10 15:49:00.316641] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:09:54.968 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 2632632 00:09:54.968 [2024-06-10 15:49:00.317521] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:09:55.226 15:49:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:09:55.226 
00:09:55.226 real 0m10.442s 00:09:55.226 user 0m18.996s 00:09:55.226 sys 0m1.511s 00:09:55.226 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:09:55.226 15:49:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:09:55.226 ************************************ 00:09:55.226 END TEST raid_state_function_test_sb 00:09:55.226 ************************************ 00:09:55.226 15:49:00 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 2 00:09:55.226 15:49:00 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:09:55.226 15:49:00 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:09:55.226 15:49:00 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:09:55.226 ************************************ 00:09:55.226 START TEST raid_superblock_test 00:09:55.226 ************************************ 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid0 2 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2634502 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2634502 /var/tmp/spdk-raid.sock 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 2634502 ']' 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:09:55.226 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:09:55.226 15:49:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:55.226 [2024-06-10 15:49:00.640521] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:09:55.226 [2024-06-10 15:49:00.640576] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2634502 ] 00:09:55.485 [2024-06-10 15:49:00.740527] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:09:55.485 [2024-06-10 15:49:00.835705] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:09:55.485 [2024-06-10 15:49:00.900258] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:55.485 [2024-06-10 15:49:00.900290] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:09:56.421 15:49:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:09:56.421 15:49:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:09:56.421 15:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:09:56.421 15:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:56.421 15:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:09:56.421 15:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:09:56.421 15:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:09:56.421 15:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:09:56.421 15:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # 
base_bdevs_pt+=($bdev_pt) 00:09:56.421 15:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:09:56.421 15:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:09:56.421 malloc1 00:09:56.421 15:49:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:09:56.679 [2024-06-10 15:49:02.077933] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:09:56.679 [2024-06-10 15:49:02.077982] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:56.679 [2024-06-10 15:49:02.077999] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25e10f0 00:09:56.679 [2024-06-10 15:49:02.078009] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:56.679 [2024-06-10 15:49:02.079714] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:56.679 [2024-06-10 15:49:02.079741] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:09:56.679 pt1 00:09:56.679 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:09:56.679 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:56.679 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:09:56.679 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:09:56.679 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:09:56.679 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:09:56.679 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:09:56.679 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:09:56.679 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:09:56.937 malloc2 00:09:56.937 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:09:57.194 [2024-06-10 15:49:02.596138] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:09:57.194 [2024-06-10 15:49:02.596180] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:09:57.194 [2024-06-10 15:49:02.596195] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25e2400 00:09:57.194 [2024-06-10 15:49:02.596205] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:09:57.194 [2024-06-10 15:49:02.597787] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:09:57.194 [2024-06-10 15:49:02.597813] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:09:57.194 pt2 00:09:57.194 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:09:57.194 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:09:57.194 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2' -n raid_bdev1 -s 00:09:57.452 [2024-06-10 15:49:02.848826] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:09:57.452 [2024-06-10 15:49:02.850161] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:09:57.452 [2024-06-10 15:49:02.850312] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x278de60 00:09:57.452 [2024-06-10 15:49:02.850325] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:09:57.452 [2024-06-10 15:49:02.850527] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x278ea20 00:09:57.452 [2024-06-10 15:49:02.850672] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x278de60 00:09:57.452 [2024-06-10 15:49:02.850681] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x278de60 00:09:57.452 [2024-06-10 15:49:02.850785] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:09:57.452 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:09:57.452 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:09:57.452 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:09:57.452 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:09:57.452 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:09:57.452 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:09:57.452 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:09:57.452 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:09:57.452 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:09:57.452 15:49:02 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:09:57.452 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:09:57.452 15:49:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:09:57.711 15:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:09:57.711 "name": "raid_bdev1", 00:09:57.711 "uuid": "90aae4b2-f954-4dad-bc1a-a14cb8ef7943", 00:09:57.711 "strip_size_kb": 64, 00:09:57.711 "state": "online", 00:09:57.711 "raid_level": "raid0", 00:09:57.711 "superblock": true, 00:09:57.711 "num_base_bdevs": 2, 00:09:57.711 "num_base_bdevs_discovered": 2, 00:09:57.711 "num_base_bdevs_operational": 2, 00:09:57.711 "base_bdevs_list": [ 00:09:57.711 { 00:09:57.711 "name": "pt1", 00:09:57.711 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:57.711 "is_configured": true, 00:09:57.711 "data_offset": 2048, 00:09:57.711 "data_size": 63488 00:09:57.711 }, 00:09:57.711 { 00:09:57.711 "name": "pt2", 00:09:57.711 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:57.711 "is_configured": true, 00:09:57.711 "data_offset": 2048, 00:09:57.711 "data_size": 63488 00:09:57.711 } 00:09:57.711 ] 00:09:57.711 }' 00:09:57.711 15:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:09:57.711 15:49:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:09:58.277 15:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:09:58.277 15:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:09:58.277 15:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:09:58.277 15:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 
00:09:58.277 15:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:09:58.277 15:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:09:58.277 15:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:09:58.277 15:49:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:58.536 [2024-06-10 15:49:03.984070] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:09:58.536 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:09:58.536 "name": "raid_bdev1", 00:09:58.536 "aliases": [ 00:09:58.536 "90aae4b2-f954-4dad-bc1a-a14cb8ef7943" 00:09:58.536 ], 00:09:58.536 "product_name": "Raid Volume", 00:09:58.536 "block_size": 512, 00:09:58.536 "num_blocks": 126976, 00:09:58.536 "uuid": "90aae4b2-f954-4dad-bc1a-a14cb8ef7943", 00:09:58.536 "assigned_rate_limits": { 00:09:58.536 "rw_ios_per_sec": 0, 00:09:58.536 "rw_mbytes_per_sec": 0, 00:09:58.536 "r_mbytes_per_sec": 0, 00:09:58.536 "w_mbytes_per_sec": 0 00:09:58.536 }, 00:09:58.536 "claimed": false, 00:09:58.536 "zoned": false, 00:09:58.536 "supported_io_types": { 00:09:58.536 "read": true, 00:09:58.536 "write": true, 00:09:58.536 "unmap": true, 00:09:58.536 "write_zeroes": true, 00:09:58.536 "flush": true, 00:09:58.536 "reset": true, 00:09:58.536 "compare": false, 00:09:58.536 "compare_and_write": false, 00:09:58.536 "abort": false, 00:09:58.536 "nvme_admin": false, 00:09:58.536 "nvme_io": false 00:09:58.536 }, 00:09:58.536 "memory_domains": [ 00:09:58.536 { 00:09:58.536 "dma_device_id": "system", 00:09:58.536 "dma_device_type": 1 00:09:58.536 }, 00:09:58.536 { 00:09:58.536 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:58.536 "dma_device_type": 2 00:09:58.536 }, 00:09:58.536 { 00:09:58.536 "dma_device_id": "system", 00:09:58.536 
"dma_device_type": 1 00:09:58.536 }, 00:09:58.536 { 00:09:58.536 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:58.536 "dma_device_type": 2 00:09:58.536 } 00:09:58.536 ], 00:09:58.536 "driver_specific": { 00:09:58.536 "raid": { 00:09:58.536 "uuid": "90aae4b2-f954-4dad-bc1a-a14cb8ef7943", 00:09:58.536 "strip_size_kb": 64, 00:09:58.536 "state": "online", 00:09:58.536 "raid_level": "raid0", 00:09:58.536 "superblock": true, 00:09:58.536 "num_base_bdevs": 2, 00:09:58.536 "num_base_bdevs_discovered": 2, 00:09:58.536 "num_base_bdevs_operational": 2, 00:09:58.536 "base_bdevs_list": [ 00:09:58.536 { 00:09:58.536 "name": "pt1", 00:09:58.536 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:58.536 "is_configured": true, 00:09:58.536 "data_offset": 2048, 00:09:58.536 "data_size": 63488 00:09:58.536 }, 00:09:58.536 { 00:09:58.536 "name": "pt2", 00:09:58.536 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:58.536 "is_configured": true, 00:09:58.536 "data_offset": 2048, 00:09:58.536 "data_size": 63488 00:09:58.536 } 00:09:58.536 ] 00:09:58.536 } 00:09:58.536 } 00:09:58.536 }' 00:09:58.536 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:09:58.795 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:09:58.795 pt2' 00:09:58.795 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:58.795 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:09:58.795 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:59.053 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:59.053 "name": "pt1", 00:09:59.053 "aliases": [ 00:09:59.053 "00000000-0000-0000-0000-000000000001" 00:09:59.053 
], 00:09:59.053 "product_name": "passthru", 00:09:59.053 "block_size": 512, 00:09:59.053 "num_blocks": 65536, 00:09:59.053 "uuid": "00000000-0000-0000-0000-000000000001", 00:09:59.053 "assigned_rate_limits": { 00:09:59.053 "rw_ios_per_sec": 0, 00:09:59.053 "rw_mbytes_per_sec": 0, 00:09:59.053 "r_mbytes_per_sec": 0, 00:09:59.053 "w_mbytes_per_sec": 0 00:09:59.053 }, 00:09:59.053 "claimed": true, 00:09:59.053 "claim_type": "exclusive_write", 00:09:59.053 "zoned": false, 00:09:59.053 "supported_io_types": { 00:09:59.053 "read": true, 00:09:59.053 "write": true, 00:09:59.053 "unmap": true, 00:09:59.053 "write_zeroes": true, 00:09:59.053 "flush": true, 00:09:59.053 "reset": true, 00:09:59.053 "compare": false, 00:09:59.053 "compare_and_write": false, 00:09:59.053 "abort": true, 00:09:59.053 "nvme_admin": false, 00:09:59.053 "nvme_io": false 00:09:59.053 }, 00:09:59.053 "memory_domains": [ 00:09:59.053 { 00:09:59.053 "dma_device_id": "system", 00:09:59.053 "dma_device_type": 1 00:09:59.053 }, 00:09:59.053 { 00:09:59.053 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:59.053 "dma_device_type": 2 00:09:59.053 } 00:09:59.054 ], 00:09:59.054 "driver_specific": { 00:09:59.054 "passthru": { 00:09:59.054 "name": "pt1", 00:09:59.054 "base_bdev_name": "malloc1" 00:09:59.054 } 00:09:59.054 } 00:09:59.054 }' 00:09:59.054 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:59.054 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:59.054 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:59.054 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:59.054 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:59.054 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:59.054 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:09:59.054 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:59.312 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:59.312 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:59.312 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:59.313 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:59.313 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:09:59.313 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:09:59.313 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:09:59.571 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:09:59.571 "name": "pt2", 00:09:59.571 "aliases": [ 00:09:59.571 "00000000-0000-0000-0000-000000000002" 00:09:59.571 ], 00:09:59.571 "product_name": "passthru", 00:09:59.571 "block_size": 512, 00:09:59.571 "num_blocks": 65536, 00:09:59.571 "uuid": "00000000-0000-0000-0000-000000000002", 00:09:59.571 "assigned_rate_limits": { 00:09:59.571 "rw_ios_per_sec": 0, 00:09:59.571 "rw_mbytes_per_sec": 0, 00:09:59.571 "r_mbytes_per_sec": 0, 00:09:59.571 "w_mbytes_per_sec": 0 00:09:59.571 }, 00:09:59.571 "claimed": true, 00:09:59.571 "claim_type": "exclusive_write", 00:09:59.571 "zoned": false, 00:09:59.571 "supported_io_types": { 00:09:59.571 "read": true, 00:09:59.571 "write": true, 00:09:59.571 "unmap": true, 00:09:59.571 "write_zeroes": true, 00:09:59.571 "flush": true, 00:09:59.571 "reset": true, 00:09:59.571 "compare": false, 00:09:59.571 "compare_and_write": false, 00:09:59.571 "abort": true, 00:09:59.571 "nvme_admin": false, 00:09:59.571 "nvme_io": false 00:09:59.571 }, 
00:09:59.571 "memory_domains": [ 00:09:59.571 { 00:09:59.571 "dma_device_id": "system", 00:09:59.571 "dma_device_type": 1 00:09:59.571 }, 00:09:59.571 { 00:09:59.571 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:09:59.571 "dma_device_type": 2 00:09:59.571 } 00:09:59.571 ], 00:09:59.571 "driver_specific": { 00:09:59.571 "passthru": { 00:09:59.571 "name": "pt2", 00:09:59.571 "base_bdev_name": "malloc2" 00:09:59.571 } 00:09:59.571 } 00:09:59.571 }' 00:09:59.571 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:59.571 15:49:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:09:59.571 15:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:09:59.571 15:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:59.571 15:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:09:59.832 15:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:09:59.832 15:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:59.832 15:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:09:59.832 15:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:09:59.832 15:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:59.832 15:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:09:59.832 15:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:09:59.832 15:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:09:59.832 15:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:10:00.151 [2024-06-10 15:49:05.508127] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:00.151 15:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=90aae4b2-f954-4dad-bc1a-a14cb8ef7943 00:10:00.151 15:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 90aae4b2-f954-4dad-bc1a-a14cb8ef7943 ']' 00:10:00.151 15:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:00.409 [2024-06-10 15:49:05.760565] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:00.409 [2024-06-10 15:49:05.760587] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:00.409 [2024-06-10 15:49:05.760639] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:00.409 [2024-06-10 15:49:05.760681] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:00.409 [2024-06-10 15:49:05.760690] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x278de60 name raid_bdev1, state offline 00:10:00.409 15:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:00.409 15:49:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:10:00.668 15:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:10:00.668 15:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:10:00.668 15:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:00.668 15:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:10:00.926 15:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:00.926 15:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:01.185 15:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:01.185 15:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:10:01.185 15:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:10:01.185 15:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:01.185 15:49:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:10:01.185 15:49:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:01.185 15:49:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:01.444 15:49:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:10:01.444 15:49:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:01.444 15:49:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:10:01.444 15:49:06 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:01.444 15:49:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:10:01.444 15:49:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:01.444 15:49:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:01.444 15:49:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2' -n raid_bdev1 00:10:01.444 [2024-06-10 15:49:06.935644] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:01.444 [2024-06-10 15:49:06.937066] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:01.444 [2024-06-10 15:49:06.937121] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:10:01.444 [2024-06-10 15:49:06.937157] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:01.444 [2024-06-10 15:49:06.937173] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:01.444 [2024-06-10 15:49:06.937180] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x278b520 name raid_bdev1, state configuring 00:10:01.444 request: 00:10:01.444 { 00:10:01.444 "name": "raid_bdev1", 00:10:01.444 "raid_level": "raid0", 00:10:01.444 "base_bdevs": [ 00:10:01.444 "malloc1", 00:10:01.444 "malloc2" 00:10:01.444 ], 00:10:01.444 "superblock": false, 00:10:01.444 "strip_size_kb": 64, 00:10:01.444 "method": "bdev_raid_create", 00:10:01.444 "req_id": 1 00:10:01.444 } 
00:10:01.444 Got JSON-RPC error response 00:10:01.444 response: 00:10:01.444 { 00:10:01.444 "code": -17, 00:10:01.444 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:01.444 } 00:10:01.703 15:49:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:10:01.703 15:49:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:10:01.703 15:49:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:10:01.703 15:49:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:10:01.703 15:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:01.703 15:49:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:10:01.961 15:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:10:01.961 15:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:10:01.961 15:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:01.961 [2024-06-10 15:49:07.452968] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:01.961 [2024-06-10 15:49:07.452998] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:01.961 [2024-06-10 15:49:07.453012] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x278e9c0 00:10:01.961 [2024-06-10 15:49:07.453026] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:01.962 [2024-06-10 15:49:07.454530] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:01.962 [2024-06-10 15:49:07.454554] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:01.962 [2024-06-10 15:49:07.454608] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:01.962 [2024-06-10 15:49:07.454630] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:01.962 pt1 00:10:02.220 15:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 2 00:10:02.220 15:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:02.220 15:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:02.220 15:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:02.220 15:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:02.220 15:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:02.220 15:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:02.220 15:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:02.220 15:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:02.220 15:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:02.220 15:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:02.220 15:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:02.479 15:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:02.479 "name": "raid_bdev1", 00:10:02.479 "uuid": "90aae4b2-f954-4dad-bc1a-a14cb8ef7943", 00:10:02.479 "strip_size_kb": 64, 
00:10:02.479 "state": "configuring", 00:10:02.479 "raid_level": "raid0", 00:10:02.479 "superblock": true, 00:10:02.479 "num_base_bdevs": 2, 00:10:02.479 "num_base_bdevs_discovered": 1, 00:10:02.479 "num_base_bdevs_operational": 2, 00:10:02.479 "base_bdevs_list": [ 00:10:02.479 { 00:10:02.479 "name": "pt1", 00:10:02.479 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:02.479 "is_configured": true, 00:10:02.479 "data_offset": 2048, 00:10:02.479 "data_size": 63488 00:10:02.479 }, 00:10:02.479 { 00:10:02.479 "name": null, 00:10:02.479 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:02.479 "is_configured": false, 00:10:02.479 "data_offset": 2048, 00:10:02.479 "data_size": 63488 00:10:02.479 } 00:10:02.479 ] 00:10:02.479 }' 00:10:02.479 15:49:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:02.479 15:49:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:03.047 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:10:03.047 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:10:03.047 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:03.047 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:03.306 [2024-06-10 15:49:08.579983] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:03.306 [2024-06-10 15:49:08.580027] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:03.306 [2024-06-10 15:49:08.580042] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x278a3a0 00:10:03.306 [2024-06-10 15:49:08.580052] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:03.306 [2024-06-10 
15:49:08.580395] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:03.306 [2024-06-10 15:49:08.580410] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:03.306 [2024-06-10 15:49:08.580467] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:03.306 [2024-06-10 15:49:08.580484] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:03.306 [2024-06-10 15:49:08.580588] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2792110 00:10:03.306 [2024-06-10 15:49:08.580597] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:03.306 [2024-06-10 15:49:08.580776] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27902e0 00:10:03.306 [2024-06-10 15:49:08.580899] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2792110 00:10:03.306 [2024-06-10 15:49:08.580907] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2792110 00:10:03.306 [2024-06-10 15:49:08.581017] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:03.306 pt2 00:10:03.306 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:10:03.306 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:03.306 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:03.306 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:03.306 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:03.306 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:03.306 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:10:03.306 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:03.306 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:03.306 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:03.306 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:03.306 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:03.306 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:03.306 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:03.565 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:03.565 "name": "raid_bdev1", 00:10:03.565 "uuid": "90aae4b2-f954-4dad-bc1a-a14cb8ef7943", 00:10:03.565 "strip_size_kb": 64, 00:10:03.565 "state": "online", 00:10:03.565 "raid_level": "raid0", 00:10:03.565 "superblock": true, 00:10:03.565 "num_base_bdevs": 2, 00:10:03.565 "num_base_bdevs_discovered": 2, 00:10:03.565 "num_base_bdevs_operational": 2, 00:10:03.565 "base_bdevs_list": [ 00:10:03.565 { 00:10:03.565 "name": "pt1", 00:10:03.565 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:03.565 "is_configured": true, 00:10:03.565 "data_offset": 2048, 00:10:03.565 "data_size": 63488 00:10:03.565 }, 00:10:03.565 { 00:10:03.565 "name": "pt2", 00:10:03.565 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:03.565 "is_configured": true, 00:10:03.565 "data_offset": 2048, 00:10:03.565 "data_size": 63488 00:10:03.565 } 00:10:03.565 ] 00:10:03.565 }' 00:10:03.565 15:49:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:03.565 15:49:08 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@10 -- # set +x 00:10:04.132 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:10:04.132 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:04.132 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:04.132 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:04.132 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:04.132 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:04.132 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:04.132 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:04.391 [2024-06-10 15:49:09.711244] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:04.391 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:04.391 "name": "raid_bdev1", 00:10:04.391 "aliases": [ 00:10:04.391 "90aae4b2-f954-4dad-bc1a-a14cb8ef7943" 00:10:04.391 ], 00:10:04.391 "product_name": "Raid Volume", 00:10:04.391 "block_size": 512, 00:10:04.391 "num_blocks": 126976, 00:10:04.391 "uuid": "90aae4b2-f954-4dad-bc1a-a14cb8ef7943", 00:10:04.391 "assigned_rate_limits": { 00:10:04.391 "rw_ios_per_sec": 0, 00:10:04.391 "rw_mbytes_per_sec": 0, 00:10:04.391 "r_mbytes_per_sec": 0, 00:10:04.391 "w_mbytes_per_sec": 0 00:10:04.391 }, 00:10:04.391 "claimed": false, 00:10:04.391 "zoned": false, 00:10:04.391 "supported_io_types": { 00:10:04.391 "read": true, 00:10:04.391 "write": true, 00:10:04.391 "unmap": true, 00:10:04.391 "write_zeroes": true, 00:10:04.391 "flush": true, 00:10:04.391 "reset": true, 00:10:04.391 "compare": 
false, 00:10:04.391 "compare_and_write": false, 00:10:04.391 "abort": false, 00:10:04.391 "nvme_admin": false, 00:10:04.391 "nvme_io": false 00:10:04.391 }, 00:10:04.391 "memory_domains": [ 00:10:04.391 { 00:10:04.391 "dma_device_id": "system", 00:10:04.391 "dma_device_type": 1 00:10:04.391 }, 00:10:04.391 { 00:10:04.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:04.391 "dma_device_type": 2 00:10:04.391 }, 00:10:04.391 { 00:10:04.391 "dma_device_id": "system", 00:10:04.391 "dma_device_type": 1 00:10:04.391 }, 00:10:04.391 { 00:10:04.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:04.391 "dma_device_type": 2 00:10:04.391 } 00:10:04.391 ], 00:10:04.391 "driver_specific": { 00:10:04.391 "raid": { 00:10:04.391 "uuid": "90aae4b2-f954-4dad-bc1a-a14cb8ef7943", 00:10:04.391 "strip_size_kb": 64, 00:10:04.391 "state": "online", 00:10:04.391 "raid_level": "raid0", 00:10:04.391 "superblock": true, 00:10:04.391 "num_base_bdevs": 2, 00:10:04.391 "num_base_bdevs_discovered": 2, 00:10:04.391 "num_base_bdevs_operational": 2, 00:10:04.391 "base_bdevs_list": [ 00:10:04.391 { 00:10:04.391 "name": "pt1", 00:10:04.391 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:04.391 "is_configured": true, 00:10:04.391 "data_offset": 2048, 00:10:04.391 "data_size": 63488 00:10:04.391 }, 00:10:04.391 { 00:10:04.391 "name": "pt2", 00:10:04.391 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:04.391 "is_configured": true, 00:10:04.391 "data_offset": 2048, 00:10:04.391 "data_size": 63488 00:10:04.391 } 00:10:04.391 ] 00:10:04.391 } 00:10:04.391 } 00:10:04.391 }' 00:10:04.391 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:04.391 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:04.391 pt2' 00:10:04.391 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:04.391 15:49:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:04.391 15:49:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:04.650 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:04.650 "name": "pt1", 00:10:04.650 "aliases": [ 00:10:04.650 "00000000-0000-0000-0000-000000000001" 00:10:04.650 ], 00:10:04.650 "product_name": "passthru", 00:10:04.650 "block_size": 512, 00:10:04.650 "num_blocks": 65536, 00:10:04.650 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:04.650 "assigned_rate_limits": { 00:10:04.650 "rw_ios_per_sec": 0, 00:10:04.650 "rw_mbytes_per_sec": 0, 00:10:04.650 "r_mbytes_per_sec": 0, 00:10:04.650 "w_mbytes_per_sec": 0 00:10:04.650 }, 00:10:04.650 "claimed": true, 00:10:04.650 "claim_type": "exclusive_write", 00:10:04.650 "zoned": false, 00:10:04.650 "supported_io_types": { 00:10:04.650 "read": true, 00:10:04.650 "write": true, 00:10:04.650 "unmap": true, 00:10:04.650 "write_zeroes": true, 00:10:04.650 "flush": true, 00:10:04.650 "reset": true, 00:10:04.650 "compare": false, 00:10:04.650 "compare_and_write": false, 00:10:04.650 "abort": true, 00:10:04.650 "nvme_admin": false, 00:10:04.650 "nvme_io": false 00:10:04.650 }, 00:10:04.650 "memory_domains": [ 00:10:04.650 { 00:10:04.650 "dma_device_id": "system", 00:10:04.650 "dma_device_type": 1 00:10:04.650 }, 00:10:04.650 { 00:10:04.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:04.650 "dma_device_type": 2 00:10:04.650 } 00:10:04.650 ], 00:10:04.650 "driver_specific": { 00:10:04.650 "passthru": { 00:10:04.650 "name": "pt1", 00:10:04.650 "base_bdev_name": "malloc1" 00:10:04.650 } 00:10:04.650 } 00:10:04.650 }' 00:10:04.650 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:04.650 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:10:04.650 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:04.650 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:04.909 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:04.909 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:04.909 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:04.909 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:04.909 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:04.909 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:04.909 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:04.909 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:04.909 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:04.909 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:04.909 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:05.167 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:05.167 "name": "pt2", 00:10:05.167 "aliases": [ 00:10:05.167 "00000000-0000-0000-0000-000000000002" 00:10:05.167 ], 00:10:05.167 "product_name": "passthru", 00:10:05.167 "block_size": 512, 00:10:05.167 "num_blocks": 65536, 00:10:05.167 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:05.167 "assigned_rate_limits": { 00:10:05.167 "rw_ios_per_sec": 0, 00:10:05.167 "rw_mbytes_per_sec": 0, 00:10:05.167 "r_mbytes_per_sec": 0, 00:10:05.167 "w_mbytes_per_sec": 0 00:10:05.167 }, 00:10:05.167 
"claimed": true, 00:10:05.167 "claim_type": "exclusive_write", 00:10:05.167 "zoned": false, 00:10:05.167 "supported_io_types": { 00:10:05.167 "read": true, 00:10:05.167 "write": true, 00:10:05.167 "unmap": true, 00:10:05.167 "write_zeroes": true, 00:10:05.167 "flush": true, 00:10:05.167 "reset": true, 00:10:05.167 "compare": false, 00:10:05.167 "compare_and_write": false, 00:10:05.167 "abort": true, 00:10:05.167 "nvme_admin": false, 00:10:05.167 "nvme_io": false 00:10:05.167 }, 00:10:05.167 "memory_domains": [ 00:10:05.167 { 00:10:05.167 "dma_device_id": "system", 00:10:05.167 "dma_device_type": 1 00:10:05.167 }, 00:10:05.167 { 00:10:05.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:05.167 "dma_device_type": 2 00:10:05.167 } 00:10:05.167 ], 00:10:05.167 "driver_specific": { 00:10:05.167 "passthru": { 00:10:05.167 "name": "pt2", 00:10:05.167 "base_bdev_name": "malloc2" 00:10:05.167 } 00:10:05.167 } 00:10:05.167 }' 00:10:05.167 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:05.426 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:05.426 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:05.426 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:05.426 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:05.426 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:05.426 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:05.426 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:05.426 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:05.426 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:05.684 15:49:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 
-- # jq .dif_type 00:10:05.684 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:05.684 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:05.685 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:10:05.943 [2024-06-10 15:49:11.243364] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:05.943 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 90aae4b2-f954-4dad-bc1a-a14cb8ef7943 '!=' 90aae4b2-f954-4dad-bc1a-a14cb8ef7943 ']' 00:10:05.943 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:10:05.943 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:05.943 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:05.943 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2634502 00:10:05.943 15:49:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 2634502 ']' 00:10:05.943 15:49:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 2634502 00:10:05.943 15:49:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:10:05.943 15:49:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:05.943 15:49:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2634502 00:10:05.943 15:49:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:05.943 15:49:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:05.943 15:49:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing 
process with pid 2634502' 00:10:05.943 killing process with pid 2634502 00:10:05.943 15:49:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 2634502 00:10:05.943 [2024-06-10 15:49:11.301510] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:05.943 [2024-06-10 15:49:11.301562] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:05.943 15:49:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 2634502 00:10:05.943 [2024-06-10 15:49:11.301605] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:05.943 [2024-06-10 15:49:11.301614] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2792110 name raid_bdev1, state offline 00:10:05.943 [2024-06-10 15:49:11.317977] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:06.203 15:49:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:10:06.203 00:10:06.203 real 0m10.930s 00:10:06.203 user 0m20.000s 00:10:06.203 sys 0m1.554s 00:10:06.203 15:49:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:06.203 15:49:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:06.203 ************************************ 00:10:06.203 END TEST raid_superblock_test 00:10:06.203 ************************************ 00:10:06.203 15:49:11 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 2 read 00:10:06.203 15:49:11 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:10:06.203 15:49:11 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:06.203 15:49:11 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:06.203 ************************************ 00:10:06.203 START TEST raid_read_error_test 00:10:06.203 ************************************ 00:10:06.203 15:49:11 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 2 read 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:06.203 15:49:11 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.nfgay8yOnr 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2636540 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2636540 /var/tmp/spdk-raid.sock 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 2636540 ']' 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:06.203 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:06.203 15:49:11 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:06.203 [2024-06-10 15:49:11.647042] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:10:06.203 [2024-06-10 15:49:11.647095] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2636540 ] 00:10:06.461 [2024-06-10 15:49:11.744258] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:06.461 [2024-06-10 15:49:11.839494] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:10:06.461 [2024-06-10 15:49:11.901218] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:06.461 [2024-06-10 15:49:11.901251] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:07.027 15:49:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:07.027 15:49:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:10:07.027 15:49:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:07.027 15:49:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:07.287 BaseBdev1_malloc 00:10:07.287 15:49:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:07.546 true 00:10:07.546 15:49:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:07.546 [2024-06-10 15:49:13.055531] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:07.546 [2024-06-10 15:49:13.055569] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:07.546 
[2024-06-10 15:49:13.055586] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xaff150 00:10:07.546 [2024-06-10 15:49:13.055595] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:07.804 [2024-06-10 15:49:13.057408] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:07.804 [2024-06-10 15:49:13.057435] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:07.804 BaseBdev1 00:10:07.804 15:49:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:07.804 15:49:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:07.804 BaseBdev2_malloc 00:10:08.062 15:49:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:08.062 true 00:10:08.062 15:49:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:08.320 [2024-06-10 15:49:13.721930] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:08.320 [2024-06-10 15:49:13.721975] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:08.320 [2024-06-10 15:49:13.721992] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb03b50 00:10:08.320 [2024-06-10 15:49:13.722001] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:08.320 [2024-06-10 15:49:13.723590] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:08.320 [2024-06-10 15:49:13.723614] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:08.320 BaseBdev2 00:10:08.320 15:49:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:08.579 [2024-06-10 15:49:13.882369] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:08.579 [2024-06-10 15:49:13.883722] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:08.579 [2024-06-10 15:49:13.883909] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb04e30 00:10:08.579 [2024-06-10 15:49:13.883921] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:08.579 [2024-06-10 15:49:13.884123] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb05110 00:10:08.579 [2024-06-10 15:49:13.884274] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb04e30 00:10:08.579 [2024-06-10 15:49:13.884283] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb04e30 00:10:08.579 [2024-06-10 15:49:13.884389] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:08.579 15:49:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:08.579 15:49:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:08.579 15:49:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:08.579 15:49:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:08.579 15:49:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:08.579 15:49:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # 
local num_base_bdevs_operational=2 00:10:08.579 15:49:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:08.579 15:49:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:08.579 15:49:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:08.579 15:49:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:08.579 15:49:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:08.579 15:49:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:08.838 15:49:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:08.838 "name": "raid_bdev1", 00:10:08.838 "uuid": "16119de1-03d2-47fb-947a-2e0e70e19bd9", 00:10:08.838 "strip_size_kb": 64, 00:10:08.838 "state": "online", 00:10:08.838 "raid_level": "raid0", 00:10:08.838 "superblock": true, 00:10:08.838 "num_base_bdevs": 2, 00:10:08.838 "num_base_bdevs_discovered": 2, 00:10:08.838 "num_base_bdevs_operational": 2, 00:10:08.838 "base_bdevs_list": [ 00:10:08.838 { 00:10:08.838 "name": "BaseBdev1", 00:10:08.838 "uuid": "698f9332-db1d-521a-ac5c-01850cee8ef8", 00:10:08.838 "is_configured": true, 00:10:08.838 "data_offset": 2048, 00:10:08.838 "data_size": 63488 00:10:08.838 }, 00:10:08.838 { 00:10:08.838 "name": "BaseBdev2", 00:10:08.838 "uuid": "93aaa51d-49ee-5ec8-9614-4644454531b2", 00:10:08.838 "is_configured": true, 00:10:08.838 "data_offset": 2048, 00:10:08.838 "data_size": 63488 00:10:08.838 } 00:10:08.838 ] 00:10:08.838 }' 00:10:08.838 15:49:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:08.838 15:49:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:09.406 15:49:14 bdev_raid.raid_read_error_test 
-- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:09.406 15:49:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:09.406 [2024-06-10 15:49:14.881342] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x954c20 00:10:10.344 15:49:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:10:10.602 15:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:10.602 15:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:10:10.603 15:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:10.603 15:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:10.603 15:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:10.603 15:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:10.603 15:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:10.603 15:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:10.603 15:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:10.603 15:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:10.603 15:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:10.603 15:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:10.603 15:49:16 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:10:10.603 15:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:10.603 15:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:10.861 15:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:10.861 "name": "raid_bdev1", 00:10:10.861 "uuid": "16119de1-03d2-47fb-947a-2e0e70e19bd9", 00:10:10.861 "strip_size_kb": 64, 00:10:10.861 "state": "online", 00:10:10.861 "raid_level": "raid0", 00:10:10.861 "superblock": true, 00:10:10.861 "num_base_bdevs": 2, 00:10:10.861 "num_base_bdevs_discovered": 2, 00:10:10.861 "num_base_bdevs_operational": 2, 00:10:10.861 "base_bdevs_list": [ 00:10:10.861 { 00:10:10.861 "name": "BaseBdev1", 00:10:10.861 "uuid": "698f9332-db1d-521a-ac5c-01850cee8ef8", 00:10:10.861 "is_configured": true, 00:10:10.861 "data_offset": 2048, 00:10:10.861 "data_size": 63488 00:10:10.861 }, 00:10:10.861 { 00:10:10.861 "name": "BaseBdev2", 00:10:10.861 "uuid": "93aaa51d-49ee-5ec8-9614-4644454531b2", 00:10:10.861 "is_configured": true, 00:10:10.861 "data_offset": 2048, 00:10:10.861 "data_size": 63488 00:10:10.861 } 00:10:10.861 ] 00:10:10.861 }' 00:10:10.861 15:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:10.861 15:49:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:11.430 15:49:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:11.689 [2024-06-10 15:49:17.140148] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:11.689 [2024-06-10 15:49:17.140188] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 
00:10:11.689 [2024-06-10 15:49:17.143600] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:11.689 [2024-06-10 15:49:17.143632] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:11.689 [2024-06-10 15:49:17.143657] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:11.689 [2024-06-10 15:49:17.143671] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb04e30 name raid_bdev1, state offline 00:10:11.689 0 00:10:11.689 15:49:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2636540 00:10:11.689 15:49:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 2636540 ']' 00:10:11.689 15:49:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 2636540 00:10:11.689 15:49:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:10:11.689 15:49:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:11.689 15:49:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2636540 00:10:11.948 15:49:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:11.948 15:49:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:11.948 15:49:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2636540' 00:10:11.948 killing process with pid 2636540 00:10:11.948 15:49:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 2636540 00:10:11.948 [2024-06-10 15:49:17.209782] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:11.948 15:49:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 2636540 00:10:11.948 [2024-06-10 15:49:17.219879] bdev_raid.c:1375:raid_bdev_exit: 
*DEBUG*: raid_bdev_exit 00:10:11.948 15:49:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.nfgay8yOnr 00:10:11.948 15:49:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:11.948 15:49:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:11.948 15:49:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.44 00:10:11.948 15:49:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:10:11.948 15:49:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:11.948 15:49:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:11.948 15:49:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.44 != \0\.\0\0 ]] 00:10:11.948 00:10:11.948 real 0m5.852s 00:10:11.948 user 0m9.268s 00:10:11.948 sys 0m0.838s 00:10:11.948 15:49:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:11.948 15:49:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:11.948 ************************************ 00:10:11.948 END TEST raid_read_error_test 00:10:11.948 ************************************ 00:10:12.207 15:49:17 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 2 write 00:10:12.207 15:49:17 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:10:12.207 15:49:17 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:12.207 15:49:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:12.207 ************************************ 00:10:12.207 START TEST raid_write_error_test 00:10:12.207 ************************************ 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 2 write 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.HeNIZcmmrN 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2637667 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2637667 /var/tmp/spdk-raid.sock 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 2637667 ']' 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:12.207 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:12.207 15:49:17 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:12.207 [2024-06-10 15:49:17.571476] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:10:12.207 [2024-06-10 15:49:17.571533] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2637667 ] 00:10:12.207 [2024-06-10 15:49:17.672404] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:12.466 [2024-06-10 15:49:17.763225] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:10:12.466 [2024-06-10 15:49:17.817750] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:12.466 [2024-06-10 15:49:17.817786] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:13.032 15:49:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:13.032 15:49:18 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:10:13.032 15:49:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:13.032 15:49:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:13.290 BaseBdev1_malloc 00:10:13.290 15:49:18 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:13.547 true 00:10:13.547 15:49:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:13.806 [2024-06-10 15:49:19.291009] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:13.806 [2024-06-10 15:49:19.291058] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:10:13.806 [2024-06-10 15:49:19.291073] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2614150 00:10:13.806 [2024-06-10 15:49:19.291083] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:13.806 [2024-06-10 15:49:19.292809] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:13.806 [2024-06-10 15:49:19.292835] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:13.806 BaseBdev1 00:10:13.806 15:49:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:13.806 15:49:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:14.065 BaseBdev2_malloc 00:10:14.324 15:49:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:14.324 true 00:10:14.324 15:49:19 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:14.583 [2024-06-10 15:49:20.069630] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:14.583 [2024-06-10 15:49:20.069669] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:14.583 [2024-06-10 15:49:20.069685] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2618b50 00:10:14.583 [2024-06-10 15:49:20.069695] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:14.583 [2024-06-10 15:49:20.071226] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:14.583 [2024-06-10 15:49:20.071253] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:14.583 BaseBdev2 00:10:14.584 15:49:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:14.843 [2024-06-10 15:49:20.326339] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:14.843 [2024-06-10 15:49:20.327711] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:14.843 [2024-06-10 15:49:20.327897] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2619e30 00:10:14.843 [2024-06-10 15:49:20.327909] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:14.843 [2024-06-10 15:49:20.328115] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x261a110 00:10:14.843 [2024-06-10 15:49:20.328262] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2619e30 00:10:14.843 [2024-06-10 15:49:20.328271] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2619e30 00:10:14.843 [2024-06-10 15:49:20.328374] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:14.843 15:49:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:14.843 15:49:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:14.843 15:49:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:14.843 15:49:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:14.843 15:49:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:14.843 15:49:20 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:14.843 15:49:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:14.843 15:49:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:14.843 15:49:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:14.843 15:49:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:14.843 15:49:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:14.843 15:49:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:15.101 15:49:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:15.102 "name": "raid_bdev1", 00:10:15.102 "uuid": "5f9bb1da-9242-4b6f-bb8b-c321bd1b1357", 00:10:15.102 "strip_size_kb": 64, 00:10:15.102 "state": "online", 00:10:15.102 "raid_level": "raid0", 00:10:15.102 "superblock": true, 00:10:15.102 "num_base_bdevs": 2, 00:10:15.102 "num_base_bdevs_discovered": 2, 00:10:15.102 "num_base_bdevs_operational": 2, 00:10:15.102 "base_bdevs_list": [ 00:10:15.102 { 00:10:15.102 "name": "BaseBdev1", 00:10:15.102 "uuid": "a88cfbea-deca-51b5-bf38-dcce019dcc2c", 00:10:15.102 "is_configured": true, 00:10:15.102 "data_offset": 2048, 00:10:15.102 "data_size": 63488 00:10:15.102 }, 00:10:15.102 { 00:10:15.102 "name": "BaseBdev2", 00:10:15.102 "uuid": "b524b8be-08d5-568e-a925-5f03c9bc5972", 00:10:15.102 "is_configured": true, 00:10:15.102 "data_offset": 2048, 00:10:15.102 "data_size": 63488 00:10:15.102 } 00:10:15.102 ] 00:10:15.102 }' 00:10:15.102 15:49:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:15.102 15:49:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:16.037 
15:49:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:16.037 15:49:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:16.037 [2024-06-10 15:49:21.333277] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2469c20 00:10:17.010 15:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:10:17.010 15:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:17.010 15:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:10:17.010 15:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:17.010 15:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 2 00:10:17.010 15:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:17.010 15:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:17.010 15:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:10:17.010 15:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:17.010 15:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:17.010 15:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:17.010 15:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:17.010 15:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:10:17.010 15:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:17.010 15:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:17.010 15:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:17.268 15:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:17.268 "name": "raid_bdev1", 00:10:17.268 "uuid": "5f9bb1da-9242-4b6f-bb8b-c321bd1b1357", 00:10:17.268 "strip_size_kb": 64, 00:10:17.268 "state": "online", 00:10:17.268 "raid_level": "raid0", 00:10:17.268 "superblock": true, 00:10:17.268 "num_base_bdevs": 2, 00:10:17.268 "num_base_bdevs_discovered": 2, 00:10:17.268 "num_base_bdevs_operational": 2, 00:10:17.268 "base_bdevs_list": [ 00:10:17.268 { 00:10:17.268 "name": "BaseBdev1", 00:10:17.268 "uuid": "a88cfbea-deca-51b5-bf38-dcce019dcc2c", 00:10:17.268 "is_configured": true, 00:10:17.268 "data_offset": 2048, 00:10:17.268 "data_size": 63488 00:10:17.268 }, 00:10:17.268 { 00:10:17.268 "name": "BaseBdev2", 00:10:17.268 "uuid": "b524b8be-08d5-568e-a925-5f03c9bc5972", 00:10:17.268 "is_configured": true, 00:10:17.268 "data_offset": 2048, 00:10:17.268 "data_size": 63488 00:10:17.268 } 00:10:17.268 ] 00:10:17.268 }' 00:10:17.268 15:49:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:17.268 15:49:22 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:18.202 15:49:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:18.202 [2024-06-10 15:49:23.607974] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:18.202 [2024-06-10 15:49:23.608016] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:10:18.202 [2024-06-10 15:49:23.611416] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:18.202 [2024-06-10 15:49:23.611450] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:18.202 [2024-06-10 15:49:23.611476] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:18.202 [2024-06-10 15:49:23.611484] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2619e30 name raid_bdev1, state offline 00:10:18.202 0 00:10:18.202 15:49:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2637667 00:10:18.202 15:49:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 2637667 ']' 00:10:18.202 15:49:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 2637667 00:10:18.202 15:49:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:10:18.202 15:49:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:18.202 15:49:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2637667 00:10:18.202 15:49:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:18.202 15:49:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:18.202 15:49:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2637667' 00:10:18.202 killing process with pid 2637667 00:10:18.202 15:49:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 2637667 00:10:18.202 [2024-06-10 15:49:23.672235] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:18.202 15:49:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 2637667 
00:10:18.202 [2024-06-10 15:49:23.682112] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:18.460 15:49:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.HeNIZcmmrN 00:10:18.461 15:49:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:18.461 15:49:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:18.461 15:49:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.44 00:10:18.461 15:49:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:10:18.461 15:49:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:18.461 15:49:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:18.461 15:49:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.44 != \0\.\0\0 ]] 00:10:18.461 00:10:18.461 real 0m6.396s 00:10:18.461 user 0m10.274s 00:10:18.461 sys 0m0.915s 00:10:18.461 15:49:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:18.461 15:49:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:18.461 ************************************ 00:10:18.461 END TEST raid_write_error_test 00:10:18.461 ************************************ 00:10:18.461 15:49:23 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:10:18.461 15:49:23 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 2 false 00:10:18.461 15:49:23 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:10:18.461 15:49:23 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:18.461 15:49:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:18.461 ************************************ 00:10:18.461 START TEST raid_state_function_test 00:10:18.461 ************************************ 
00:10:18.461 15:49:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 2 false 00:10:18.461 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:10:18.461 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:18.461 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:10:18.461 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:18.719 15:49:23 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2638784 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2638784' 00:10:18.719 Process raid pid: 2638784 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2638784 /var/tmp/spdk-raid.sock 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 2638784 ']' 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:18.719 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:18.719 15:49:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:18.719 [2024-06-10 15:49:24.030660] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:10:18.720 [2024-06-10 15:49:24.030715] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:18.720 [2024-06-10 15:49:24.131178] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:18.720 [2024-06-10 15:49:24.229114] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:10:18.978 [2024-06-10 15:49:24.291606] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:18.979 [2024-06-10 15:49:24.291635] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:19.545 15:49:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:19.545 15:49:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:10:19.546 15:49:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:19.805 [2024-06-10 15:49:25.219719] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:19.805 [2024-06-10 15:49:25.219760] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:19.805 [2024-06-10 15:49:25.219769] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:19.805 [2024-06-10 15:49:25.219778] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 
doesn't exist now 00:10:19.805 15:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:19.805 15:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:19.805 15:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:19.805 15:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:19.805 15:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:19.805 15:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:19.805 15:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:19.805 15:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:19.805 15:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:19.805 15:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:19.805 15:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:19.805 15:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:20.063 15:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:20.063 "name": "Existed_Raid", 00:10:20.063 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:20.063 "strip_size_kb": 64, 00:10:20.063 "state": "configuring", 00:10:20.063 "raid_level": "concat", 00:10:20.063 "superblock": false, 00:10:20.063 "num_base_bdevs": 2, 00:10:20.063 "num_base_bdevs_discovered": 0, 00:10:20.063 "num_base_bdevs_operational": 2, 00:10:20.063 
"base_bdevs_list": [ 00:10:20.063 { 00:10:20.063 "name": "BaseBdev1", 00:10:20.063 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:20.063 "is_configured": false, 00:10:20.063 "data_offset": 0, 00:10:20.063 "data_size": 0 00:10:20.063 }, 00:10:20.063 { 00:10:20.063 "name": "BaseBdev2", 00:10:20.063 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:20.063 "is_configured": false, 00:10:20.063 "data_offset": 0, 00:10:20.063 "data_size": 0 00:10:20.063 } 00:10:20.063 ] 00:10:20.063 }' 00:10:20.063 15:49:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:20.063 15:49:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:20.629 15:49:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:20.887 [2024-06-10 15:49:26.266361] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:20.887 [2024-06-10 15:49:26.266389] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ca7120 name Existed_Raid, state configuring 00:10:20.887 15:49:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:21.145 [2024-06-10 15:49:26.523055] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:21.145 [2024-06-10 15:49:26.523078] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:21.145 [2024-06-10 15:49:26.523086] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:21.145 [2024-06-10 15:49:26.523095] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:21.145 15:49:26 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:21.403 [2024-06-10 15:49:26.793194] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:21.403 BaseBdev1 00:10:21.403 15:49:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:21.403 15:49:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:10:21.403 15:49:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:21.403 15:49:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:10:21.403 15:49:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:21.403 15:49:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:21.403 15:49:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:21.660 15:49:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:21.918 [ 00:10:21.918 { 00:10:21.918 "name": "BaseBdev1", 00:10:21.918 "aliases": [ 00:10:21.918 "bf872cb7-8eb7-45f8-bb28-8d8a397cf981" 00:10:21.918 ], 00:10:21.918 "product_name": "Malloc disk", 00:10:21.918 "block_size": 512, 00:10:21.918 "num_blocks": 65536, 00:10:21.918 "uuid": "bf872cb7-8eb7-45f8-bb28-8d8a397cf981", 00:10:21.918 "assigned_rate_limits": { 00:10:21.918 "rw_ios_per_sec": 0, 00:10:21.918 "rw_mbytes_per_sec": 0, 00:10:21.918 "r_mbytes_per_sec": 0, 00:10:21.918 "w_mbytes_per_sec": 0 00:10:21.918 }, 00:10:21.919 "claimed": true, 
00:10:21.919 "claim_type": "exclusive_write", 00:10:21.919 "zoned": false, 00:10:21.919 "supported_io_types": { 00:10:21.919 "read": true, 00:10:21.919 "write": true, 00:10:21.919 "unmap": true, 00:10:21.919 "write_zeroes": true, 00:10:21.919 "flush": true, 00:10:21.919 "reset": true, 00:10:21.919 "compare": false, 00:10:21.919 "compare_and_write": false, 00:10:21.919 "abort": true, 00:10:21.919 "nvme_admin": false, 00:10:21.919 "nvme_io": false 00:10:21.919 }, 00:10:21.919 "memory_domains": [ 00:10:21.919 { 00:10:21.919 "dma_device_id": "system", 00:10:21.919 "dma_device_type": 1 00:10:21.919 }, 00:10:21.919 { 00:10:21.919 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:21.919 "dma_device_type": 2 00:10:21.919 } 00:10:21.919 ], 00:10:21.919 "driver_specific": {} 00:10:21.919 } 00:10:21.919 ] 00:10:21.919 15:49:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:10:21.919 15:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:21.919 15:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:21.919 15:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:21.919 15:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:21.919 15:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:21.919 15:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:21.919 15:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:21.919 15:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:21.919 15:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:21.919 15:49:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:21.919 15:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:21.919 15:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:22.177 15:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:22.177 "name": "Existed_Raid", 00:10:22.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:22.177 "strip_size_kb": 64, 00:10:22.177 "state": "configuring", 00:10:22.177 "raid_level": "concat", 00:10:22.177 "superblock": false, 00:10:22.177 "num_base_bdevs": 2, 00:10:22.177 "num_base_bdevs_discovered": 1, 00:10:22.177 "num_base_bdevs_operational": 2, 00:10:22.177 "base_bdevs_list": [ 00:10:22.177 { 00:10:22.177 "name": "BaseBdev1", 00:10:22.177 "uuid": "bf872cb7-8eb7-45f8-bb28-8d8a397cf981", 00:10:22.177 "is_configured": true, 00:10:22.177 "data_offset": 0, 00:10:22.177 "data_size": 65536 00:10:22.177 }, 00:10:22.177 { 00:10:22.177 "name": "BaseBdev2", 00:10:22.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:22.177 "is_configured": false, 00:10:22.177 "data_offset": 0, 00:10:22.177 "data_size": 0 00:10:22.177 } 00:10:22.177 ] 00:10:22.177 }' 00:10:22.177 15:49:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:22.177 15:49:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:22.743 15:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:23.001 [2024-06-10 15:49:28.457644] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:23.001 [2024-06-10 15:49:28.457678] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x1ca69f0 name Existed_Raid, state configuring 00:10:23.001 15:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:23.259 [2024-06-10 15:49:28.718367] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:23.259 [2024-06-10 15:49:28.719876] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:23.259 [2024-06-10 15:49:28.719906] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:23.259 15:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:23.259 15:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:23.259 15:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:23.259 15:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:23.259 15:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:23.259 15:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:23.259 15:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:23.259 15:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:23.259 15:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:23.259 15:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:23.259 15:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:23.259 
15:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:23.259 15:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:23.259 15:49:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:23.517 15:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:23.517 "name": "Existed_Raid", 00:10:23.517 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:23.517 "strip_size_kb": 64, 00:10:23.517 "state": "configuring", 00:10:23.517 "raid_level": "concat", 00:10:23.517 "superblock": false, 00:10:23.517 "num_base_bdevs": 2, 00:10:23.517 "num_base_bdevs_discovered": 1, 00:10:23.517 "num_base_bdevs_operational": 2, 00:10:23.517 "base_bdevs_list": [ 00:10:23.517 { 00:10:23.517 "name": "BaseBdev1", 00:10:23.517 "uuid": "bf872cb7-8eb7-45f8-bb28-8d8a397cf981", 00:10:23.517 "is_configured": true, 00:10:23.517 "data_offset": 0, 00:10:23.517 "data_size": 65536 00:10:23.517 }, 00:10:23.517 { 00:10:23.517 "name": "BaseBdev2", 00:10:23.517 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:23.517 "is_configured": false, 00:10:23.517 "data_offset": 0, 00:10:23.517 "data_size": 0 00:10:23.517 } 00:10:23.517 ] 00:10:23.517 }' 00:10:23.517 15:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:23.517 15:49:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:24.454 15:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:24.454 [2024-06-10 15:49:29.844596] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:24.454 [2024-06-10 15:49:29.844635] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ca7770 00:10:24.454 [2024-06-10 15:49:29.844642] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 131072, blocklen 512 00:10:24.454 [2024-06-10 15:49:29.844834] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ca8eb0 00:10:24.454 [2024-06-10 15:49:29.844954] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ca7770 00:10:24.454 [2024-06-10 15:49:29.844973] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1ca7770 00:10:24.454 [2024-06-10 15:49:29.845134] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:24.454 BaseBdev2 00:10:24.454 15:49:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:24.454 15:49:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:10:24.454 15:49:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:24.454 15:49:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:10:24.454 15:49:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:24.454 15:49:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:24.454 15:49:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:24.712 15:49:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:24.972 [ 00:10:24.972 { 00:10:24.972 "name": "BaseBdev2", 00:10:24.972 "aliases": [ 00:10:24.972 "fbd93066-914b-4c7b-9707-139d9a374ee1" 00:10:24.972 ], 
00:10:24.972 "product_name": "Malloc disk", 00:10:24.972 "block_size": 512, 00:10:24.972 "num_blocks": 65536, 00:10:24.972 "uuid": "fbd93066-914b-4c7b-9707-139d9a374ee1", 00:10:24.972 "assigned_rate_limits": { 00:10:24.972 "rw_ios_per_sec": 0, 00:10:24.972 "rw_mbytes_per_sec": 0, 00:10:24.972 "r_mbytes_per_sec": 0, 00:10:24.972 "w_mbytes_per_sec": 0 00:10:24.972 }, 00:10:24.972 "claimed": true, 00:10:24.972 "claim_type": "exclusive_write", 00:10:24.972 "zoned": false, 00:10:24.972 "supported_io_types": { 00:10:24.972 "read": true, 00:10:24.972 "write": true, 00:10:24.972 "unmap": true, 00:10:24.972 "write_zeroes": true, 00:10:24.972 "flush": true, 00:10:24.972 "reset": true, 00:10:24.972 "compare": false, 00:10:24.972 "compare_and_write": false, 00:10:24.972 "abort": true, 00:10:24.972 "nvme_admin": false, 00:10:24.972 "nvme_io": false 00:10:24.972 }, 00:10:24.972 "memory_domains": [ 00:10:24.972 { 00:10:24.972 "dma_device_id": "system", 00:10:24.972 "dma_device_type": 1 00:10:24.972 }, 00:10:24.972 { 00:10:24.973 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:24.973 "dma_device_type": 2 00:10:24.973 } 00:10:24.973 ], 00:10:24.973 "driver_specific": {} 00:10:24.973 } 00:10:24.973 ] 00:10:24.973 15:49:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:10:24.973 15:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:24.973 15:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:24.973 15:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:10:24.973 15:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:24.973 15:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:24.973 15:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:10:24.973 15:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:24.973 15:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:24.973 15:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:24.973 15:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:24.973 15:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:24.973 15:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:24.973 15:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:24.973 15:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:25.232 15:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:25.232 "name": "Existed_Raid", 00:10:25.232 "uuid": "26357148-c0b4-4402-a162-4a771139bd93", 00:10:25.232 "strip_size_kb": 64, 00:10:25.232 "state": "online", 00:10:25.232 "raid_level": "concat", 00:10:25.232 "superblock": false, 00:10:25.232 "num_base_bdevs": 2, 00:10:25.232 "num_base_bdevs_discovered": 2, 00:10:25.232 "num_base_bdevs_operational": 2, 00:10:25.232 "base_bdevs_list": [ 00:10:25.232 { 00:10:25.232 "name": "BaseBdev1", 00:10:25.232 "uuid": "bf872cb7-8eb7-45f8-bb28-8d8a397cf981", 00:10:25.232 "is_configured": true, 00:10:25.232 "data_offset": 0, 00:10:25.232 "data_size": 65536 00:10:25.232 }, 00:10:25.232 { 00:10:25.232 "name": "BaseBdev2", 00:10:25.232 "uuid": "fbd93066-914b-4c7b-9707-139d9a374ee1", 00:10:25.232 "is_configured": true, 00:10:25.232 "data_offset": 0, 00:10:25.232 "data_size": 65536 00:10:25.232 } 00:10:25.232 ] 00:10:25.232 }' 
00:10:25.232 15:49:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:25.232 15:49:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:10:25.799 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:25.799 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:25.799 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:25.799 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:25.799 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:25.799 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:25.799 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:25.799 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:26.058 [2024-06-10 15:49:31.493280] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:26.058 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:26.058 "name": "Existed_Raid", 00:10:26.058 "aliases": [ 00:10:26.058 "26357148-c0b4-4402-a162-4a771139bd93" 00:10:26.058 ], 00:10:26.058 "product_name": "Raid Volume", 00:10:26.058 "block_size": 512, 00:10:26.058 "num_blocks": 131072, 00:10:26.058 "uuid": "26357148-c0b4-4402-a162-4a771139bd93", 00:10:26.058 "assigned_rate_limits": { 00:10:26.058 "rw_ios_per_sec": 0, 00:10:26.058 "rw_mbytes_per_sec": 0, 00:10:26.058 "r_mbytes_per_sec": 0, 00:10:26.058 "w_mbytes_per_sec": 0 00:10:26.058 }, 00:10:26.058 "claimed": false, 00:10:26.058 "zoned": false, 00:10:26.058 "supported_io_types": 
{ 00:10:26.058 "read": true, 00:10:26.058 "write": true, 00:10:26.058 "unmap": true, 00:10:26.058 "write_zeroes": true, 00:10:26.058 "flush": true, 00:10:26.058 "reset": true, 00:10:26.058 "compare": false, 00:10:26.058 "compare_and_write": false, 00:10:26.058 "abort": false, 00:10:26.058 "nvme_admin": false, 00:10:26.058 "nvme_io": false 00:10:26.058 }, 00:10:26.058 "memory_domains": [ 00:10:26.058 { 00:10:26.058 "dma_device_id": "system", 00:10:26.058 "dma_device_type": 1 00:10:26.058 }, 00:10:26.058 { 00:10:26.058 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:26.058 "dma_device_type": 2 00:10:26.058 }, 00:10:26.058 { 00:10:26.058 "dma_device_id": "system", 00:10:26.058 "dma_device_type": 1 00:10:26.058 }, 00:10:26.058 { 00:10:26.058 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:26.058 "dma_device_type": 2 00:10:26.058 } 00:10:26.058 ], 00:10:26.058 "driver_specific": { 00:10:26.058 "raid": { 00:10:26.058 "uuid": "26357148-c0b4-4402-a162-4a771139bd93", 00:10:26.058 "strip_size_kb": 64, 00:10:26.058 "state": "online", 00:10:26.058 "raid_level": "concat", 00:10:26.058 "superblock": false, 00:10:26.058 "num_base_bdevs": 2, 00:10:26.058 "num_base_bdevs_discovered": 2, 00:10:26.058 "num_base_bdevs_operational": 2, 00:10:26.058 "base_bdevs_list": [ 00:10:26.058 { 00:10:26.058 "name": "BaseBdev1", 00:10:26.058 "uuid": "bf872cb7-8eb7-45f8-bb28-8d8a397cf981", 00:10:26.058 "is_configured": true, 00:10:26.058 "data_offset": 0, 00:10:26.058 "data_size": 65536 00:10:26.058 }, 00:10:26.058 { 00:10:26.058 "name": "BaseBdev2", 00:10:26.058 "uuid": "fbd93066-914b-4c7b-9707-139d9a374ee1", 00:10:26.058 "is_configured": true, 00:10:26.058 "data_offset": 0, 00:10:26.058 "data_size": 65536 00:10:26.058 } 00:10:26.058 ] 00:10:26.058 } 00:10:26.058 } 00:10:26.058 }' 00:10:26.058 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:26.058 15:49:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:26.058 BaseBdev2' 00:10:26.058 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:26.058 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:26.317 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:26.317 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:26.317 "name": "BaseBdev1", 00:10:26.317 "aliases": [ 00:10:26.317 "bf872cb7-8eb7-45f8-bb28-8d8a397cf981" 00:10:26.317 ], 00:10:26.317 "product_name": "Malloc disk", 00:10:26.317 "block_size": 512, 00:10:26.317 "num_blocks": 65536, 00:10:26.317 "uuid": "bf872cb7-8eb7-45f8-bb28-8d8a397cf981", 00:10:26.317 "assigned_rate_limits": { 00:10:26.317 "rw_ios_per_sec": 0, 00:10:26.317 "rw_mbytes_per_sec": 0, 00:10:26.317 "r_mbytes_per_sec": 0, 00:10:26.317 "w_mbytes_per_sec": 0 00:10:26.317 }, 00:10:26.317 "claimed": true, 00:10:26.317 "claim_type": "exclusive_write", 00:10:26.317 "zoned": false, 00:10:26.317 "supported_io_types": { 00:10:26.317 "read": true, 00:10:26.317 "write": true, 00:10:26.317 "unmap": true, 00:10:26.317 "write_zeroes": true, 00:10:26.317 "flush": true, 00:10:26.317 "reset": true, 00:10:26.317 "compare": false, 00:10:26.317 "compare_and_write": false, 00:10:26.317 "abort": true, 00:10:26.317 "nvme_admin": false, 00:10:26.317 "nvme_io": false 00:10:26.317 }, 00:10:26.317 "memory_domains": [ 00:10:26.317 { 00:10:26.317 "dma_device_id": "system", 00:10:26.317 "dma_device_type": 1 00:10:26.317 }, 00:10:26.317 { 00:10:26.317 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:26.317 "dma_device_type": 2 00:10:26.317 } 00:10:26.317 ], 00:10:26.317 "driver_specific": {} 00:10:26.317 }' 00:10:26.317 15:49:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:26.576 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:26.576 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:26.576 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:26.576 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:26.576 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:26.576 15:49:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:26.576 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:26.576 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:26.576 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:26.834 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:26.834 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:26.834 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:26.834 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:26.834 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:27.092 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:27.092 "name": "BaseBdev2", 00:10:27.092 "aliases": [ 00:10:27.092 "fbd93066-914b-4c7b-9707-139d9a374ee1" 00:10:27.092 ], 00:10:27.092 "product_name": "Malloc disk", 00:10:27.092 "block_size": 512, 00:10:27.092 "num_blocks": 65536, 00:10:27.092 "uuid": 
"fbd93066-914b-4c7b-9707-139d9a374ee1", 00:10:27.092 "assigned_rate_limits": { 00:10:27.092 "rw_ios_per_sec": 0, 00:10:27.092 "rw_mbytes_per_sec": 0, 00:10:27.092 "r_mbytes_per_sec": 0, 00:10:27.092 "w_mbytes_per_sec": 0 00:10:27.092 }, 00:10:27.092 "claimed": true, 00:10:27.092 "claim_type": "exclusive_write", 00:10:27.092 "zoned": false, 00:10:27.092 "supported_io_types": { 00:10:27.092 "read": true, 00:10:27.092 "write": true, 00:10:27.092 "unmap": true, 00:10:27.092 "write_zeroes": true, 00:10:27.092 "flush": true, 00:10:27.092 "reset": true, 00:10:27.092 "compare": false, 00:10:27.092 "compare_and_write": false, 00:10:27.092 "abort": true, 00:10:27.092 "nvme_admin": false, 00:10:27.092 "nvme_io": false 00:10:27.092 }, 00:10:27.092 "memory_domains": [ 00:10:27.092 { 00:10:27.092 "dma_device_id": "system", 00:10:27.092 "dma_device_type": 1 00:10:27.092 }, 00:10:27.092 { 00:10:27.092 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:27.092 "dma_device_type": 2 00:10:27.092 } 00:10:27.092 ], 00:10:27.092 "driver_specific": {} 00:10:27.092 }' 00:10:27.092 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:27.092 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:27.092 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:27.092 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:27.092 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:27.350 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:27.350 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:27.350 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:27.350 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:27.350 
15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:27.350 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:27.350 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:27.350 15:49:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:27.609 [2024-06-10 15:49:33.037198] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:27.609 [2024-06-10 15:49:33.037223] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:27.609 [2024-06-10 15:49:33.037263] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:27.609 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:27.609 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:10:27.609 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:27.609 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:27.609 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:27.609 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:10:27.609 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:27.609 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:27.609 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:27.609 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:27.609 15:49:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:27.609 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:27.609 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:27.609 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:27.609 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:27.609 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:27.609 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:27.868 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:27.868 "name": "Existed_Raid", 00:10:27.868 "uuid": "26357148-c0b4-4402-a162-4a771139bd93", 00:10:27.868 "strip_size_kb": 64, 00:10:27.868 "state": "offline", 00:10:27.868 "raid_level": "concat", 00:10:27.868 "superblock": false, 00:10:27.868 "num_base_bdevs": 2, 00:10:27.868 "num_base_bdevs_discovered": 1, 00:10:27.868 "num_base_bdevs_operational": 1, 00:10:27.868 "base_bdevs_list": [ 00:10:27.868 { 00:10:27.868 "name": null, 00:10:27.868 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:27.868 "is_configured": false, 00:10:27.868 "data_offset": 0, 00:10:27.868 "data_size": 65536 00:10:27.868 }, 00:10:27.868 { 00:10:27.868 "name": "BaseBdev2", 00:10:27.868 "uuid": "fbd93066-914b-4c7b-9707-139d9a374ee1", 00:10:27.868 "is_configured": true, 00:10:27.868 "data_offset": 0, 00:10:27.869 "data_size": 65536 00:10:27.869 } 00:10:27.869 ] 00:10:27.869 }' 00:10:27.869 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:27.869 15:49:33 bdev_raid.raid_state_function_test 
-- common/autotest_common.sh@10 -- # set +x 00:10:28.436 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:28.436 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:28.436 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:28.436 15:49:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:28.695 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:28.695 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:28.696 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:28.954 [2024-06-10 15:49:34.418164] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:28.954 [2024-06-10 15:49:34.418212] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ca7770 name Existed_Raid, state offline 00:10:28.954 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:28.954 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:28.954 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:28.954 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:10:29.213 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:29.213 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 
00:10:29.213 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:29.213 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2638784 00:10:29.213 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 2638784 ']' 00:10:29.213 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 2638784 00:10:29.213 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:10:29.213 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:29.213 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2638784 00:10:29.473 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:29.473 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:29.473 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2638784' 00:10:29.473 killing process with pid 2638784 00:10:29.473 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 2638784 00:10:29.473 [2024-06-10 15:49:34.749610] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:29.473 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 2638784 00:10:29.473 [2024-06-10 15:49:34.750471] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:29.473 15:49:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:10:29.473 00:10:29.473 real 0m10.983s 00:10:29.473 user 0m20.007s 00:10:29.473 sys 0m1.604s 00:10:29.473 15:49:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:29.473 15:49:34 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@10 -- # set +x 00:10:29.473 ************************************ 00:10:29.473 END TEST raid_state_function_test 00:10:29.473 ************************************ 00:10:29.732 15:49:34 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 2 true 00:10:29.732 15:49:34 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:10:29.732 15:49:34 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:29.732 15:49:34 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:29.732 ************************************ 00:10:29.732 START TEST raid_state_function_test_sb 00:10:29.732 ************************************ 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 2 true 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:10:29.732 15:49:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2640834 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2640834' 00:10:29.732 Process raid pid: 2640834 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@246 -- # waitforlisten 2640834 /var/tmp/spdk-raid.sock 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 2640834 ']' 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:29.732 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:29.732 15:49:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:29.732 [2024-06-10 15:49:35.082303] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:10:29.732 [2024-06-10 15:49:35.082355] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:10:29.732 [2024-06-10 15:49:35.172816] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:29.993 [2024-06-10 15:49:35.267425] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:10:29.993 [2024-06-10 15:49:35.324504] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:29.993 [2024-06-10 15:49:35.324531] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:29.994 15:49:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:29.994 15:49:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:10:29.994 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:30.253 [2024-06-10 15:49:35.613954] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:30.253 [2024-06-10 15:49:35.613992] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:30.253 [2024-06-10 15:49:35.614002] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:30.253 [2024-06-10 15:49:35.614011] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:30.253 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:30.253 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:30.253 15:49:35 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:30.253 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:30.253 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:30.253 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:30.253 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:30.253 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:30.253 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:30.253 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:30.253 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:30.253 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:30.512 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:30.512 "name": "Existed_Raid", 00:10:30.512 "uuid": "738c0847-8a8b-4d39-acff-e8c4d63f9d52", 00:10:30.512 "strip_size_kb": 64, 00:10:30.512 "state": "configuring", 00:10:30.512 "raid_level": "concat", 00:10:30.512 "superblock": true, 00:10:30.512 "num_base_bdevs": 2, 00:10:30.512 "num_base_bdevs_discovered": 0, 00:10:30.512 "num_base_bdevs_operational": 2, 00:10:30.512 "base_bdevs_list": [ 00:10:30.512 { 00:10:30.512 "name": "BaseBdev1", 00:10:30.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:30.512 "is_configured": false, 00:10:30.512 "data_offset": 0, 00:10:30.512 "data_size": 0 00:10:30.512 }, 00:10:30.512 { 00:10:30.512 "name": 
"BaseBdev2", 00:10:30.512 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:30.512 "is_configured": false, 00:10:30.512 "data_offset": 0, 00:10:30.512 "data_size": 0 00:10:30.512 } 00:10:30.512 ] 00:10:30.512 }' 00:10:30.512 15:49:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:30.512 15:49:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:31.079 15:49:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:31.337 [2024-06-10 15:49:36.732791] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:31.338 [2024-06-10 15:49:36.732817] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bba120 name Existed_Raid, state configuring 00:10:31.338 15:49:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:31.596 [2024-06-10 15:49:36.993501] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:10:31.596 [2024-06-10 15:49:36.993526] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:10:31.596 [2024-06-10 15:49:36.993535] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:31.596 [2024-06-10 15:49:36.993543] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:31.596 15:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:10:31.854 [2024-06-10 15:49:37.259649] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev1 is claimed 00:10:31.854 BaseBdev1 00:10:31.854 15:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:10:31.854 15:49:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:10:31.854 15:49:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:31.854 15:49:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:10:31.854 15:49:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:31.854 15:49:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:31.854 15:49:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:32.113 15:49:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:10:32.371 [ 00:10:32.371 { 00:10:32.371 "name": "BaseBdev1", 00:10:32.371 "aliases": [ 00:10:32.371 "88834815-b51f-4662-94f5-f194ac220afd" 00:10:32.371 ], 00:10:32.371 "product_name": "Malloc disk", 00:10:32.371 "block_size": 512, 00:10:32.371 "num_blocks": 65536, 00:10:32.372 "uuid": "88834815-b51f-4662-94f5-f194ac220afd", 00:10:32.372 "assigned_rate_limits": { 00:10:32.372 "rw_ios_per_sec": 0, 00:10:32.372 "rw_mbytes_per_sec": 0, 00:10:32.372 "r_mbytes_per_sec": 0, 00:10:32.372 "w_mbytes_per_sec": 0 00:10:32.372 }, 00:10:32.372 "claimed": true, 00:10:32.372 "claim_type": "exclusive_write", 00:10:32.372 "zoned": false, 00:10:32.372 "supported_io_types": { 00:10:32.372 "read": true, 00:10:32.372 "write": true, 00:10:32.372 "unmap": true, 00:10:32.372 "write_zeroes": true, 00:10:32.372 "flush": true, 
00:10:32.372 "reset": true, 00:10:32.372 "compare": false, 00:10:32.372 "compare_and_write": false, 00:10:32.372 "abort": true, 00:10:32.372 "nvme_admin": false, 00:10:32.372 "nvme_io": false 00:10:32.372 }, 00:10:32.372 "memory_domains": [ 00:10:32.372 { 00:10:32.372 "dma_device_id": "system", 00:10:32.372 "dma_device_type": 1 00:10:32.372 }, 00:10:32.372 { 00:10:32.372 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:32.372 "dma_device_type": 2 00:10:32.372 } 00:10:32.372 ], 00:10:32.372 "driver_specific": {} 00:10:32.372 } 00:10:32.372 ] 00:10:32.372 15:49:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:10:32.372 15:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:32.372 15:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:32.372 15:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:32.372 15:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:32.372 15:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:32.372 15:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:32.372 15:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:32.372 15:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:32.372 15:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:32.372 15:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:32.372 15:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:32.372 15:49:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:32.630 15:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:32.630 "name": "Existed_Raid", 00:10:32.630 "uuid": "4c8da883-289f-455f-a994-6cb803de83b6", 00:10:32.630 "strip_size_kb": 64, 00:10:32.630 "state": "configuring", 00:10:32.630 "raid_level": "concat", 00:10:32.630 "superblock": true, 00:10:32.630 "num_base_bdevs": 2, 00:10:32.630 "num_base_bdevs_discovered": 1, 00:10:32.630 "num_base_bdevs_operational": 2, 00:10:32.630 "base_bdevs_list": [ 00:10:32.630 { 00:10:32.630 "name": "BaseBdev1", 00:10:32.630 "uuid": "88834815-b51f-4662-94f5-f194ac220afd", 00:10:32.630 "is_configured": true, 00:10:32.630 "data_offset": 2048, 00:10:32.630 "data_size": 63488 00:10:32.630 }, 00:10:32.630 { 00:10:32.630 "name": "BaseBdev2", 00:10:32.630 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:32.630 "is_configured": false, 00:10:32.630 "data_offset": 0, 00:10:32.630 "data_size": 0 00:10:32.630 } 00:10:32.630 ] 00:10:32.630 }' 00:10:32.630 15:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:32.630 15:49:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:33.198 15:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:10:33.489 [2024-06-10 15:49:38.908076] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:10:33.489 [2024-06-10 15:49:38.908114] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bb99f0 name Existed_Raid, state configuring 00:10:33.489 15:49:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:10:33.760 [2024-06-10 15:49:39.164800] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:33.760 [2024-06-10 15:49:39.166330] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:10:33.760 [2024-06-10 15:49:39.166360] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:10:33.760 15:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:10:33.760 15:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:33.760 15:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 2 00:10:33.761 15:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:33.761 15:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:33.761 15:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:33.761 15:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:33.761 15:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:33.761 15:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:33.761 15:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:33.761 15:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:33.761 15:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:33.761 15:49:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:33.761 15:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:34.019 15:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:34.019 "name": "Existed_Raid", 00:10:34.019 "uuid": "8df0d493-af33-4ff3-abaf-c8bf854855dc", 00:10:34.019 "strip_size_kb": 64, 00:10:34.019 "state": "configuring", 00:10:34.019 "raid_level": "concat", 00:10:34.019 "superblock": true, 00:10:34.019 "num_base_bdevs": 2, 00:10:34.019 "num_base_bdevs_discovered": 1, 00:10:34.019 "num_base_bdevs_operational": 2, 00:10:34.019 "base_bdevs_list": [ 00:10:34.019 { 00:10:34.019 "name": "BaseBdev1", 00:10:34.019 "uuid": "88834815-b51f-4662-94f5-f194ac220afd", 00:10:34.019 "is_configured": true, 00:10:34.019 "data_offset": 2048, 00:10:34.019 "data_size": 63488 00:10:34.019 }, 00:10:34.019 { 00:10:34.019 "name": "BaseBdev2", 00:10:34.019 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:34.019 "is_configured": false, 00:10:34.019 "data_offset": 0, 00:10:34.019 "data_size": 0 00:10:34.019 } 00:10:34.019 ] 00:10:34.019 }' 00:10:34.019 15:49:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:34.019 15:49:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:34.587 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:10:34.846 [2024-06-10 15:49:40.291114] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:34.846 [2024-06-10 15:49:40.291258] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1bba770 00:10:34.846 
[2024-06-10 15:49:40.291269] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:34.846 [2024-06-10 15:49:40.291447] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1bbbeb0 00:10:34.846 [2024-06-10 15:49:40.291562] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1bba770 00:10:34.846 [2024-06-10 15:49:40.291570] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1bba770 00:10:34.846 [2024-06-10 15:49:40.291661] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:34.846 BaseBdev2 00:10:34.846 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:10:34.846 15:49:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:10:34.846 15:49:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:10:34.846 15:49:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:10:34.846 15:49:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:10:34.846 15:49:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:10:34.846 15:49:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:10:35.105 15:49:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:10:35.364 [ 00:10:35.364 { 00:10:35.364 "name": "BaseBdev2", 00:10:35.364 "aliases": [ 00:10:35.364 "21961516-b248-4f85-b422-707b970075c2" 00:10:35.364 ], 00:10:35.364 "product_name": "Malloc disk", 00:10:35.364 "block_size": 512, 
00:10:35.364 "num_blocks": 65536, 00:10:35.364 "uuid": "21961516-b248-4f85-b422-707b970075c2", 00:10:35.364 "assigned_rate_limits": { 00:10:35.364 "rw_ios_per_sec": 0, 00:10:35.364 "rw_mbytes_per_sec": 0, 00:10:35.364 "r_mbytes_per_sec": 0, 00:10:35.364 "w_mbytes_per_sec": 0 00:10:35.364 }, 00:10:35.364 "claimed": true, 00:10:35.364 "claim_type": "exclusive_write", 00:10:35.364 "zoned": false, 00:10:35.364 "supported_io_types": { 00:10:35.364 "read": true, 00:10:35.364 "write": true, 00:10:35.364 "unmap": true, 00:10:35.364 "write_zeroes": true, 00:10:35.364 "flush": true, 00:10:35.364 "reset": true, 00:10:35.364 "compare": false, 00:10:35.364 "compare_and_write": false, 00:10:35.364 "abort": true, 00:10:35.364 "nvme_admin": false, 00:10:35.364 "nvme_io": false 00:10:35.364 }, 00:10:35.364 "memory_domains": [ 00:10:35.364 { 00:10:35.364 "dma_device_id": "system", 00:10:35.364 "dma_device_type": 1 00:10:35.364 }, 00:10:35.364 { 00:10:35.364 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:35.364 "dma_device_type": 2 00:10:35.364 } 00:10:35.364 ], 00:10:35.364 "driver_specific": {} 00:10:35.364 } 00:10:35.364 ] 00:10:35.364 15:49:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:10:35.364 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:10:35.364 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:10:35.364 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 2 00:10:35.364 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:35.364 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:35.364 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:35.364 15:49:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:35.364 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:35.364 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:35.364 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:35.364 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:35.364 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:35.364 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:35.364 15:49:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:35.623 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:35.623 "name": "Existed_Raid", 00:10:35.623 "uuid": "8df0d493-af33-4ff3-abaf-c8bf854855dc", 00:10:35.623 "strip_size_kb": 64, 00:10:35.623 "state": "online", 00:10:35.623 "raid_level": "concat", 00:10:35.623 "superblock": true, 00:10:35.623 "num_base_bdevs": 2, 00:10:35.623 "num_base_bdevs_discovered": 2, 00:10:35.623 "num_base_bdevs_operational": 2, 00:10:35.623 "base_bdevs_list": [ 00:10:35.623 { 00:10:35.623 "name": "BaseBdev1", 00:10:35.623 "uuid": "88834815-b51f-4662-94f5-f194ac220afd", 00:10:35.623 "is_configured": true, 00:10:35.623 "data_offset": 2048, 00:10:35.623 "data_size": 63488 00:10:35.623 }, 00:10:35.623 { 00:10:35.623 "name": "BaseBdev2", 00:10:35.623 "uuid": "21961516-b248-4f85-b422-707b970075c2", 00:10:35.623 "is_configured": true, 00:10:35.623 "data_offset": 2048, 00:10:35.623 "data_size": 63488 00:10:35.623 } 00:10:35.623 ] 00:10:35.623 }' 00:10:35.623 
15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:35.623 15:49:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:36.192 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:10:36.192 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:10:36.192 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:36.192 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:36.192 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:36.192 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:10:36.192 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:36.192 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:10:36.450 [2024-06-10 15:49:41.891656] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:36.450 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:36.451 "name": "Existed_Raid", 00:10:36.451 "aliases": [ 00:10:36.451 "8df0d493-af33-4ff3-abaf-c8bf854855dc" 00:10:36.451 ], 00:10:36.451 "product_name": "Raid Volume", 00:10:36.451 "block_size": 512, 00:10:36.451 "num_blocks": 126976, 00:10:36.451 "uuid": "8df0d493-af33-4ff3-abaf-c8bf854855dc", 00:10:36.451 "assigned_rate_limits": { 00:10:36.451 "rw_ios_per_sec": 0, 00:10:36.451 "rw_mbytes_per_sec": 0, 00:10:36.451 "r_mbytes_per_sec": 0, 00:10:36.451 "w_mbytes_per_sec": 0 00:10:36.451 }, 00:10:36.451 "claimed": false, 00:10:36.451 "zoned": false, 00:10:36.451 
"supported_io_types": { 00:10:36.451 "read": true, 00:10:36.451 "write": true, 00:10:36.451 "unmap": true, 00:10:36.451 "write_zeroes": true, 00:10:36.451 "flush": true, 00:10:36.451 "reset": true, 00:10:36.451 "compare": false, 00:10:36.451 "compare_and_write": false, 00:10:36.451 "abort": false, 00:10:36.451 "nvme_admin": false, 00:10:36.451 "nvme_io": false 00:10:36.451 }, 00:10:36.451 "memory_domains": [ 00:10:36.451 { 00:10:36.451 "dma_device_id": "system", 00:10:36.451 "dma_device_type": 1 00:10:36.451 }, 00:10:36.451 { 00:10:36.451 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:36.451 "dma_device_type": 2 00:10:36.451 }, 00:10:36.451 { 00:10:36.451 "dma_device_id": "system", 00:10:36.451 "dma_device_type": 1 00:10:36.451 }, 00:10:36.451 { 00:10:36.451 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:36.451 "dma_device_type": 2 00:10:36.451 } 00:10:36.451 ], 00:10:36.451 "driver_specific": { 00:10:36.451 "raid": { 00:10:36.451 "uuid": "8df0d493-af33-4ff3-abaf-c8bf854855dc", 00:10:36.451 "strip_size_kb": 64, 00:10:36.451 "state": "online", 00:10:36.451 "raid_level": "concat", 00:10:36.451 "superblock": true, 00:10:36.451 "num_base_bdevs": 2, 00:10:36.451 "num_base_bdevs_discovered": 2, 00:10:36.451 "num_base_bdevs_operational": 2, 00:10:36.451 "base_bdevs_list": [ 00:10:36.451 { 00:10:36.451 "name": "BaseBdev1", 00:10:36.451 "uuid": "88834815-b51f-4662-94f5-f194ac220afd", 00:10:36.451 "is_configured": true, 00:10:36.451 "data_offset": 2048, 00:10:36.451 "data_size": 63488 00:10:36.451 }, 00:10:36.451 { 00:10:36.451 "name": "BaseBdev2", 00:10:36.451 "uuid": "21961516-b248-4f85-b422-707b970075c2", 00:10:36.451 "is_configured": true, 00:10:36.451 "data_offset": 2048, 00:10:36.451 "data_size": 63488 00:10:36.451 } 00:10:36.451 ] 00:10:36.451 } 00:10:36.451 } 00:10:36.451 }' 00:10:36.451 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:36.710 
15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:10:36.710 BaseBdev2' 00:10:36.710 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:36.710 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:36.710 15:49:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:10:36.970 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:36.970 "name": "BaseBdev1", 00:10:36.970 "aliases": [ 00:10:36.970 "88834815-b51f-4662-94f5-f194ac220afd" 00:10:36.970 ], 00:10:36.970 "product_name": "Malloc disk", 00:10:36.970 "block_size": 512, 00:10:36.970 "num_blocks": 65536, 00:10:36.970 "uuid": "88834815-b51f-4662-94f5-f194ac220afd", 00:10:36.970 "assigned_rate_limits": { 00:10:36.970 "rw_ios_per_sec": 0, 00:10:36.970 "rw_mbytes_per_sec": 0, 00:10:36.970 "r_mbytes_per_sec": 0, 00:10:36.970 "w_mbytes_per_sec": 0 00:10:36.970 }, 00:10:36.970 "claimed": true, 00:10:36.970 "claim_type": "exclusive_write", 00:10:36.970 "zoned": false, 00:10:36.970 "supported_io_types": { 00:10:36.970 "read": true, 00:10:36.970 "write": true, 00:10:36.970 "unmap": true, 00:10:36.970 "write_zeroes": true, 00:10:36.970 "flush": true, 00:10:36.970 "reset": true, 00:10:36.970 "compare": false, 00:10:36.970 "compare_and_write": false, 00:10:36.970 "abort": true, 00:10:36.970 "nvme_admin": false, 00:10:36.970 "nvme_io": false 00:10:36.970 }, 00:10:36.970 "memory_domains": [ 00:10:36.970 { 00:10:36.970 "dma_device_id": "system", 00:10:36.970 "dma_device_type": 1 00:10:36.970 }, 00:10:36.970 { 00:10:36.970 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:36.970 "dma_device_type": 2 00:10:36.970 } 00:10:36.970 ], 00:10:36.970 "driver_specific": {} 00:10:36.970 }' 00:10:36.970 15:49:42 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:36.970 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:36.970 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:36.970 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:36.970 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:36.970 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:36.970 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:36.970 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:37.230 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:37.230 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:37.230 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:37.230 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:37.230 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:37.230 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:10:37.230 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:37.489 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:37.489 "name": "BaseBdev2", 00:10:37.489 "aliases": [ 00:10:37.489 "21961516-b248-4f85-b422-707b970075c2" 00:10:37.489 ], 00:10:37.489 "product_name": "Malloc disk", 00:10:37.489 "block_size": 512, 00:10:37.489 
"num_blocks": 65536, 00:10:37.489 "uuid": "21961516-b248-4f85-b422-707b970075c2", 00:10:37.489 "assigned_rate_limits": { 00:10:37.489 "rw_ios_per_sec": 0, 00:10:37.489 "rw_mbytes_per_sec": 0, 00:10:37.489 "r_mbytes_per_sec": 0, 00:10:37.489 "w_mbytes_per_sec": 0 00:10:37.489 }, 00:10:37.489 "claimed": true, 00:10:37.489 "claim_type": "exclusive_write", 00:10:37.489 "zoned": false, 00:10:37.489 "supported_io_types": { 00:10:37.489 "read": true, 00:10:37.489 "write": true, 00:10:37.489 "unmap": true, 00:10:37.489 "write_zeroes": true, 00:10:37.489 "flush": true, 00:10:37.489 "reset": true, 00:10:37.489 "compare": false, 00:10:37.489 "compare_and_write": false, 00:10:37.489 "abort": true, 00:10:37.489 "nvme_admin": false, 00:10:37.489 "nvme_io": false 00:10:37.489 }, 00:10:37.489 "memory_domains": [ 00:10:37.489 { 00:10:37.489 "dma_device_id": "system", 00:10:37.489 "dma_device_type": 1 00:10:37.489 }, 00:10:37.489 { 00:10:37.489 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:37.489 "dma_device_type": 2 00:10:37.489 } 00:10:37.489 ], 00:10:37.489 "driver_specific": {} 00:10:37.489 }' 00:10:37.489 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:37.489 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:37.489 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:37.489 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:37.489 15:49:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:37.749 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:37.749 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:37.749 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:37.749 15:49:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:37.749 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:37.749 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:37.749 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:37.749 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:10:38.008 [2024-06-10 15:49:43.419555] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:10:38.008 [2024-06-10 15:49:43.419580] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:38.008 [2024-06-10 15:49:43.419619] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:38.008 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:10:38.008 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:10:38.008 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:38.008 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:10:38.008 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:10:38.008 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 1 00:10:38.008 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:10:38.008 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:10:38.008 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:10:38.008 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:38.008 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:10:38.008 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:38.008 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:38.008 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:38.008 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:38.008 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:38.008 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:10:38.267 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:38.267 "name": "Existed_Raid", 00:10:38.267 "uuid": "8df0d493-af33-4ff3-abaf-c8bf854855dc", 00:10:38.267 "strip_size_kb": 64, 00:10:38.267 "state": "offline", 00:10:38.267 "raid_level": "concat", 00:10:38.267 "superblock": true, 00:10:38.267 "num_base_bdevs": 2, 00:10:38.267 "num_base_bdevs_discovered": 1, 00:10:38.267 "num_base_bdevs_operational": 1, 00:10:38.267 "base_bdevs_list": [ 00:10:38.267 { 00:10:38.267 "name": null, 00:10:38.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:10:38.267 "is_configured": false, 00:10:38.267 "data_offset": 2048, 00:10:38.267 "data_size": 63488 00:10:38.267 }, 00:10:38.267 { 00:10:38.267 "name": "BaseBdev2", 00:10:38.267 "uuid": "21961516-b248-4f85-b422-707b970075c2", 00:10:38.267 "is_configured": true, 00:10:38.267 "data_offset": 2048, 00:10:38.267 "data_size": 63488 00:10:38.267 } 
00:10:38.267 ] 00:10:38.267 }' 00:10:38.267 15:49:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:38.267 15:49:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:38.835 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:10:38.835 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:38.835 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:38.835 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:10:39.094 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:10:39.094 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:10:39.094 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:10:39.353 [2024-06-10 15:49:44.788410] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:10:39.353 [2024-06-10 15:49:44.788459] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1bba770 name Existed_Raid, state offline 00:10:39.353 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:10:39.353 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:10:39.353 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:39.353 15:49:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r 
'.[0]["name"] | select(.)' 00:10:39.612 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:10:39.612 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:10:39.613 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:10:39.613 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2640834 00:10:39.613 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 2640834 ']' 00:10:39.613 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 2640834 00:10:39.613 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:10:39.613 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:39.613 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2640834 00:10:39.871 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:39.871 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:39.872 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2640834' 00:10:39.872 killing process with pid 2640834 00:10:39.872 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 2640834 00:10:39.872 [2024-06-10 15:49:45.125001] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:39.872 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 2640834 00:10:39.872 [2024-06-10 15:49:45.125851] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:39.872 15:49:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 
00:10:39.872 00:10:39.872 real 0m10.303s 00:10:39.872 user 0m19.107s 00:10:39.872 sys 0m1.557s 00:10:39.872 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:39.872 15:49:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:10:39.872 ************************************ 00:10:39.872 END TEST raid_state_function_test_sb 00:10:39.872 ************************************ 00:10:39.872 15:49:45 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 2 00:10:39.872 15:49:45 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:10:39.872 15:49:45 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:39.872 15:49:45 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:40.131 ************************************ 00:10:40.131 START TEST raid_superblock_test 00:10:40.131 ************************************ 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test concat 2 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2642871 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2642871 /var/tmp/spdk-raid.sock 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 2642871 ']' 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:40.131 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:40.131 15:49:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:40.131 [2024-06-10 15:49:45.455322] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:10:40.131 [2024-06-10 15:49:45.455376] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2642871 ] 00:10:40.131 [2024-06-10 15:49:45.552313] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:40.390 [2024-06-10 15:49:45.646937] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:10:40.390 [2024-06-10 15:49:45.704211] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:40.390 [2024-06-10 15:49:45.704243] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:40.959 15:49:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:40.959 15:49:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:10:40.959 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:10:40.959 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:40.959 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:10:40.959 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:10:40.959 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:10:40.959 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:10:40.959 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # 
base_bdevs_pt+=($bdev_pt) 00:10:40.959 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:40.959 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:10:41.218 malloc1 00:10:41.218 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:41.477 [2024-06-10 15:49:46.909921] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:41.477 [2024-06-10 15:49:46.909972] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:41.477 [2024-06-10 15:49:46.909990] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x14800f0 00:10:41.477 [2024-06-10 15:49:46.910001] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:41.477 [2024-06-10 15:49:46.911707] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:41.477 [2024-06-10 15:49:46.911733] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:41.477 pt1 00:10:41.477 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:41.477 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:41.477 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:10:41.477 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:10:41.477 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:10:41.477 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:10:41.477 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:10:41.477 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:10:41.477 15:49:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:10:41.736 malloc2 00:10:41.737 15:49:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:41.996 [2024-06-10 15:49:47.420048] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:41.996 [2024-06-10 15:49:47.420089] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:41.996 [2024-06-10 15:49:47.420104] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1481400 00:10:41.996 [2024-06-10 15:49:47.420114] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:41.996 [2024-06-10 15:49:47.421649] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:41.996 [2024-06-10 15:49:47.421675] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:41.996 pt2 00:10:41.996 15:49:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:10:41.996 15:49:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:10:41.996 15:49:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2' -n raid_bdev1 -s 00:10:42.255 [2024-06-10 15:49:47.672736] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:42.255 [2024-06-10 15:49:47.674076] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:42.255 [2024-06-10 15:49:47.674222] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x162ce60 00:10:42.255 [2024-06-10 15:49:47.674234] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:42.255 [2024-06-10 15:49:47.674428] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1496fe0 00:10:42.255 [2024-06-10 15:49:47.674571] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x162ce60 00:10:42.255 [2024-06-10 15:49:47.674580] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x162ce60 00:10:42.255 [2024-06-10 15:49:47.674676] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:42.255 15:49:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:42.255 15:49:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:42.255 15:49:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:42.255 15:49:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:42.255 15:49:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:42.255 15:49:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:42.255 15:49:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:42.255 15:49:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:42.255 15:49:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:42.255 15:49:47 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:42.255 15:49:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:42.255 15:49:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:42.514 15:49:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:42.514 "name": "raid_bdev1", 00:10:42.514 "uuid": "19191096-cb2a-4b3b-9ecf-d404bb58df1f", 00:10:42.514 "strip_size_kb": 64, 00:10:42.514 "state": "online", 00:10:42.514 "raid_level": "concat", 00:10:42.514 "superblock": true, 00:10:42.514 "num_base_bdevs": 2, 00:10:42.514 "num_base_bdevs_discovered": 2, 00:10:42.514 "num_base_bdevs_operational": 2, 00:10:42.514 "base_bdevs_list": [ 00:10:42.514 { 00:10:42.514 "name": "pt1", 00:10:42.514 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:42.514 "is_configured": true, 00:10:42.514 "data_offset": 2048, 00:10:42.514 "data_size": 63488 00:10:42.514 }, 00:10:42.514 { 00:10:42.514 "name": "pt2", 00:10:42.514 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:42.514 "is_configured": true, 00:10:42.514 "data_offset": 2048, 00:10:42.514 "data_size": 63488 00:10:42.514 } 00:10:42.514 ] 00:10:42.514 }' 00:10:42.514 15:49:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:42.514 15:49:47 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:43.082 15:49:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:10:43.082 15:49:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:43.082 15:49:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:43.082 15:49:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local 
base_bdev_info 00:10:43.082 15:49:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:43.082 15:49:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:43.082 15:49:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:43.082 15:49:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:43.341 [2024-06-10 15:49:48.795970] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:43.341 15:49:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:43.341 "name": "raid_bdev1", 00:10:43.341 "aliases": [ 00:10:43.341 "19191096-cb2a-4b3b-9ecf-d404bb58df1f" 00:10:43.341 ], 00:10:43.341 "product_name": "Raid Volume", 00:10:43.341 "block_size": 512, 00:10:43.341 "num_blocks": 126976, 00:10:43.341 "uuid": "19191096-cb2a-4b3b-9ecf-d404bb58df1f", 00:10:43.341 "assigned_rate_limits": { 00:10:43.341 "rw_ios_per_sec": 0, 00:10:43.341 "rw_mbytes_per_sec": 0, 00:10:43.341 "r_mbytes_per_sec": 0, 00:10:43.341 "w_mbytes_per_sec": 0 00:10:43.341 }, 00:10:43.341 "claimed": false, 00:10:43.341 "zoned": false, 00:10:43.341 "supported_io_types": { 00:10:43.341 "read": true, 00:10:43.341 "write": true, 00:10:43.341 "unmap": true, 00:10:43.341 "write_zeroes": true, 00:10:43.341 "flush": true, 00:10:43.341 "reset": true, 00:10:43.341 "compare": false, 00:10:43.341 "compare_and_write": false, 00:10:43.341 "abort": false, 00:10:43.341 "nvme_admin": false, 00:10:43.341 "nvme_io": false 00:10:43.341 }, 00:10:43.341 "memory_domains": [ 00:10:43.341 { 00:10:43.341 "dma_device_id": "system", 00:10:43.341 "dma_device_type": 1 00:10:43.341 }, 00:10:43.341 { 00:10:43.341 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:43.341 "dma_device_type": 2 00:10:43.341 }, 00:10:43.341 { 00:10:43.341 "dma_device_id": "system", 
00:10:43.341 "dma_device_type": 1 00:10:43.341 }, 00:10:43.341 { 00:10:43.341 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:43.341 "dma_device_type": 2 00:10:43.341 } 00:10:43.341 ], 00:10:43.341 "driver_specific": { 00:10:43.341 "raid": { 00:10:43.341 "uuid": "19191096-cb2a-4b3b-9ecf-d404bb58df1f", 00:10:43.341 "strip_size_kb": 64, 00:10:43.341 "state": "online", 00:10:43.341 "raid_level": "concat", 00:10:43.341 "superblock": true, 00:10:43.341 "num_base_bdevs": 2, 00:10:43.341 "num_base_bdevs_discovered": 2, 00:10:43.341 "num_base_bdevs_operational": 2, 00:10:43.341 "base_bdevs_list": [ 00:10:43.341 { 00:10:43.341 "name": "pt1", 00:10:43.341 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:43.341 "is_configured": true, 00:10:43.341 "data_offset": 2048, 00:10:43.341 "data_size": 63488 00:10:43.341 }, 00:10:43.341 { 00:10:43.341 "name": "pt2", 00:10:43.341 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:43.341 "is_configured": true, 00:10:43.341 "data_offset": 2048, 00:10:43.341 "data_size": 63488 00:10:43.341 } 00:10:43.341 ] 00:10:43.341 } 00:10:43.341 } 00:10:43.341 }' 00:10:43.341 15:49:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:43.601 15:49:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:43.601 pt2' 00:10:43.601 15:49:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:43.601 15:49:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:43.601 15:49:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:43.860 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:43.860 "name": "pt1", 00:10:43.860 "aliases": [ 00:10:43.860 "00000000-0000-0000-0000-000000000001" 
00:10:43.860 ], 00:10:43.860 "product_name": "passthru", 00:10:43.860 "block_size": 512, 00:10:43.860 "num_blocks": 65536, 00:10:43.860 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:43.860 "assigned_rate_limits": { 00:10:43.860 "rw_ios_per_sec": 0, 00:10:43.860 "rw_mbytes_per_sec": 0, 00:10:43.860 "r_mbytes_per_sec": 0, 00:10:43.860 "w_mbytes_per_sec": 0 00:10:43.860 }, 00:10:43.860 "claimed": true, 00:10:43.860 "claim_type": "exclusive_write", 00:10:43.860 "zoned": false, 00:10:43.860 "supported_io_types": { 00:10:43.860 "read": true, 00:10:43.860 "write": true, 00:10:43.860 "unmap": true, 00:10:43.860 "write_zeroes": true, 00:10:43.860 "flush": true, 00:10:43.860 "reset": true, 00:10:43.860 "compare": false, 00:10:43.860 "compare_and_write": false, 00:10:43.860 "abort": true, 00:10:43.860 "nvme_admin": false, 00:10:43.860 "nvme_io": false 00:10:43.860 }, 00:10:43.860 "memory_domains": [ 00:10:43.860 { 00:10:43.860 "dma_device_id": "system", 00:10:43.860 "dma_device_type": 1 00:10:43.860 }, 00:10:43.860 { 00:10:43.860 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:43.860 "dma_device_type": 2 00:10:43.860 } 00:10:43.860 ], 00:10:43.860 "driver_specific": { 00:10:43.860 "passthru": { 00:10:43.860 "name": "pt1", 00:10:43.860 "base_bdev_name": "malloc1" 00:10:43.860 } 00:10:43.860 } 00:10:43.860 }' 00:10:43.860 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:43.860 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:43.860 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:43.860 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:43.860 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:43.860 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:43.860 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:10:43.860 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:44.119 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:44.119 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:44.119 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:44.119 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:44.119 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:44.119 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:44.119 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:44.378 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:44.378 "name": "pt2", 00:10:44.378 "aliases": [ 00:10:44.378 "00000000-0000-0000-0000-000000000002" 00:10:44.378 ], 00:10:44.378 "product_name": "passthru", 00:10:44.378 "block_size": 512, 00:10:44.378 "num_blocks": 65536, 00:10:44.378 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:44.378 "assigned_rate_limits": { 00:10:44.378 "rw_ios_per_sec": 0, 00:10:44.378 "rw_mbytes_per_sec": 0, 00:10:44.378 "r_mbytes_per_sec": 0, 00:10:44.378 "w_mbytes_per_sec": 0 00:10:44.378 }, 00:10:44.378 "claimed": true, 00:10:44.378 "claim_type": "exclusive_write", 00:10:44.378 "zoned": false, 00:10:44.378 "supported_io_types": { 00:10:44.378 "read": true, 00:10:44.378 "write": true, 00:10:44.378 "unmap": true, 00:10:44.378 "write_zeroes": true, 00:10:44.378 "flush": true, 00:10:44.378 "reset": true, 00:10:44.378 "compare": false, 00:10:44.378 "compare_and_write": false, 00:10:44.378 "abort": true, 00:10:44.378 "nvme_admin": false, 00:10:44.378 "nvme_io": false 00:10:44.378 }, 
00:10:44.378 "memory_domains": [ 00:10:44.378 { 00:10:44.378 "dma_device_id": "system", 00:10:44.378 "dma_device_type": 1 00:10:44.378 }, 00:10:44.378 { 00:10:44.378 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:44.378 "dma_device_type": 2 00:10:44.378 } 00:10:44.378 ], 00:10:44.378 "driver_specific": { 00:10:44.378 "passthru": { 00:10:44.378 "name": "pt2", 00:10:44.378 "base_bdev_name": "malloc2" 00:10:44.378 } 00:10:44.378 } 00:10:44.378 }' 00:10:44.378 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:44.378 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:44.378 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:44.378 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:44.378 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:44.637 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:44.637 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:44.637 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:44.637 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:44.637 15:49:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:44.637 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:44.637 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:44.637 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:44.637 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:10:44.896 [2024-06-10 15:49:50.308078] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:44.896 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=19191096-cb2a-4b3b-9ecf-d404bb58df1f 00:10:44.896 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 19191096-cb2a-4b3b-9ecf-d404bb58df1f ']' 00:10:44.896 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:45.155 [2024-06-10 15:49:50.568514] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:45.155 [2024-06-10 15:49:50.568535] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:10:45.155 [2024-06-10 15:49:50.568586] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:45.155 [2024-06-10 15:49:50.568630] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:45.155 [2024-06-10 15:49:50.568638] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x162ce60 name raid_bdev1, state offline 00:10:45.155 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:45.155 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:10:45.414 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:10:45.414 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:10:45.414 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:45.414 15:49:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:10:45.673 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:10:45.673 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:10:45.931 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:10:45.931 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:10:46.190 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:10:46.190 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:10:46.190 15:49:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:10:46.190 15:49:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:10:46.190 15:49:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:46.190 15:49:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:10:46.190 15:49:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:46.190 15:49:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:10:46.190 15:49:51 bdev_raid.raid_superblock_test 
-- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:46.190 15:49:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:10:46.190 15:49:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:10:46.190 15:49:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:10:46.190 15:49:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2' -n raid_bdev1 00:10:46.449 [2024-06-10 15:49:51.851880] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:10:46.449 [2024-06-10 15:49:51.853308] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:10:46.449 [2024-06-10 15:49:51.853363] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:10:46.449 [2024-06-10 15:49:51.853400] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:10:46.449 [2024-06-10 15:49:51.853416] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:46.449 [2024-06-10 15:49:51.853424] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x162a520 name raid_bdev1, state configuring 00:10:46.449 request: 00:10:46.449 { 00:10:46.449 "name": "raid_bdev1", 00:10:46.449 "raid_level": "concat", 00:10:46.449 "base_bdevs": [ 00:10:46.449 "malloc1", 00:10:46.449 "malloc2" 00:10:46.449 ], 00:10:46.449 "superblock": false, 00:10:46.449 "strip_size_kb": 64, 00:10:46.449 "method": "bdev_raid_create", 00:10:46.449 "req_id": 1 00:10:46.449 } 
00:10:46.449 Got JSON-RPC error response 00:10:46.449 response: 00:10:46.449 { 00:10:46.449 "code": -17, 00:10:46.449 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:10:46.449 } 00:10:46.449 15:49:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:10:46.449 15:49:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:10:46.449 15:49:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:10:46.449 15:49:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:10:46.449 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:46.449 15:49:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:10:46.708 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:10:46.708 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:10:46.708 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:10:46.967 [2024-06-10 15:49:52.353152] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:10:46.967 [2024-06-10 15:49:52.353186] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:46.967 [2024-06-10 15:49:52.353202] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16293a0 00:10:46.967 [2024-06-10 15:49:52.353216] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:46.967 [2024-06-10 15:49:52.354830] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:46.967 [2024-06-10 15:49:52.354856] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:10:46.967 [2024-06-10 15:49:52.354914] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:10:46.967 [2024-06-10 15:49:52.354937] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:10:46.967 pt1 00:10:46.967 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 2 00:10:46.967 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:46.967 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:10:46.967 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:46.967 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:46.967 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:46.967 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:46.967 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:46.967 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:46.967 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:46.967 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:46.967 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:47.226 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:47.226 "name": "raid_bdev1", 00:10:47.226 "uuid": "19191096-cb2a-4b3b-9ecf-d404bb58df1f", 00:10:47.226 "strip_size_kb": 64, 
00:10:47.226 "state": "configuring", 00:10:47.226 "raid_level": "concat", 00:10:47.226 "superblock": true, 00:10:47.226 "num_base_bdevs": 2, 00:10:47.226 "num_base_bdevs_discovered": 1, 00:10:47.226 "num_base_bdevs_operational": 2, 00:10:47.226 "base_bdevs_list": [ 00:10:47.226 { 00:10:47.226 "name": "pt1", 00:10:47.226 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:47.226 "is_configured": true, 00:10:47.226 "data_offset": 2048, 00:10:47.226 "data_size": 63488 00:10:47.226 }, 00:10:47.226 { 00:10:47.226 "name": null, 00:10:47.226 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:47.226 "is_configured": false, 00:10:47.226 "data_offset": 2048, 00:10:47.226 "data_size": 63488 00:10:47.226 } 00:10:47.226 ] 00:10:47.226 }' 00:10:47.226 15:49:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:47.226 15:49:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:47.793 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:10:47.793 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:10:47.793 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:47.793 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:10:48.052 [2024-06-10 15:49:53.480175] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:10:48.052 [2024-06-10 15:49:53.480219] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:48.052 [2024-06-10 15:49:53.480234] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x147e960 00:10:48.052 [2024-06-10 15:49:53.480243] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:48.052 [2024-06-10 
15:49:53.480571] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:48.052 [2024-06-10 15:49:53.480586] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:10:48.052 [2024-06-10 15:49:53.480641] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:10:48.052 [2024-06-10 15:49:53.480657] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:10:48.052 [2024-06-10 15:49:53.480756] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x147f300 00:10:48.052 [2024-06-10 15:49:53.480764] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:48.052 [2024-06-10 15:49:53.480944] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1480c70 00:10:48.052 [2024-06-10 15:49:53.481079] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x147f300 00:10:48.052 [2024-06-10 15:49:53.481088] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x147f300 00:10:48.052 [2024-06-10 15:49:53.481190] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:48.052 pt2 00:10:48.052 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:10:48.052 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:10:48.052 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:48.052 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:48.052 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:48.052 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:48.052 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:10:48.052 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:48.052 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:48.052 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:48.052 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:48.052 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:48.052 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:48.052 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:48.311 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:48.311 "name": "raid_bdev1", 00:10:48.311 "uuid": "19191096-cb2a-4b3b-9ecf-d404bb58df1f", 00:10:48.311 "strip_size_kb": 64, 00:10:48.311 "state": "online", 00:10:48.311 "raid_level": "concat", 00:10:48.311 "superblock": true, 00:10:48.311 "num_base_bdevs": 2, 00:10:48.311 "num_base_bdevs_discovered": 2, 00:10:48.311 "num_base_bdevs_operational": 2, 00:10:48.311 "base_bdevs_list": [ 00:10:48.311 { 00:10:48.311 "name": "pt1", 00:10:48.311 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:48.311 "is_configured": true, 00:10:48.311 "data_offset": 2048, 00:10:48.311 "data_size": 63488 00:10:48.311 }, 00:10:48.311 { 00:10:48.311 "name": "pt2", 00:10:48.311 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:48.311 "is_configured": true, 00:10:48.311 "data_offset": 2048, 00:10:48.311 "data_size": 63488 00:10:48.311 } 00:10:48.311 ] 00:10:48.311 }' 00:10:48.311 15:49:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:48.311 15:49:53 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@10 -- # set +x 00:10:49.247 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:10:49.247 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:10:49.247 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:10:49.247 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:10:49.247 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:10:49.247 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:10:49.247 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:49.247 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:10:49.247 [2024-06-10 15:49:54.619476] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:49.247 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:10:49.247 "name": "raid_bdev1", 00:10:49.247 "aliases": [ 00:10:49.247 "19191096-cb2a-4b3b-9ecf-d404bb58df1f" 00:10:49.247 ], 00:10:49.247 "product_name": "Raid Volume", 00:10:49.247 "block_size": 512, 00:10:49.247 "num_blocks": 126976, 00:10:49.247 "uuid": "19191096-cb2a-4b3b-9ecf-d404bb58df1f", 00:10:49.247 "assigned_rate_limits": { 00:10:49.247 "rw_ios_per_sec": 0, 00:10:49.247 "rw_mbytes_per_sec": 0, 00:10:49.247 "r_mbytes_per_sec": 0, 00:10:49.247 "w_mbytes_per_sec": 0 00:10:49.247 }, 00:10:49.247 "claimed": false, 00:10:49.247 "zoned": false, 00:10:49.247 "supported_io_types": { 00:10:49.247 "read": true, 00:10:49.247 "write": true, 00:10:49.247 "unmap": true, 00:10:49.247 "write_zeroes": true, 00:10:49.247 "flush": true, 00:10:49.247 "reset": true, 00:10:49.247 "compare": 
false, 00:10:49.247 "compare_and_write": false, 00:10:49.247 "abort": false, 00:10:49.247 "nvme_admin": false, 00:10:49.247 "nvme_io": false 00:10:49.247 }, 00:10:49.247 "memory_domains": [ 00:10:49.247 { 00:10:49.247 "dma_device_id": "system", 00:10:49.247 "dma_device_type": 1 00:10:49.247 }, 00:10:49.247 { 00:10:49.247 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:49.247 "dma_device_type": 2 00:10:49.247 }, 00:10:49.247 { 00:10:49.247 "dma_device_id": "system", 00:10:49.247 "dma_device_type": 1 00:10:49.247 }, 00:10:49.247 { 00:10:49.247 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:49.247 "dma_device_type": 2 00:10:49.247 } 00:10:49.247 ], 00:10:49.247 "driver_specific": { 00:10:49.247 "raid": { 00:10:49.247 "uuid": "19191096-cb2a-4b3b-9ecf-d404bb58df1f", 00:10:49.247 "strip_size_kb": 64, 00:10:49.247 "state": "online", 00:10:49.247 "raid_level": "concat", 00:10:49.247 "superblock": true, 00:10:49.247 "num_base_bdevs": 2, 00:10:49.247 "num_base_bdevs_discovered": 2, 00:10:49.247 "num_base_bdevs_operational": 2, 00:10:49.247 "base_bdevs_list": [ 00:10:49.247 { 00:10:49.247 "name": "pt1", 00:10:49.247 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:49.247 "is_configured": true, 00:10:49.247 "data_offset": 2048, 00:10:49.247 "data_size": 63488 00:10:49.247 }, 00:10:49.247 { 00:10:49.247 "name": "pt2", 00:10:49.247 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:49.247 "is_configured": true, 00:10:49.247 "data_offset": 2048, 00:10:49.247 "data_size": 63488 00:10:49.247 } 00:10:49.247 ] 00:10:49.247 } 00:10:49.247 } 00:10:49.247 }' 00:10:49.247 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:10:49.247 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:10:49.247 pt2' 00:10:49.247 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:49.247 15:49:54 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:10:49.247 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:49.505 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:49.505 "name": "pt1", 00:10:49.505 "aliases": [ 00:10:49.505 "00000000-0000-0000-0000-000000000001" 00:10:49.505 ], 00:10:49.505 "product_name": "passthru", 00:10:49.505 "block_size": 512, 00:10:49.505 "num_blocks": 65536, 00:10:49.505 "uuid": "00000000-0000-0000-0000-000000000001", 00:10:49.505 "assigned_rate_limits": { 00:10:49.505 "rw_ios_per_sec": 0, 00:10:49.505 "rw_mbytes_per_sec": 0, 00:10:49.505 "r_mbytes_per_sec": 0, 00:10:49.505 "w_mbytes_per_sec": 0 00:10:49.505 }, 00:10:49.505 "claimed": true, 00:10:49.505 "claim_type": "exclusive_write", 00:10:49.506 "zoned": false, 00:10:49.506 "supported_io_types": { 00:10:49.506 "read": true, 00:10:49.506 "write": true, 00:10:49.506 "unmap": true, 00:10:49.506 "write_zeroes": true, 00:10:49.506 "flush": true, 00:10:49.506 "reset": true, 00:10:49.506 "compare": false, 00:10:49.506 "compare_and_write": false, 00:10:49.506 "abort": true, 00:10:49.506 "nvme_admin": false, 00:10:49.506 "nvme_io": false 00:10:49.506 }, 00:10:49.506 "memory_domains": [ 00:10:49.506 { 00:10:49.506 "dma_device_id": "system", 00:10:49.506 "dma_device_type": 1 00:10:49.506 }, 00:10:49.506 { 00:10:49.506 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:49.506 "dma_device_type": 2 00:10:49.506 } 00:10:49.506 ], 00:10:49.506 "driver_specific": { 00:10:49.506 "passthru": { 00:10:49.506 "name": "pt1", 00:10:49.506 "base_bdev_name": "malloc1" 00:10:49.506 } 00:10:49.506 } 00:10:49.506 }' 00:10:49.506 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:49.506 15:49:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:10:49.764 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:49.764 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:49.764 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:49.764 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:49.764 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:49.764 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:49.764 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:49.764 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:50.022 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:50.022 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:50.022 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:10:50.022 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:10:50.022 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:10:50.308 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:10:50.308 "name": "pt2", 00:10:50.308 "aliases": [ 00:10:50.308 "00000000-0000-0000-0000-000000000002" 00:10:50.308 ], 00:10:50.308 "product_name": "passthru", 00:10:50.308 "block_size": 512, 00:10:50.308 "num_blocks": 65536, 00:10:50.308 "uuid": "00000000-0000-0000-0000-000000000002", 00:10:50.308 "assigned_rate_limits": { 00:10:50.308 "rw_ios_per_sec": 0, 00:10:50.308 "rw_mbytes_per_sec": 0, 00:10:50.308 "r_mbytes_per_sec": 0, 00:10:50.308 "w_mbytes_per_sec": 0 00:10:50.308 }, 00:10:50.308 
"claimed": true, 00:10:50.308 "claim_type": "exclusive_write", 00:10:50.308 "zoned": false, 00:10:50.308 "supported_io_types": { 00:10:50.308 "read": true, 00:10:50.308 "write": true, 00:10:50.308 "unmap": true, 00:10:50.308 "write_zeroes": true, 00:10:50.308 "flush": true, 00:10:50.308 "reset": true, 00:10:50.308 "compare": false, 00:10:50.308 "compare_and_write": false, 00:10:50.308 "abort": true, 00:10:50.308 "nvme_admin": false, 00:10:50.308 "nvme_io": false 00:10:50.308 }, 00:10:50.308 "memory_domains": [ 00:10:50.308 { 00:10:50.308 "dma_device_id": "system", 00:10:50.308 "dma_device_type": 1 00:10:50.308 }, 00:10:50.308 { 00:10:50.308 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:10:50.308 "dma_device_type": 2 00:10:50.308 } 00:10:50.308 ], 00:10:50.308 "driver_specific": { 00:10:50.308 "passthru": { 00:10:50.308 "name": "pt2", 00:10:50.308 "base_bdev_name": "malloc2" 00:10:50.308 } 00:10:50.308 } 00:10:50.308 }' 00:10:50.308 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:50.308 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:10:50.308 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:10:50.308 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:50.308 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:10:50.308 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:10:50.308 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:50.566 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:10:50.566 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:10:50.566 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:10:50.566 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 
-- # jq .dif_type 00:10:50.566 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:10:50.567 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:10:50.567 15:49:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:10:50.825 [2024-06-10 15:49:56.187664] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:10:50.825 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 19191096-cb2a-4b3b-9ecf-d404bb58df1f '!=' 19191096-cb2a-4b3b-9ecf-d404bb58df1f ']' 00:10:50.825 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:10:50.825 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:50.825 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:50.825 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2642871 00:10:50.825 15:49:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 2642871 ']' 00:10:50.825 15:49:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 2642871 00:10:50.825 15:49:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:10:50.825 15:49:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:50.825 15:49:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2642871 00:10:50.825 15:49:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:50.825 15:49:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:50.825 15:49:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing 
process with pid 2642871' 00:10:50.825 killing process with pid 2642871 00:10:50.825 15:49:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 2642871 00:10:50.825 [2024-06-10 15:49:56.249049] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:50.825 [2024-06-10 15:49:56.249101] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:50.825 [2024-06-10 15:49:56.249142] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:50.825 [2024-06-10 15:49:56.249150] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x147f300 name raid_bdev1, state offline 00:10:50.825 15:49:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 2642871 00:10:50.825 [2024-06-10 15:49:56.265545] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:51.083 15:49:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:10:51.083 00:10:51.083 real 0m11.068s 00:10:51.083 user 0m20.228s 00:10:51.083 sys 0m1.617s 00:10:51.083 15:49:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:51.083 15:49:56 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:10:51.083 ************************************ 00:10:51.083 END TEST raid_superblock_test 00:10:51.083 ************************************ 00:10:51.083 15:49:56 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 2 read 00:10:51.083 15:49:56 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:10:51.083 15:49:56 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:51.083 15:49:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:51.083 ************************************ 00:10:51.083 START TEST raid_read_error_test 00:10:51.083 ************************************ 00:10:51.083 15:49:56 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 2 read 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:51.083 15:49:56 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ma1dSU3EuC 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2644913 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2644913 /var/tmp/spdk-raid.sock 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 2644913 ']' 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:51.083 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:51.083 15:49:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:51.341 [2024-06-10 15:49:56.602415] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:10:51.341 [2024-06-10 15:49:56.602472] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2644913 ] 00:10:51.341 [2024-06-10 15:49:56.701628] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:51.341 [2024-06-10 15:49:56.797978] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:10:51.599 [2024-06-10 15:49:56.858695] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:51.599 [2024-06-10 15:49:56.858729] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:52.166 15:49:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:52.166 15:49:57 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:10:52.166 15:49:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:52.166 15:49:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:52.424 BaseBdev1_malloc 00:10:52.424 15:49:57 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:52.682 true 00:10:52.682 15:49:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:52.941 [2024-06-10 15:49:58.308656] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:52.941 [2024-06-10 15:49:58.308695] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:52.941 
[2024-06-10 15:49:58.308711] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x158d150 00:10:52.941 [2024-06-10 15:49:58.308720] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:52.942 [2024-06-10 15:49:58.310459] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:52.942 [2024-06-10 15:49:58.310487] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:52.942 BaseBdev1 00:10:52.942 15:49:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:52.942 15:49:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:53.201 BaseBdev2_malloc 00:10:53.201 15:49:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:53.460 true 00:10:53.460 15:49:58 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:10:53.719 [2024-06-10 15:49:59.087328] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:10:53.719 [2024-06-10 15:49:59.087365] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:10:53.719 [2024-06-10 15:49:59.087380] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1591b50 00:10:53.719 [2024-06-10 15:49:59.087389] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:53.719 [2024-06-10 15:49:59.088877] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:53.719 [2024-06-10 15:49:59.088903] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:10:53.719 BaseBdev2 00:10:53.719 15:49:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:10:53.977 [2024-06-10 15:49:59.348056] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:10:53.977 [2024-06-10 15:49:59.349307] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:10:53.977 [2024-06-10 15:49:59.349485] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1592e30 00:10:53.977 [2024-06-10 15:49:59.349496] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:10:53.977 [2024-06-10 15:49:59.349672] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1593110 00:10:53.977 [2024-06-10 15:49:59.349814] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1592e30 00:10:53.977 [2024-06-10 15:49:59.349822] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1592e30 00:10:53.977 [2024-06-10 15:49:59.349921] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:53.977 15:49:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:53.977 15:49:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:53.977 15:49:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:53.977 15:49:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:53.977 15:49:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:53.977 15:49:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 
-- # local num_base_bdevs_operational=2 00:10:53.977 15:49:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:53.977 15:49:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:53.977 15:49:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:53.978 15:49:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:53.978 15:49:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:53.978 15:49:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:54.236 15:49:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:54.236 "name": "raid_bdev1", 00:10:54.236 "uuid": "bc80eb90-4d58-41a5-921f-b457d4a06985", 00:10:54.236 "strip_size_kb": 64, 00:10:54.236 "state": "online", 00:10:54.236 "raid_level": "concat", 00:10:54.236 "superblock": true, 00:10:54.236 "num_base_bdevs": 2, 00:10:54.236 "num_base_bdevs_discovered": 2, 00:10:54.236 "num_base_bdevs_operational": 2, 00:10:54.236 "base_bdevs_list": [ 00:10:54.236 { 00:10:54.236 "name": "BaseBdev1", 00:10:54.236 "uuid": "c0210f1c-8e7e-5197-bcfc-6555bc6d6d85", 00:10:54.236 "is_configured": true, 00:10:54.236 "data_offset": 2048, 00:10:54.236 "data_size": 63488 00:10:54.236 }, 00:10:54.236 { 00:10:54.236 "name": "BaseBdev2", 00:10:54.236 "uuid": "25c0654e-ee42-50ec-8526-8f1c5a72abd8", 00:10:54.236 "is_configured": true, 00:10:54.236 "data_offset": 2048, 00:10:54.236 "data_size": 63488 00:10:54.236 } 00:10:54.236 ] 00:10:54.236 }' 00:10:54.236 15:49:59 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:54.236 15:49:59 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:54.803 15:50:00 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:10:54.803 15:50:00 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:10:55.061 [2024-06-10 15:50:00.375053] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13e2a40 00:10:55.997 15:50:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:10:56.261 15:50:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:10:56.261 15:50:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:10:56.261 15:50:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:10:56.262 15:50:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:10:56.262 15:50:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:10:56.262 15:50:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:10:56.262 15:50:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:10:56.262 15:50:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:10:56.262 15:50:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:10:56.262 15:50:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:10:56.262 15:50:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:10:56.262 15:50:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:10:56.262 15:50:01 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:10:56.262 15:50:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:10:56.262 15:50:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:10:56.521 15:50:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:10:56.521 "name": "raid_bdev1", 00:10:56.521 "uuid": "bc80eb90-4d58-41a5-921f-b457d4a06985", 00:10:56.521 "strip_size_kb": 64, 00:10:56.521 "state": "online", 00:10:56.521 "raid_level": "concat", 00:10:56.521 "superblock": true, 00:10:56.521 "num_base_bdevs": 2, 00:10:56.521 "num_base_bdevs_discovered": 2, 00:10:56.521 "num_base_bdevs_operational": 2, 00:10:56.521 "base_bdevs_list": [ 00:10:56.521 { 00:10:56.521 "name": "BaseBdev1", 00:10:56.521 "uuid": "c0210f1c-8e7e-5197-bcfc-6555bc6d6d85", 00:10:56.521 "is_configured": true, 00:10:56.521 "data_offset": 2048, 00:10:56.521 "data_size": 63488 00:10:56.521 }, 00:10:56.521 { 00:10:56.521 "name": "BaseBdev2", 00:10:56.521 "uuid": "25c0654e-ee42-50ec-8526-8f1c5a72abd8", 00:10:56.521 "is_configured": true, 00:10:56.521 "data_offset": 2048, 00:10:56.521 "data_size": 63488 00:10:56.521 } 00:10:56.521 ] 00:10:56.521 }' 00:10:56.521 15:50:01 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:10:56.521 15:50:01 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:57.096 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:10:57.356 [2024-06-10 15:50:02.649093] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:10:57.356 [2024-06-10 15:50:02.649135] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing 
from online to offline 00:10:57.356 [2024-06-10 15:50:02.652552] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:10:57.356 [2024-06-10 15:50:02.652584] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:10:57.356 [2024-06-10 15:50:02.652610] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:10:57.356 [2024-06-10 15:50:02.652624] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1592e30 name raid_bdev1, state offline 00:10:57.356 0 00:10:57.356 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2644913 00:10:57.356 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 2644913 ']' 00:10:57.356 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 2644913 00:10:57.356 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:10:57.356 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:10:57.356 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2644913 00:10:57.356 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:10:57.356 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:10:57.356 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2644913' 00:10:57.356 killing process with pid 2644913 00:10:57.356 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 2644913 00:10:57.356 [2024-06-10 15:50:02.719438] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:10:57.356 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 2644913 00:10:57.356 [2024-06-10 15:50:02.729558] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:10:57.615 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ma1dSU3EuC 00:10:57.615 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:10:57.615 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:10:57.615 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.44 00:10:57.615 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:10:57.615 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:10:57.615 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:10:57.615 15:50:02 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.44 != \0\.\0\0 ]] 00:10:57.615 00:10:57.615 real 0m6.417s 00:10:57.615 user 0m10.342s 00:10:57.615 sys 0m0.892s 00:10:57.615 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:10:57.615 15:50:02 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:57.615 ************************************ 00:10:57.615 END TEST raid_read_error_test 00:10:57.615 ************************************ 00:10:57.615 15:50:02 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 2 write 00:10:57.615 15:50:02 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:10:57.615 15:50:02 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:10:57.615 15:50:02 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:10:57.615 ************************************ 00:10:57.615 START TEST raid_write_error_test 00:10:57.615 ************************************ 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 2 write 00:10:57.615 15:50:03 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:10:57.615 15:50:03 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.xYnW5OAmZM 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2646122 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2646122 /var/tmp/spdk-raid.sock 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 2646122 ']' 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:10:57.615 15:50:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:10:57.616 15:50:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:10:57.616 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:10:57.616 15:50:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:10:57.616 15:50:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:10:57.616 [2024-06-10 15:50:03.087107] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:10:57.616 [2024-06-10 15:50:03.087161] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2646122 ] 00:10:57.874 [2024-06-10 15:50:03.186307] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:10:57.874 [2024-06-10 15:50:03.282190] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:10:57.874 [2024-06-10 15:50:03.337886] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:57.874 [2024-06-10 15:50:03.337913] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:10:58.811 15:50:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:10:58.811 15:50:04 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:10:58.811 15:50:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:58.811 15:50:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:10:58.811 BaseBdev1_malloc 00:10:58.811 15:50:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:10:59.071 true 00:10:59.071 15:50:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:10:59.330 [2024-06-10 15:50:04.791305] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:10:59.330 [2024-06-10 15:50:04.791346] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:10:59.330 [2024-06-10 15:50:04.791363] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbab150 00:10:59.330 [2024-06-10 15:50:04.791372] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:10:59.330 [2024-06-10 15:50:04.793192] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:10:59.330 [2024-06-10 15:50:04.793221] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:10:59.330 BaseBdev1 00:10:59.330 15:50:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:10:59.330 15:50:04 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:10:59.589 BaseBdev2_malloc 00:10:59.590 15:50:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:10:59.849 true 00:10:59.849 15:50:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:00.109 [2024-06-10 15:50:05.558097] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:00.109 [2024-06-10 15:50:05.558136] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:00.109 [2024-06-10 15:50:05.558152] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xbafb50 00:11:00.109 [2024-06-10 15:50:05.558162] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:00.109 [2024-06-10 15:50:05.559732] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:00.109 [2024-06-10 15:50:05.559758] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:00.109 BaseBdev2 00:11:00.109 15:50:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:00.368 [2024-06-10 15:50:05.814887] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:00.368 [2024-06-10 15:50:05.816263] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:00.368 [2024-06-10 15:50:05.816460] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xbb0e30 00:11:00.368 [2024-06-10 15:50:05.816474] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 126976, blocklen 512 00:11:00.368 [2024-06-10 15:50:05.816670] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xbb1110 00:11:00.368 [2024-06-10 15:50:05.816819] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xbb0e30 00:11:00.368 [2024-06-10 15:50:05.816827] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xbb0e30 00:11:00.368 [2024-06-10 15:50:05.816932] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:00.368 15:50:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:00.368 15:50:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:00.368 15:50:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:00.368 15:50:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:00.368 15:50:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:00.368 15:50:05 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:00.368 15:50:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:00.368 15:50:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:00.368 15:50:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:00.368 15:50:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:00.368 15:50:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:00.368 15:50:05 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:00.627 15:50:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:00.627 "name": "raid_bdev1", 00:11:00.627 "uuid": "713aa541-9431-4926-85ec-ec211c358f25", 00:11:00.627 "strip_size_kb": 64, 00:11:00.627 "state": "online", 00:11:00.627 "raid_level": "concat", 00:11:00.627 "superblock": true, 00:11:00.627 "num_base_bdevs": 2, 00:11:00.627 "num_base_bdevs_discovered": 2, 00:11:00.627 "num_base_bdevs_operational": 2, 00:11:00.627 "base_bdevs_list": [ 00:11:00.627 { 00:11:00.627 "name": "BaseBdev1", 00:11:00.627 "uuid": "4cc22c51-f827-5ca5-a500-804fd4cfaa31", 00:11:00.627 "is_configured": true, 00:11:00.627 "data_offset": 2048, 00:11:00.627 "data_size": 63488 00:11:00.627 }, 00:11:00.627 { 00:11:00.627 "name": "BaseBdev2", 00:11:00.627 "uuid": "381d3931-46df-56fb-a44d-3274046cb33a", 00:11:00.627 "is_configured": true, 00:11:00.627 "data_offset": 2048, 00:11:00.627 "data_size": 63488 00:11:00.627 } 00:11:00.627 ] 00:11:00.627 }' 00:11:00.627 15:50:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:00.627 15:50:06 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:01.564 
15:50:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:01.564 15:50:06 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:01.564 [2024-06-10 15:50:06.837871] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa00a40 00:11:02.501 15:50:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:02.501 15:50:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:02.501 15:50:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:11:02.501 15:50:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:02.501 15:50:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 2 00:11:02.501 15:50:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:02.501 15:50:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:02.501 15:50:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:11:02.501 15:50:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:02.501 15:50:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:02.501 15:50:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:02.501 15:50:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:02.501 15:50:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:11:02.501 15:50:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:02.501 15:50:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:02.501 15:50:07 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:02.761 15:50:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:02.761 "name": "raid_bdev1", 00:11:02.761 "uuid": "713aa541-9431-4926-85ec-ec211c358f25", 00:11:02.761 "strip_size_kb": 64, 00:11:02.761 "state": "online", 00:11:02.761 "raid_level": "concat", 00:11:02.761 "superblock": true, 00:11:02.761 "num_base_bdevs": 2, 00:11:02.761 "num_base_bdevs_discovered": 2, 00:11:02.761 "num_base_bdevs_operational": 2, 00:11:02.761 "base_bdevs_list": [ 00:11:02.761 { 00:11:02.761 "name": "BaseBdev1", 00:11:02.761 "uuid": "4cc22c51-f827-5ca5-a500-804fd4cfaa31", 00:11:02.761 "is_configured": true, 00:11:02.761 "data_offset": 2048, 00:11:02.761 "data_size": 63488 00:11:02.761 }, 00:11:02.761 { 00:11:02.761 "name": "BaseBdev2", 00:11:02.761 "uuid": "381d3931-46df-56fb-a44d-3274046cb33a", 00:11:02.761 "is_configured": true, 00:11:02.761 "data_offset": 2048, 00:11:02.761 "data_size": 63488 00:11:02.761 } 00:11:02.761 ] 00:11:02.761 }' 00:11:02.761 15:50:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:02.761 15:50:08 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:03.330 15:50:08 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:03.590 [2024-06-10 15:50:09.064187] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:03.590 [2024-06-10 15:50:09.064227] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:11:03.590 [2024-06-10 15:50:09.067619] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:03.590 [2024-06-10 15:50:09.067653] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:03.590 [2024-06-10 15:50:09.067679] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:03.590 [2024-06-10 15:50:09.067687] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xbb0e30 name raid_bdev1, state offline 00:11:03.590 0 00:11:03.590 15:50:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2646122 00:11:03.590 15:50:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 2646122 ']' 00:11:03.590 15:50:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 2646122 00:11:03.590 15:50:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:11:03.590 15:50:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:11:03.590 15:50:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2646122 00:11:03.851 15:50:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:11:03.851 15:50:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:11:03.851 15:50:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2646122' 00:11:03.851 killing process with pid 2646122 00:11:03.851 15:50:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 2646122 00:11:03.851 [2024-06-10 15:50:09.127380] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:03.851 15:50:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 2646122 
00:11:03.851 [2024-06-10 15:50:09.137211] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:03.851 15:50:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.xYnW5OAmZM 00:11:03.851 15:50:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:03.851 15:50:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:03.851 15:50:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:11:03.851 15:50:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:11:03.851 15:50:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:03.851 15:50:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:11:03.851 15:50:09 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:11:03.851 00:11:03.851 real 0m6.329s 00:11:03.851 user 0m10.176s 00:11:03.851 sys 0m0.870s 00:11:03.851 15:50:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:11:03.851 15:50:09 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:03.851 ************************************ 00:11:03.851 END TEST raid_write_error_test 00:11:03.851 ************************************ 00:11:04.177 15:50:09 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:04.177 15:50:09 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 2 false 00:11:04.177 15:50:09 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:11:04.177 15:50:09 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:11:04.177 15:50:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:04.177 ************************************ 00:11:04.177 START TEST raid_state_function_test 00:11:04.177 ************************************ 
00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 false 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:04.177 15:50:09 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2647161 00:11:04.177 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2647161' 00:11:04.177 Process raid pid: 2647161 00:11:04.178 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:04.178 15:50:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2647161 /var/tmp/spdk-raid.sock 00:11:04.178 15:50:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 2647161 ']' 00:11:04.178 15:50:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:04.178 15:50:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:11:04.178 15:50:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:04.178 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:11:04.178 15:50:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:11:04.178 15:50:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:04.178 [2024-06-10 15:50:09.483852] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:11:04.178 [2024-06-10 15:50:09.483908] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:04.178 [2024-06-10 15:50:09.581824] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:04.178 [2024-06-10 15:50:09.680488] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:11:04.436 [2024-06-10 15:50:09.743296] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:04.436 [2024-06-10 15:50:09.743323] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:05.004 15:50:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:11:05.004 15:50:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:11:05.004 15:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:05.263 [2024-06-10 15:50:10.678322] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:05.263 [2024-06-10 15:50:10.678361] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:05.263 [2024-06-10 15:50:10.678371] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:05.263 [2024-06-10 15:50:10.678380] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't 
exist now 00:11:05.263 15:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:05.263 15:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:05.263 15:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:05.263 15:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:05.263 15:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:05.263 15:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:05.263 15:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:05.263 15:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:05.263 15:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:05.263 15:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:05.263 15:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:05.263 15:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:05.521 15:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:05.522 "name": "Existed_Raid", 00:11:05.522 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:05.522 "strip_size_kb": 0, 00:11:05.522 "state": "configuring", 00:11:05.522 "raid_level": "raid1", 00:11:05.522 "superblock": false, 00:11:05.522 "num_base_bdevs": 2, 00:11:05.522 "num_base_bdevs_discovered": 0, 00:11:05.522 "num_base_bdevs_operational": 2, 00:11:05.522 "base_bdevs_list": 
[ 00:11:05.522 { 00:11:05.522 "name": "BaseBdev1", 00:11:05.522 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:05.522 "is_configured": false, 00:11:05.522 "data_offset": 0, 00:11:05.522 "data_size": 0 00:11:05.522 }, 00:11:05.522 { 00:11:05.522 "name": "BaseBdev2", 00:11:05.522 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:05.522 "is_configured": false, 00:11:05.522 "data_offset": 0, 00:11:05.522 "data_size": 0 00:11:05.522 } 00:11:05.522 ] 00:11:05.522 }' 00:11:05.522 15:50:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:05.522 15:50:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:06.089 15:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:06.348 [2024-06-10 15:50:11.825247] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:06.348 [2024-06-10 15:50:11.825277] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21ac120 name Existed_Raid, state configuring 00:11:06.348 15:50:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:06.607 [2024-06-10 15:50:12.085941] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:06.607 [2024-06-10 15:50:12.085975] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:06.607 [2024-06-10 15:50:12.085983] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:06.607 [2024-06-10 15:50:12.085992] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:06.607 15:50:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:06.865 [2024-06-10 15:50:12.348216] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:06.865 BaseBdev1 00:11:06.865 15:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:06.865 15:50:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:11:06.865 15:50:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:06.865 15:50:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:11:06.865 15:50:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:06.865 15:50:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:06.865 15:50:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:07.124 15:50:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:07.384 [ 00:11:07.384 { 00:11:07.384 "name": "BaseBdev1", 00:11:07.384 "aliases": [ 00:11:07.384 "ddb0b5bd-3558-4131-acee-850042ffa466" 00:11:07.384 ], 00:11:07.384 "product_name": "Malloc disk", 00:11:07.384 "block_size": 512, 00:11:07.384 "num_blocks": 65536, 00:11:07.384 "uuid": "ddb0b5bd-3558-4131-acee-850042ffa466", 00:11:07.384 "assigned_rate_limits": { 00:11:07.384 "rw_ios_per_sec": 0, 00:11:07.384 "rw_mbytes_per_sec": 0, 00:11:07.384 "r_mbytes_per_sec": 0, 00:11:07.384 "w_mbytes_per_sec": 0 00:11:07.384 }, 00:11:07.384 "claimed": true, 00:11:07.384 "claim_type": 
"exclusive_write", 00:11:07.384 "zoned": false, 00:11:07.384 "supported_io_types": { 00:11:07.384 "read": true, 00:11:07.384 "write": true, 00:11:07.384 "unmap": true, 00:11:07.384 "write_zeroes": true, 00:11:07.384 "flush": true, 00:11:07.384 "reset": true, 00:11:07.384 "compare": false, 00:11:07.384 "compare_and_write": false, 00:11:07.384 "abort": true, 00:11:07.384 "nvme_admin": false, 00:11:07.384 "nvme_io": false 00:11:07.384 }, 00:11:07.384 "memory_domains": [ 00:11:07.384 { 00:11:07.384 "dma_device_id": "system", 00:11:07.384 "dma_device_type": 1 00:11:07.384 }, 00:11:07.384 { 00:11:07.384 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:07.384 "dma_device_type": 2 00:11:07.384 } 00:11:07.384 ], 00:11:07.384 "driver_specific": {} 00:11:07.384 } 00:11:07.384 ] 00:11:07.384 15:50:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:11:07.384 15:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:07.384 15:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:07.384 15:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:07.384 15:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:07.384 15:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:07.384 15:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:07.384 15:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:07.384 15:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:07.384 15:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:07.384 15:50:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:11:07.384 15:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:07.384 15:50:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:07.644 15:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:07.644 "name": "Existed_Raid", 00:11:07.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:07.644 "strip_size_kb": 0, 00:11:07.644 "state": "configuring", 00:11:07.644 "raid_level": "raid1", 00:11:07.644 "superblock": false, 00:11:07.644 "num_base_bdevs": 2, 00:11:07.644 "num_base_bdevs_discovered": 1, 00:11:07.644 "num_base_bdevs_operational": 2, 00:11:07.644 "base_bdevs_list": [ 00:11:07.644 { 00:11:07.644 "name": "BaseBdev1", 00:11:07.644 "uuid": "ddb0b5bd-3558-4131-acee-850042ffa466", 00:11:07.644 "is_configured": true, 00:11:07.644 "data_offset": 0, 00:11:07.644 "data_size": 65536 00:11:07.644 }, 00:11:07.644 { 00:11:07.644 "name": "BaseBdev2", 00:11:07.644 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:07.644 "is_configured": false, 00:11:07.644 "data_offset": 0, 00:11:07.644 "data_size": 0 00:11:07.644 } 00:11:07.644 ] 00:11:07.644 }' 00:11:07.644 15:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:07.644 15:50:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:08.580 15:50:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:08.580 [2024-06-10 15:50:13.996606] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:08.580 [2024-06-10 15:50:13.996642] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21ab9f0 name 
Existed_Raid, state configuring 00:11:08.580 15:50:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:08.838 [2024-06-10 15:50:14.253316] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:08.838 [2024-06-10 15:50:14.254836] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:08.838 [2024-06-10 15:50:14.254867] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:08.838 15:50:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:08.839 15:50:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:08.839 15:50:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:08.839 15:50:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:08.839 15:50:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:08.839 15:50:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:08.839 15:50:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:08.839 15:50:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:08.839 15:50:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:08.839 15:50:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:08.839 15:50:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:08.839 15:50:14 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:11:08.839 15:50:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:08.839 15:50:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:09.097 15:50:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:09.097 "name": "Existed_Raid", 00:11:09.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:09.097 "strip_size_kb": 0, 00:11:09.097 "state": "configuring", 00:11:09.097 "raid_level": "raid1", 00:11:09.097 "superblock": false, 00:11:09.097 "num_base_bdevs": 2, 00:11:09.097 "num_base_bdevs_discovered": 1, 00:11:09.097 "num_base_bdevs_operational": 2, 00:11:09.097 "base_bdevs_list": [ 00:11:09.097 { 00:11:09.097 "name": "BaseBdev1", 00:11:09.097 "uuid": "ddb0b5bd-3558-4131-acee-850042ffa466", 00:11:09.097 "is_configured": true, 00:11:09.097 "data_offset": 0, 00:11:09.097 "data_size": 65536 00:11:09.097 }, 00:11:09.097 { 00:11:09.097 "name": "BaseBdev2", 00:11:09.097 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:09.097 "is_configured": false, 00:11:09.097 "data_offset": 0, 00:11:09.097 "data_size": 0 00:11:09.097 } 00:11:09.097 ] 00:11:09.097 }' 00:11:09.097 15:50:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:09.097 15:50:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:09.665 15:50:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:09.924 [2024-06-10 15:50:15.387752] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:09.924 [2024-06-10 15:50:15.387785] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device 
register 0x21ac770 00:11:09.924 [2024-06-10 15:50:15.387791] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:11:09.924 [2024-06-10 15:50:15.387997] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21adeb0 00:11:09.924 [2024-06-10 15:50:15.388123] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21ac770 00:11:09.924 [2024-06-10 15:50:15.388131] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x21ac770 00:11:09.924 [2024-06-10 15:50:15.388294] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:09.924 BaseBdev2 00:11:09.924 15:50:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:09.924 15:50:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:11:09.924 15:50:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:09.924 15:50:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:11:09.924 15:50:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:09.924 15:50:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:09.924 15:50:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:10.182 15:50:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:10.442 [ 00:11:10.442 { 00:11:10.442 "name": "BaseBdev2", 00:11:10.442 "aliases": [ 00:11:10.442 "0386e6ee-acdd-4a86-b00f-79d84fb38e1a" 00:11:10.442 ], 00:11:10.442 "product_name": "Malloc disk", 00:11:10.442 
"block_size": 512, 00:11:10.442 "num_blocks": 65536, 00:11:10.442 "uuid": "0386e6ee-acdd-4a86-b00f-79d84fb38e1a", 00:11:10.442 "assigned_rate_limits": { 00:11:10.442 "rw_ios_per_sec": 0, 00:11:10.442 "rw_mbytes_per_sec": 0, 00:11:10.442 "r_mbytes_per_sec": 0, 00:11:10.442 "w_mbytes_per_sec": 0 00:11:10.442 }, 00:11:10.442 "claimed": true, 00:11:10.442 "claim_type": "exclusive_write", 00:11:10.442 "zoned": false, 00:11:10.442 "supported_io_types": { 00:11:10.442 "read": true, 00:11:10.442 "write": true, 00:11:10.442 "unmap": true, 00:11:10.442 "write_zeroes": true, 00:11:10.442 "flush": true, 00:11:10.442 "reset": true, 00:11:10.442 "compare": false, 00:11:10.442 "compare_and_write": false, 00:11:10.442 "abort": true, 00:11:10.442 "nvme_admin": false, 00:11:10.442 "nvme_io": false 00:11:10.442 }, 00:11:10.442 "memory_domains": [ 00:11:10.442 { 00:11:10.442 "dma_device_id": "system", 00:11:10.442 "dma_device_type": 1 00:11:10.442 }, 00:11:10.442 { 00:11:10.442 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:10.442 "dma_device_type": 2 00:11:10.442 } 00:11:10.442 ], 00:11:10.442 "driver_specific": {} 00:11:10.442 } 00:11:10.442 ] 00:11:10.442 15:50:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:11:10.442 15:50:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:10.442 15:50:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:10.442 15:50:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:11:10.442 15:50:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:10.442 15:50:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:10.442 15:50:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:10.442 15:50:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:10.442 15:50:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:10.442 15:50:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:10.442 15:50:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:10.442 15:50:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:10.442 15:50:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:10.442 15:50:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:10.442 15:50:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:10.701 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:10.701 "name": "Existed_Raid", 00:11:10.701 "uuid": "145cecbc-0aed-44e4-9e01-e5ef07489d36", 00:11:10.701 "strip_size_kb": 0, 00:11:10.701 "state": "online", 00:11:10.701 "raid_level": "raid1", 00:11:10.701 "superblock": false, 00:11:10.701 "num_base_bdevs": 2, 00:11:10.701 "num_base_bdevs_discovered": 2, 00:11:10.701 "num_base_bdevs_operational": 2, 00:11:10.701 "base_bdevs_list": [ 00:11:10.701 { 00:11:10.701 "name": "BaseBdev1", 00:11:10.701 "uuid": "ddb0b5bd-3558-4131-acee-850042ffa466", 00:11:10.701 "is_configured": true, 00:11:10.701 "data_offset": 0, 00:11:10.701 "data_size": 65536 00:11:10.701 }, 00:11:10.701 { 00:11:10.701 "name": "BaseBdev2", 00:11:10.701 "uuid": "0386e6ee-acdd-4a86-b00f-79d84fb38e1a", 00:11:10.701 "is_configured": true, 00:11:10.701 "data_offset": 0, 00:11:10.701 "data_size": 65536 00:11:10.701 } 00:11:10.701 ] 00:11:10.701 }' 00:11:10.701 15:50:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:10.701 15:50:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:11.639 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:11.639 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:11.639 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:11.639 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:11.639 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:11.639 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:11.639 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:11.639 15:50:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:11.639 [2024-06-10 15:50:17.036555] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:11.639 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:11.639 "name": "Existed_Raid", 00:11:11.639 "aliases": [ 00:11:11.639 "145cecbc-0aed-44e4-9e01-e5ef07489d36" 00:11:11.639 ], 00:11:11.639 "product_name": "Raid Volume", 00:11:11.639 "block_size": 512, 00:11:11.639 "num_blocks": 65536, 00:11:11.639 "uuid": "145cecbc-0aed-44e4-9e01-e5ef07489d36", 00:11:11.639 "assigned_rate_limits": { 00:11:11.639 "rw_ios_per_sec": 0, 00:11:11.639 "rw_mbytes_per_sec": 0, 00:11:11.639 "r_mbytes_per_sec": 0, 00:11:11.639 "w_mbytes_per_sec": 0 00:11:11.639 }, 00:11:11.639 "claimed": false, 00:11:11.639 "zoned": false, 00:11:11.639 "supported_io_types": { 00:11:11.639 "read": 
true, 00:11:11.639 "write": true, 00:11:11.639 "unmap": false, 00:11:11.639 "write_zeroes": true, 00:11:11.639 "flush": false, 00:11:11.639 "reset": true, 00:11:11.639 "compare": false, 00:11:11.639 "compare_and_write": false, 00:11:11.639 "abort": false, 00:11:11.639 "nvme_admin": false, 00:11:11.639 "nvme_io": false 00:11:11.639 }, 00:11:11.639 "memory_domains": [ 00:11:11.639 { 00:11:11.639 "dma_device_id": "system", 00:11:11.639 "dma_device_type": 1 00:11:11.639 }, 00:11:11.639 { 00:11:11.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:11.639 "dma_device_type": 2 00:11:11.639 }, 00:11:11.639 { 00:11:11.639 "dma_device_id": "system", 00:11:11.639 "dma_device_type": 1 00:11:11.639 }, 00:11:11.639 { 00:11:11.639 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:11.639 "dma_device_type": 2 00:11:11.639 } 00:11:11.639 ], 00:11:11.639 "driver_specific": { 00:11:11.639 "raid": { 00:11:11.639 "uuid": "145cecbc-0aed-44e4-9e01-e5ef07489d36", 00:11:11.639 "strip_size_kb": 0, 00:11:11.639 "state": "online", 00:11:11.639 "raid_level": "raid1", 00:11:11.639 "superblock": false, 00:11:11.639 "num_base_bdevs": 2, 00:11:11.639 "num_base_bdevs_discovered": 2, 00:11:11.639 "num_base_bdevs_operational": 2, 00:11:11.639 "base_bdevs_list": [ 00:11:11.639 { 00:11:11.639 "name": "BaseBdev1", 00:11:11.639 "uuid": "ddb0b5bd-3558-4131-acee-850042ffa466", 00:11:11.639 "is_configured": true, 00:11:11.639 "data_offset": 0, 00:11:11.639 "data_size": 65536 00:11:11.639 }, 00:11:11.639 { 00:11:11.639 "name": "BaseBdev2", 00:11:11.639 "uuid": "0386e6ee-acdd-4a86-b00f-79d84fb38e1a", 00:11:11.639 "is_configured": true, 00:11:11.639 "data_offset": 0, 00:11:11.639 "data_size": 65536 00:11:11.639 } 00:11:11.639 ] 00:11:11.639 } 00:11:11.639 } 00:11:11.639 }' 00:11:11.639 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:11.639 15:50:17 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:11.639 BaseBdev2' 00:11:11.639 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:11.639 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:11.639 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:11.898 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:11.898 "name": "BaseBdev1", 00:11:11.898 "aliases": [ 00:11:11.898 "ddb0b5bd-3558-4131-acee-850042ffa466" 00:11:11.898 ], 00:11:11.898 "product_name": "Malloc disk", 00:11:11.898 "block_size": 512, 00:11:11.898 "num_blocks": 65536, 00:11:11.898 "uuid": "ddb0b5bd-3558-4131-acee-850042ffa466", 00:11:11.898 "assigned_rate_limits": { 00:11:11.898 "rw_ios_per_sec": 0, 00:11:11.898 "rw_mbytes_per_sec": 0, 00:11:11.899 "r_mbytes_per_sec": 0, 00:11:11.899 "w_mbytes_per_sec": 0 00:11:11.899 }, 00:11:11.899 "claimed": true, 00:11:11.899 "claim_type": "exclusive_write", 00:11:11.899 "zoned": false, 00:11:11.899 "supported_io_types": { 00:11:11.899 "read": true, 00:11:11.899 "write": true, 00:11:11.899 "unmap": true, 00:11:11.899 "write_zeroes": true, 00:11:11.899 "flush": true, 00:11:11.899 "reset": true, 00:11:11.899 "compare": false, 00:11:11.899 "compare_and_write": false, 00:11:11.899 "abort": true, 00:11:11.899 "nvme_admin": false, 00:11:11.899 "nvme_io": false 00:11:11.899 }, 00:11:11.899 "memory_domains": [ 00:11:11.899 { 00:11:11.899 "dma_device_id": "system", 00:11:11.899 "dma_device_type": 1 00:11:11.899 }, 00:11:11.899 { 00:11:11.899 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:11.899 "dma_device_type": 2 00:11:11.899 } 00:11:11.899 ], 00:11:11.899 "driver_specific": {} 00:11:11.899 }' 00:11:11.899 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 
-- # jq .block_size 00:11:11.899 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:12.158 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:12.158 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:12.158 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:12.158 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:12.158 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:12.158 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:12.158 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:12.158 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:12.416 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:12.416 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:12.416 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:12.416 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:12.416 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:12.675 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:12.675 "name": "BaseBdev2", 00:11:12.675 "aliases": [ 00:11:12.675 "0386e6ee-acdd-4a86-b00f-79d84fb38e1a" 00:11:12.675 ], 00:11:12.675 "product_name": "Malloc disk", 00:11:12.675 "block_size": 512, 00:11:12.675 "num_blocks": 65536, 00:11:12.675 "uuid": "0386e6ee-acdd-4a86-b00f-79d84fb38e1a", 00:11:12.675 
"assigned_rate_limits": { 00:11:12.675 "rw_ios_per_sec": 0, 00:11:12.675 "rw_mbytes_per_sec": 0, 00:11:12.675 "r_mbytes_per_sec": 0, 00:11:12.675 "w_mbytes_per_sec": 0 00:11:12.675 }, 00:11:12.675 "claimed": true, 00:11:12.675 "claim_type": "exclusive_write", 00:11:12.675 "zoned": false, 00:11:12.675 "supported_io_types": { 00:11:12.675 "read": true, 00:11:12.675 "write": true, 00:11:12.675 "unmap": true, 00:11:12.675 "write_zeroes": true, 00:11:12.675 "flush": true, 00:11:12.675 "reset": true, 00:11:12.675 "compare": false, 00:11:12.675 "compare_and_write": false, 00:11:12.675 "abort": true, 00:11:12.675 "nvme_admin": false, 00:11:12.675 "nvme_io": false 00:11:12.675 }, 00:11:12.675 "memory_domains": [ 00:11:12.675 { 00:11:12.675 "dma_device_id": "system", 00:11:12.675 "dma_device_type": 1 00:11:12.675 }, 00:11:12.675 { 00:11:12.675 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:12.675 "dma_device_type": 2 00:11:12.675 } 00:11:12.675 ], 00:11:12.675 "driver_specific": {} 00:11:12.675 }' 00:11:12.675 15:50:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:12.675 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:12.675 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:12.675 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:12.676 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:12.676 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:12.676 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:12.935 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:12.935 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:12.935 15:50:18 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:12.935 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:12.935 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:12.935 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:13.195 [2024-06-10 15:50:18.536357] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:13.195 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:13.195 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:11:13.195 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:13.195 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:11:13.195 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:11:13.195 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:11:13.195 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:13.195 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:13.195 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:13.195 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:13.195 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:13.195 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:13.195 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:11:13.195 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:13.195 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:13.195 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:13.195 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:13.453 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:13.453 "name": "Existed_Raid", 00:11:13.453 "uuid": "145cecbc-0aed-44e4-9e01-e5ef07489d36", 00:11:13.453 "strip_size_kb": 0, 00:11:13.453 "state": "online", 00:11:13.453 "raid_level": "raid1", 00:11:13.454 "superblock": false, 00:11:13.454 "num_base_bdevs": 2, 00:11:13.454 "num_base_bdevs_discovered": 1, 00:11:13.454 "num_base_bdevs_operational": 1, 00:11:13.454 "base_bdevs_list": [ 00:11:13.454 { 00:11:13.454 "name": null, 00:11:13.454 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:13.454 "is_configured": false, 00:11:13.454 "data_offset": 0, 00:11:13.454 "data_size": 65536 00:11:13.454 }, 00:11:13.454 { 00:11:13.454 "name": "BaseBdev2", 00:11:13.454 "uuid": "0386e6ee-acdd-4a86-b00f-79d84fb38e1a", 00:11:13.454 "is_configured": true, 00:11:13.454 "data_offset": 0, 00:11:13.454 "data_size": 65536 00:11:13.454 } 00:11:13.454 ] 00:11:13.454 }' 00:11:13.454 15:50:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:13.454 15:50:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:14.022 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:14.022 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:14.022 15:50:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:14.022 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:14.280 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:14.280 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:14.280 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:14.540 [2024-06-10 15:50:19.816941] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:14.540 [2024-06-10 15:50:19.817024] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:14.540 [2024-06-10 15:50:19.827819] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:14.540 [2024-06-10 15:50:19.827852] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:14.540 [2024-06-10 15:50:19.827860] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21ac770 name Existed_Raid, state offline 00:11:14.540 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:14.540 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:14.540 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:14.540 15:50:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:14.799 15:50:20 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:14.799 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:14.799 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:14.799 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2647161 00:11:14.799 15:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 2647161 ']' 00:11:14.799 15:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 2647161 00:11:14.799 15:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:11:14.799 15:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:11:14.799 15:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2647161 00:11:14.799 15:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:11:14.799 15:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:11:14.799 15:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2647161' 00:11:14.799 killing process with pid 2647161 00:11:14.799 15:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 2647161 00:11:14.799 [2024-06-10 15:50:20.145446] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:14.799 15:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 2647161 00:11:14.799 [2024-06-10 15:50:20.146307] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:15.058 15:50:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:11:15.058 00:11:15.058 real 0m10.922s 00:11:15.058 user 0m19.967s 00:11:15.058 sys 0m1.539s 00:11:15.058 15:50:20 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:11:15.058 15:50:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:15.059 ************************************ 00:11:15.059 END TEST raid_state_function_test 00:11:15.059 ************************************ 00:11:15.059 15:50:20 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 2 true 00:11:15.059 15:50:20 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:11:15.059 15:50:20 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:11:15.059 15:50:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:15.059 ************************************ 00:11:15.059 START TEST raid_state_function_test_sb 00:11:15.059 ************************************ 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 
00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2649205 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2649205' 00:11:15.059 Process raid pid: 2649205 00:11:15.059 15:50:20 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2649205 /var/tmp/spdk-raid.sock 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 2649205 ']' 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:15.059 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:11:15.059 15:50:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:15.059 [2024-06-10 15:50:20.443346] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:11:15.059 [2024-06-10 15:50:20.443388] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:15.059 [2024-06-10 15:50:20.527385] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:15.318 [2024-06-10 15:50:20.624535] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:11:15.318 [2024-06-10 15:50:20.687186] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:15.318 [2024-06-10 15:50:20.687218] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:15.319 15:50:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:11:15.319 15:50:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:11:15.319 15:50:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:15.578 [2024-06-10 15:50:20.981602] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:15.578 [2024-06-10 15:50:20.981639] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:15.578 [2024-06-10 15:50:20.981648] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:15.578 [2024-06-10 15:50:20.981657] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:15.578 15:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:15.578 15:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:15.578 15:50:21 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:15.578 15:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:15.578 15:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:15.578 15:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:15.578 15:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:15.578 15:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:15.578 15:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:15.578 15:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:15.578 15:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:15.578 15:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:15.837 15:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:15.837 "name": "Existed_Raid", 00:11:15.837 "uuid": "2e31656d-de6c-45fd-9d37-f43fbe74c97f", 00:11:15.837 "strip_size_kb": 0, 00:11:15.837 "state": "configuring", 00:11:15.837 "raid_level": "raid1", 00:11:15.837 "superblock": true, 00:11:15.837 "num_base_bdevs": 2, 00:11:15.837 "num_base_bdevs_discovered": 0, 00:11:15.837 "num_base_bdevs_operational": 2, 00:11:15.837 "base_bdevs_list": [ 00:11:15.837 { 00:11:15.837 "name": "BaseBdev1", 00:11:15.837 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:15.837 "is_configured": false, 00:11:15.837 "data_offset": 0, 00:11:15.837 "data_size": 0 00:11:15.838 }, 00:11:15.838 { 00:11:15.838 "name": 
"BaseBdev2", 00:11:15.838 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:15.838 "is_configured": false, 00:11:15.838 "data_offset": 0, 00:11:15.838 "data_size": 0 00:11:15.838 } 00:11:15.838 ] 00:11:15.838 }' 00:11:15.838 15:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:15.838 15:50:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:16.405 15:50:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:16.664 [2024-06-10 15:50:22.088412] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:16.664 [2024-06-10 15:50:22.088439] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25a4120 name Existed_Raid, state configuring 00:11:16.664 15:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:16.922 [2024-06-10 15:50:22.248856] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:16.922 [2024-06-10 15:50:22.248882] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:16.923 [2024-06-10 15:50:22.248889] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:16.923 [2024-06-10 15:50:22.248897] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:16.923 15:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:17.181 [2024-06-10 15:50:22.507082] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev1 is claimed 00:11:17.181 BaseBdev1 00:11:17.181 15:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:17.181 15:50:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:11:17.181 15:50:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:17.181 15:50:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:11:17.181 15:50:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:17.181 15:50:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:17.181 15:50:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:17.181 15:50:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:17.441 [ 00:11:17.441 { 00:11:17.441 "name": "BaseBdev1", 00:11:17.441 "aliases": [ 00:11:17.441 "409da650-ab5c-4524-9904-f42810106791" 00:11:17.441 ], 00:11:17.441 "product_name": "Malloc disk", 00:11:17.441 "block_size": 512, 00:11:17.441 "num_blocks": 65536, 00:11:17.441 "uuid": "409da650-ab5c-4524-9904-f42810106791", 00:11:17.441 "assigned_rate_limits": { 00:11:17.441 "rw_ios_per_sec": 0, 00:11:17.441 "rw_mbytes_per_sec": 0, 00:11:17.441 "r_mbytes_per_sec": 0, 00:11:17.441 "w_mbytes_per_sec": 0 00:11:17.441 }, 00:11:17.441 "claimed": true, 00:11:17.441 "claim_type": "exclusive_write", 00:11:17.441 "zoned": false, 00:11:17.441 "supported_io_types": { 00:11:17.441 "read": true, 00:11:17.441 "write": true, 00:11:17.441 "unmap": true, 00:11:17.441 "write_zeroes": true, 00:11:17.441 "flush": true, 00:11:17.441 
"reset": true, 00:11:17.441 "compare": false, 00:11:17.441 "compare_and_write": false, 00:11:17.441 "abort": true, 00:11:17.441 "nvme_admin": false, 00:11:17.441 "nvme_io": false 00:11:17.441 }, 00:11:17.441 "memory_domains": [ 00:11:17.441 { 00:11:17.441 "dma_device_id": "system", 00:11:17.441 "dma_device_type": 1 00:11:17.441 }, 00:11:17.441 { 00:11:17.441 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:17.441 "dma_device_type": 2 00:11:17.441 } 00:11:17.441 ], 00:11:17.441 "driver_specific": {} 00:11:17.441 } 00:11:17.441 ] 00:11:17.441 15:50:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:11:17.441 15:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:17.441 15:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:17.441 15:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:17.441 15:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:17.441 15:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:17.441 15:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:17.441 15:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:17.441 15:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:17.441 15:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:17.441 15:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:17.441 15:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:17.441 15:50:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:17.700 15:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:17.700 "name": "Existed_Raid", 00:11:17.700 "uuid": "b57ce500-c12d-4505-a331-ef84f2a693ea", 00:11:17.700 "strip_size_kb": 0, 00:11:17.700 "state": "configuring", 00:11:17.700 "raid_level": "raid1", 00:11:17.700 "superblock": true, 00:11:17.700 "num_base_bdevs": 2, 00:11:17.700 "num_base_bdevs_discovered": 1, 00:11:17.700 "num_base_bdevs_operational": 2, 00:11:17.700 "base_bdevs_list": [ 00:11:17.700 { 00:11:17.700 "name": "BaseBdev1", 00:11:17.700 "uuid": "409da650-ab5c-4524-9904-f42810106791", 00:11:17.700 "is_configured": true, 00:11:17.700 "data_offset": 2048, 00:11:17.700 "data_size": 63488 00:11:17.700 }, 00:11:17.700 { 00:11:17.700 "name": "BaseBdev2", 00:11:17.700 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:17.700 "is_configured": false, 00:11:17.700 "data_offset": 0, 00:11:17.700 "data_size": 0 00:11:17.700 } 00:11:17.700 ] 00:11:17.700 }' 00:11:17.700 15:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:17.700 15:50:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:18.637 15:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:18.637 [2024-06-10 15:50:23.934869] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:18.637 [2024-06-10 15:50:23.934908] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25a39f0 name Existed_Raid, state configuring 00:11:18.637 15:50:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:11:18.936 [2024-06-10 15:50:24.183558] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:18.936 [2024-06-10 15:50:24.185095] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:18.936 [2024-06-10 15:50:24.185125] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:18.936 15:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:18.936 15:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:18.936 15:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:11:18.936 15:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:18.936 15:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:18.936 15:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:18.936 15:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:18.936 15:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:18.936 15:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:18.936 15:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:18.936 15:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:18.936 15:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:18.936 15:50:24 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:18.936 15:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:19.227 15:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:19.227 "name": "Existed_Raid", 00:11:19.227 "uuid": "a53611fb-1cb7-4e58-8563-0072b91564b2", 00:11:19.227 "strip_size_kb": 0, 00:11:19.227 "state": "configuring", 00:11:19.227 "raid_level": "raid1", 00:11:19.227 "superblock": true, 00:11:19.227 "num_base_bdevs": 2, 00:11:19.227 "num_base_bdevs_discovered": 1, 00:11:19.227 "num_base_bdevs_operational": 2, 00:11:19.227 "base_bdevs_list": [ 00:11:19.227 { 00:11:19.227 "name": "BaseBdev1", 00:11:19.227 "uuid": "409da650-ab5c-4524-9904-f42810106791", 00:11:19.227 "is_configured": true, 00:11:19.227 "data_offset": 2048, 00:11:19.227 "data_size": 63488 00:11:19.227 }, 00:11:19.227 { 00:11:19.227 "name": "BaseBdev2", 00:11:19.227 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:19.227 "is_configured": false, 00:11:19.227 "data_offset": 0, 00:11:19.227 "data_size": 0 00:11:19.227 } 00:11:19.227 ] 00:11:19.227 }' 00:11:19.227 15:50:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:19.227 15:50:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:19.795 15:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:20.054 [2024-06-10 15:50:25.313853] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:20.054 [2024-06-10 15:50:25.314015] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25a4770 00:11:20.054 
[2024-06-10 15:50:25.314029] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:20.054 [2024-06-10 15:50:25.314210] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x25a5eb0 00:11:20.054 [2024-06-10 15:50:25.314333] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25a4770 00:11:20.054 [2024-06-10 15:50:25.314341] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x25a4770 00:11:20.054 [2024-06-10 15:50:25.314434] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:20.054 BaseBdev2 00:11:20.054 15:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:20.054 15:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:11:20.054 15:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:20.054 15:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:11:20.054 15:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:20.054 15:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:20.054 15:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:20.312 15:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:11:20.312 [ 00:11:20.312 { 00:11:20.312 "name": "BaseBdev2", 00:11:20.312 "aliases": [ 00:11:20.312 "31739a72-56b7-425e-86ce-16cee5494170" 00:11:20.312 ], 00:11:20.312 "product_name": "Malloc disk", 00:11:20.312 "block_size": 512, 
00:11:20.312 "num_blocks": 65536, 00:11:20.312 "uuid": "31739a72-56b7-425e-86ce-16cee5494170", 00:11:20.312 "assigned_rate_limits": { 00:11:20.312 "rw_ios_per_sec": 0, 00:11:20.312 "rw_mbytes_per_sec": 0, 00:11:20.312 "r_mbytes_per_sec": 0, 00:11:20.312 "w_mbytes_per_sec": 0 00:11:20.312 }, 00:11:20.312 "claimed": true, 00:11:20.312 "claim_type": "exclusive_write", 00:11:20.312 "zoned": false, 00:11:20.312 "supported_io_types": { 00:11:20.312 "read": true, 00:11:20.312 "write": true, 00:11:20.312 "unmap": true, 00:11:20.312 "write_zeroes": true, 00:11:20.312 "flush": true, 00:11:20.312 "reset": true, 00:11:20.312 "compare": false, 00:11:20.312 "compare_and_write": false, 00:11:20.312 "abort": true, 00:11:20.312 "nvme_admin": false, 00:11:20.312 "nvme_io": false 00:11:20.312 }, 00:11:20.312 "memory_domains": [ 00:11:20.312 { 00:11:20.312 "dma_device_id": "system", 00:11:20.312 "dma_device_type": 1 00:11:20.312 }, 00:11:20.312 { 00:11:20.312 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:20.312 "dma_device_type": 2 00:11:20.312 } 00:11:20.312 ], 00:11:20.312 "driver_specific": {} 00:11:20.312 } 00:11:20.312 ] 00:11:20.312 15:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:11:20.312 15:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:11:20.312 15:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:20.312 15:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:11:20.313 15:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:20.313 15:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:20.313 15:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:20.313 15:50:25 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:20.313 15:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:20.313 15:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:20.313 15:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:20.313 15:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:20.313 15:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:20.313 15:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:20.313 15:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:20.570 15:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:20.570 "name": "Existed_Raid", 00:11:20.570 "uuid": "a53611fb-1cb7-4e58-8563-0072b91564b2", 00:11:20.570 "strip_size_kb": 0, 00:11:20.570 "state": "online", 00:11:20.570 "raid_level": "raid1", 00:11:20.570 "superblock": true, 00:11:20.570 "num_base_bdevs": 2, 00:11:20.570 "num_base_bdevs_discovered": 2, 00:11:20.570 "num_base_bdevs_operational": 2, 00:11:20.570 "base_bdevs_list": [ 00:11:20.570 { 00:11:20.570 "name": "BaseBdev1", 00:11:20.570 "uuid": "409da650-ab5c-4524-9904-f42810106791", 00:11:20.570 "is_configured": true, 00:11:20.570 "data_offset": 2048, 00:11:20.570 "data_size": 63488 00:11:20.570 }, 00:11:20.570 { 00:11:20.570 "name": "BaseBdev2", 00:11:20.570 "uuid": "31739a72-56b7-425e-86ce-16cee5494170", 00:11:20.570 "is_configured": true, 00:11:20.570 "data_offset": 2048, 00:11:20.570 "data_size": 63488 00:11:20.570 } 00:11:20.570 ] 00:11:20.570 }' 00:11:20.570 
15:50:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:20.570 15:50:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:21.134 15:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:11:21.134 15:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:11:21.134 15:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:21.134 15:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:21.134 15:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:21.134 15:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:11:21.134 15:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:11:21.134 15:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:21.391 [2024-06-10 15:50:26.741938] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:21.391 15:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:21.391 "name": "Existed_Raid", 00:11:21.391 "aliases": [ 00:11:21.391 "a53611fb-1cb7-4e58-8563-0072b91564b2" 00:11:21.391 ], 00:11:21.391 "product_name": "Raid Volume", 00:11:21.391 "block_size": 512, 00:11:21.391 "num_blocks": 63488, 00:11:21.391 "uuid": "a53611fb-1cb7-4e58-8563-0072b91564b2", 00:11:21.391 "assigned_rate_limits": { 00:11:21.391 "rw_ios_per_sec": 0, 00:11:21.391 "rw_mbytes_per_sec": 0, 00:11:21.391 "r_mbytes_per_sec": 0, 00:11:21.391 "w_mbytes_per_sec": 0 00:11:21.391 }, 00:11:21.391 "claimed": false, 00:11:21.391 "zoned": false, 00:11:21.391 
"supported_io_types": { 00:11:21.391 "read": true, 00:11:21.391 "write": true, 00:11:21.391 "unmap": false, 00:11:21.391 "write_zeroes": true, 00:11:21.391 "flush": false, 00:11:21.391 "reset": true, 00:11:21.391 "compare": false, 00:11:21.391 "compare_and_write": false, 00:11:21.391 "abort": false, 00:11:21.391 "nvme_admin": false, 00:11:21.391 "nvme_io": false 00:11:21.391 }, 00:11:21.391 "memory_domains": [ 00:11:21.391 { 00:11:21.391 "dma_device_id": "system", 00:11:21.391 "dma_device_type": 1 00:11:21.391 }, 00:11:21.391 { 00:11:21.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:21.391 "dma_device_type": 2 00:11:21.391 }, 00:11:21.391 { 00:11:21.391 "dma_device_id": "system", 00:11:21.391 "dma_device_type": 1 00:11:21.391 }, 00:11:21.391 { 00:11:21.391 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:21.391 "dma_device_type": 2 00:11:21.391 } 00:11:21.391 ], 00:11:21.391 "driver_specific": { 00:11:21.391 "raid": { 00:11:21.391 "uuid": "a53611fb-1cb7-4e58-8563-0072b91564b2", 00:11:21.391 "strip_size_kb": 0, 00:11:21.391 "state": "online", 00:11:21.391 "raid_level": "raid1", 00:11:21.391 "superblock": true, 00:11:21.391 "num_base_bdevs": 2, 00:11:21.391 "num_base_bdevs_discovered": 2, 00:11:21.391 "num_base_bdevs_operational": 2, 00:11:21.391 "base_bdevs_list": [ 00:11:21.391 { 00:11:21.391 "name": "BaseBdev1", 00:11:21.391 "uuid": "409da650-ab5c-4524-9904-f42810106791", 00:11:21.391 "is_configured": true, 00:11:21.391 "data_offset": 2048, 00:11:21.391 "data_size": 63488 00:11:21.391 }, 00:11:21.391 { 00:11:21.391 "name": "BaseBdev2", 00:11:21.392 "uuid": "31739a72-56b7-425e-86ce-16cee5494170", 00:11:21.392 "is_configured": true, 00:11:21.392 "data_offset": 2048, 00:11:21.392 "data_size": 63488 00:11:21.392 } 00:11:21.392 ] 00:11:21.392 } 00:11:21.392 } 00:11:21.392 }' 00:11:21.392 15:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:21.392 
15:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:11:21.392 BaseBdev2' 00:11:21.392 15:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:21.392 15:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:11:21.392 15:50:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:21.649 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:21.649 "name": "BaseBdev1", 00:11:21.649 "aliases": [ 00:11:21.649 "409da650-ab5c-4524-9904-f42810106791" 00:11:21.649 ], 00:11:21.649 "product_name": "Malloc disk", 00:11:21.649 "block_size": 512, 00:11:21.649 "num_blocks": 65536, 00:11:21.649 "uuid": "409da650-ab5c-4524-9904-f42810106791", 00:11:21.649 "assigned_rate_limits": { 00:11:21.649 "rw_ios_per_sec": 0, 00:11:21.649 "rw_mbytes_per_sec": 0, 00:11:21.649 "r_mbytes_per_sec": 0, 00:11:21.649 "w_mbytes_per_sec": 0 00:11:21.649 }, 00:11:21.649 "claimed": true, 00:11:21.649 "claim_type": "exclusive_write", 00:11:21.649 "zoned": false, 00:11:21.649 "supported_io_types": { 00:11:21.649 "read": true, 00:11:21.649 "write": true, 00:11:21.649 "unmap": true, 00:11:21.649 "write_zeroes": true, 00:11:21.649 "flush": true, 00:11:21.649 "reset": true, 00:11:21.649 "compare": false, 00:11:21.649 "compare_and_write": false, 00:11:21.649 "abort": true, 00:11:21.649 "nvme_admin": false, 00:11:21.649 "nvme_io": false 00:11:21.649 }, 00:11:21.649 "memory_domains": [ 00:11:21.649 { 00:11:21.649 "dma_device_id": "system", 00:11:21.649 "dma_device_type": 1 00:11:21.649 }, 00:11:21.649 { 00:11:21.649 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:21.649 "dma_device_type": 2 00:11:21.649 } 00:11:21.649 ], 00:11:21.649 "driver_specific": {} 00:11:21.649 }' 00:11:21.649 15:50:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:21.649 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:21.906 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:21.906 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:21.906 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:21.906 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:21.906 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:21.906 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:21.906 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:21.906 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:21.906 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:22.164 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:22.164 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:22.164 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:11:22.164 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:22.422 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:22.422 "name": "BaseBdev2", 00:11:22.422 "aliases": [ 00:11:22.422 "31739a72-56b7-425e-86ce-16cee5494170" 00:11:22.422 ], 00:11:22.422 "product_name": "Malloc disk", 00:11:22.422 "block_size": 512, 00:11:22.422 
"num_blocks": 65536, 00:11:22.422 "uuid": "31739a72-56b7-425e-86ce-16cee5494170", 00:11:22.422 "assigned_rate_limits": { 00:11:22.422 "rw_ios_per_sec": 0, 00:11:22.422 "rw_mbytes_per_sec": 0, 00:11:22.422 "r_mbytes_per_sec": 0, 00:11:22.422 "w_mbytes_per_sec": 0 00:11:22.422 }, 00:11:22.422 "claimed": true, 00:11:22.422 "claim_type": "exclusive_write", 00:11:22.422 "zoned": false, 00:11:22.422 "supported_io_types": { 00:11:22.422 "read": true, 00:11:22.422 "write": true, 00:11:22.422 "unmap": true, 00:11:22.422 "write_zeroes": true, 00:11:22.422 "flush": true, 00:11:22.422 "reset": true, 00:11:22.422 "compare": false, 00:11:22.422 "compare_and_write": false, 00:11:22.422 "abort": true, 00:11:22.422 "nvme_admin": false, 00:11:22.422 "nvme_io": false 00:11:22.422 }, 00:11:22.422 "memory_domains": [ 00:11:22.422 { 00:11:22.422 "dma_device_id": "system", 00:11:22.422 "dma_device_type": 1 00:11:22.422 }, 00:11:22.422 { 00:11:22.422 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:22.422 "dma_device_type": 2 00:11:22.422 } 00:11:22.422 ], 00:11:22.422 "driver_specific": {} 00:11:22.422 }' 00:11:22.422 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:22.422 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:22.422 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:22.422 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:22.422 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:22.422 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:22.422 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:22.422 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:22.680 15:50:27 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:22.680 15:50:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:22.680 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:22.680 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:22.681 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:11:22.939 [2024-06-10 15:50:28.221688] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:22.939 
15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:22.939 "name": "Existed_Raid", 00:11:22.939 "uuid": "a53611fb-1cb7-4e58-8563-0072b91564b2", 00:11:22.939 "strip_size_kb": 0, 00:11:22.939 "state": "online", 00:11:22.939 "raid_level": "raid1", 00:11:22.939 "superblock": true, 00:11:22.939 "num_base_bdevs": 2, 00:11:22.939 "num_base_bdevs_discovered": 1, 00:11:22.939 "num_base_bdevs_operational": 1, 00:11:22.939 "base_bdevs_list": [ 00:11:22.939 { 00:11:22.939 "name": null, 00:11:22.939 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:22.939 "is_configured": false, 00:11:22.939 "data_offset": 2048, 00:11:22.939 "data_size": 63488 00:11:22.939 }, 00:11:22.939 { 00:11:22.939 "name": "BaseBdev2", 00:11:22.939 "uuid": "31739a72-56b7-425e-86ce-16cee5494170", 00:11:22.939 "is_configured": true, 00:11:22.939 "data_offset": 2048, 00:11:22.939 "data_size": 63488 00:11:22.939 } 00:11:22.939 ] 00:11:22.939 }' 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:22.939 15:50:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:23.875 15:50:29 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:11:23.875 15:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:23.875 15:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:23.875 15:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:11:23.875 15:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:11:23.875 15:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:11:23.875 15:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:11:24.133 [2024-06-10 15:50:29.538325] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:11:24.133 [2024-06-10 15:50:29.538407] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:24.133 [2024-06-10 15:50:29.549264] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:24.133 [2024-06-10 15:50:29.549295] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:24.133 [2024-06-10 15:50:29.549304] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25a4770 name Existed_Raid, state offline 00:11:24.133 15:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:11:24.133 15:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:11:24.133 15:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:24.133 15:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:11:24.391 15:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:11:24.391 15:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:11:24.391 15:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:11:24.391 15:50:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2649205 00:11:24.391 15:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 2649205 ']' 00:11:24.391 15:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 2649205 00:11:24.391 15:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:11:24.391 15:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:11:24.391 15:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2649205 00:11:24.391 15:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:11:24.391 15:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:11:24.391 15:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2649205' 00:11:24.391 killing process with pid 2649205 00:11:24.391 15:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 2649205 00:11:24.391 [2024-06-10 15:50:29.873897] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:24.391 15:50:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 2649205 00:11:24.391 [2024-06-10 15:50:29.874769] 
bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:24.650 15:50:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:11:24.650 00:11:24.650 real 0m9.666s 00:11:24.650 user 0m17.976s 00:11:24.650 sys 0m1.447s 00:11:24.650 15:50:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:11:24.650 15:50:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:11:24.650 ************************************ 00:11:24.650 END TEST raid_state_function_test_sb 00:11:24.650 ************************************ 00:11:24.650 15:50:30 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 2 00:11:24.650 15:50:30 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:11:24.650 15:50:30 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:11:24.650 15:50:30 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:24.650 ************************************ 00:11:24.650 START TEST raid_superblock_test 00:11:24.650 ************************************ 00:11:24.650 15:50:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:11:24.650 15:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:11:24.651 
15:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2651027 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2651027 /var/tmp/spdk-raid.sock 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 2651027 ']' 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:24.651 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:11:24.651 15:50:30 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:24.909 [2024-06-10 15:50:30.202811] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:11:24.909 [2024-06-10 15:50:30.202865] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2651027 ] 00:11:24.909 [2024-06-10 15:50:30.302387] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:24.909 [2024-06-10 15:50:30.397548] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:11:25.167 [2024-06-10 15:50:30.455457] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:25.167 [2024-06-10 15:50:30.455486] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:25.733 15:50:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:11:25.733 15:50:31 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:11:25.733 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:11:25.733 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:25.733 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:11:25.733 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:11:25.733 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:11:25.733 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:11:25.733 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # 
base_bdevs_pt+=($bdev_pt) 00:11:25.733 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:25.733 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:11:25.991 malloc1 00:11:25.991 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:26.250 [2024-06-10 15:50:31.652770] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:26.250 [2024-06-10 15:50:31.652815] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:26.250 [2024-06-10 15:50:31.652832] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12e30f0 00:11:26.250 [2024-06-10 15:50:31.652841] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:26.250 [2024-06-10 15:50:31.654566] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:26.250 [2024-06-10 15:50:31.654593] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:26.250 pt1 00:11:26.250 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:26.250 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:26.250 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:11:26.250 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:11:26.250 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:11:26.250 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # 
base_bdevs_malloc+=($bdev_malloc) 00:11:26.250 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:11:26.250 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:11:26.250 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:11:26.509 malloc2 00:11:26.509 15:50:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:26.768 [2024-06-10 15:50:32.102604] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:26.768 [2024-06-10 15:50:32.102644] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:26.768 [2024-06-10 15:50:32.102659] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12e4400 00:11:26.768 [2024-06-10 15:50:32.102668] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:26.768 [2024-06-10 15:50:32.104250] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:26.768 [2024-06-10 15:50:32.104276] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:26.768 pt2 00:11:26.768 15:50:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:11:26.768 15:50:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:11:26.768 15:50:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:11:27.027 [2024-06-10 15:50:32.359298] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:27.027 [2024-06-10 15:50:32.360639] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:27.027 [2024-06-10 15:50:32.360792] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x148fe60 00:11:27.027 [2024-06-10 15:50:32.360803] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:27.027 [2024-06-10 15:50:32.361001] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12f9fe0 00:11:27.027 [2024-06-10 15:50:32.361153] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x148fe60 00:11:27.027 [2024-06-10 15:50:32.361162] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x148fe60 00:11:27.027 [2024-06-10 15:50:32.361260] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:27.027 15:50:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:27.027 15:50:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:27.027 15:50:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:27.027 15:50:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:27.027 15:50:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:27.027 15:50:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:27.027 15:50:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:27.027 15:50:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:27.027 15:50:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:27.027 15:50:32 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:27.027 15:50:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:27.027 15:50:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:27.286 15:50:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:27.286 "name": "raid_bdev1", 00:11:27.286 "uuid": "b6eee511-235d-4cc9-85e0-33761b061ed2", 00:11:27.286 "strip_size_kb": 0, 00:11:27.286 "state": "online", 00:11:27.286 "raid_level": "raid1", 00:11:27.286 "superblock": true, 00:11:27.286 "num_base_bdevs": 2, 00:11:27.286 "num_base_bdevs_discovered": 2, 00:11:27.286 "num_base_bdevs_operational": 2, 00:11:27.286 "base_bdevs_list": [ 00:11:27.286 { 00:11:27.286 "name": "pt1", 00:11:27.286 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:27.286 "is_configured": true, 00:11:27.286 "data_offset": 2048, 00:11:27.286 "data_size": 63488 00:11:27.286 }, 00:11:27.286 { 00:11:27.286 "name": "pt2", 00:11:27.286 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:27.286 "is_configured": true, 00:11:27.286 "data_offset": 2048, 00:11:27.286 "data_size": 63488 00:11:27.286 } 00:11:27.286 ] 00:11:27.286 }' 00:11:27.286 15:50:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:27.286 15:50:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:27.854 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:11:27.854 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:27.854 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:27.854 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 
00:11:27.854 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:27.854 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:27.854 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:27.854 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:28.113 [2024-06-10 15:50:33.486506] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:28.113 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:28.113 "name": "raid_bdev1", 00:11:28.113 "aliases": [ 00:11:28.113 "b6eee511-235d-4cc9-85e0-33761b061ed2" 00:11:28.113 ], 00:11:28.113 "product_name": "Raid Volume", 00:11:28.113 "block_size": 512, 00:11:28.113 "num_blocks": 63488, 00:11:28.113 "uuid": "b6eee511-235d-4cc9-85e0-33761b061ed2", 00:11:28.113 "assigned_rate_limits": { 00:11:28.113 "rw_ios_per_sec": 0, 00:11:28.113 "rw_mbytes_per_sec": 0, 00:11:28.113 "r_mbytes_per_sec": 0, 00:11:28.113 "w_mbytes_per_sec": 0 00:11:28.113 }, 00:11:28.113 "claimed": false, 00:11:28.113 "zoned": false, 00:11:28.113 "supported_io_types": { 00:11:28.113 "read": true, 00:11:28.113 "write": true, 00:11:28.113 "unmap": false, 00:11:28.113 "write_zeroes": true, 00:11:28.113 "flush": false, 00:11:28.113 "reset": true, 00:11:28.113 "compare": false, 00:11:28.113 "compare_and_write": false, 00:11:28.113 "abort": false, 00:11:28.113 "nvme_admin": false, 00:11:28.113 "nvme_io": false 00:11:28.113 }, 00:11:28.113 "memory_domains": [ 00:11:28.113 { 00:11:28.113 "dma_device_id": "system", 00:11:28.113 "dma_device_type": 1 00:11:28.113 }, 00:11:28.113 { 00:11:28.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.113 "dma_device_type": 2 00:11:28.113 }, 00:11:28.113 { 00:11:28.113 "dma_device_id": "system", 00:11:28.113 
"dma_device_type": 1 00:11:28.113 }, 00:11:28.113 { 00:11:28.113 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.113 "dma_device_type": 2 00:11:28.113 } 00:11:28.113 ], 00:11:28.113 "driver_specific": { 00:11:28.113 "raid": { 00:11:28.113 "uuid": "b6eee511-235d-4cc9-85e0-33761b061ed2", 00:11:28.113 "strip_size_kb": 0, 00:11:28.114 "state": "online", 00:11:28.114 "raid_level": "raid1", 00:11:28.114 "superblock": true, 00:11:28.114 "num_base_bdevs": 2, 00:11:28.114 "num_base_bdevs_discovered": 2, 00:11:28.114 "num_base_bdevs_operational": 2, 00:11:28.114 "base_bdevs_list": [ 00:11:28.114 { 00:11:28.114 "name": "pt1", 00:11:28.114 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:28.114 "is_configured": true, 00:11:28.114 "data_offset": 2048, 00:11:28.114 "data_size": 63488 00:11:28.114 }, 00:11:28.114 { 00:11:28.114 "name": "pt2", 00:11:28.114 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:28.114 "is_configured": true, 00:11:28.114 "data_offset": 2048, 00:11:28.114 "data_size": 63488 00:11:28.114 } 00:11:28.114 ] 00:11:28.114 } 00:11:28.114 } 00:11:28.114 }' 00:11:28.114 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:28.114 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:28.114 pt2' 00:11:28.114 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:28.114 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:28.114 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:28.372 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:28.372 "name": "pt1", 00:11:28.372 "aliases": [ 00:11:28.372 "00000000-0000-0000-0000-000000000001" 00:11:28.372 ], 
00:11:28.372 "product_name": "passthru", 00:11:28.372 "block_size": 512, 00:11:28.372 "num_blocks": 65536, 00:11:28.372 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:28.372 "assigned_rate_limits": { 00:11:28.372 "rw_ios_per_sec": 0, 00:11:28.372 "rw_mbytes_per_sec": 0, 00:11:28.372 "r_mbytes_per_sec": 0, 00:11:28.372 "w_mbytes_per_sec": 0 00:11:28.372 }, 00:11:28.372 "claimed": true, 00:11:28.372 "claim_type": "exclusive_write", 00:11:28.372 "zoned": false, 00:11:28.372 "supported_io_types": { 00:11:28.372 "read": true, 00:11:28.372 "write": true, 00:11:28.372 "unmap": true, 00:11:28.372 "write_zeroes": true, 00:11:28.372 "flush": true, 00:11:28.372 "reset": true, 00:11:28.372 "compare": false, 00:11:28.372 "compare_and_write": false, 00:11:28.372 "abort": true, 00:11:28.372 "nvme_admin": false, 00:11:28.372 "nvme_io": false 00:11:28.372 }, 00:11:28.372 "memory_domains": [ 00:11:28.372 { 00:11:28.372 "dma_device_id": "system", 00:11:28.372 "dma_device_type": 1 00:11:28.372 }, 00:11:28.372 { 00:11:28.372 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:28.372 "dma_device_type": 2 00:11:28.372 } 00:11:28.372 ], 00:11:28.372 "driver_specific": { 00:11:28.372 "passthru": { 00:11:28.372 "name": "pt1", 00:11:28.372 "base_bdev_name": "malloc1" 00:11:28.372 } 00:11:28.373 } 00:11:28.373 }' 00:11:28.373 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:28.373 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:28.631 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:28.631 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:28.631 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:28.631 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:28.631 15:50:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:11:28.632 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:28.632 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:28.632 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:28.632 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:28.891 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:28.891 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:28.891 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:28.891 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:29.150 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:29.150 "name": "pt2", 00:11:29.150 "aliases": [ 00:11:29.150 "00000000-0000-0000-0000-000000000002" 00:11:29.150 ], 00:11:29.150 "product_name": "passthru", 00:11:29.150 "block_size": 512, 00:11:29.150 "num_blocks": 65536, 00:11:29.150 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:29.150 "assigned_rate_limits": { 00:11:29.150 "rw_ios_per_sec": 0, 00:11:29.150 "rw_mbytes_per_sec": 0, 00:11:29.150 "r_mbytes_per_sec": 0, 00:11:29.150 "w_mbytes_per_sec": 0 00:11:29.150 }, 00:11:29.150 "claimed": true, 00:11:29.150 "claim_type": "exclusive_write", 00:11:29.150 "zoned": false, 00:11:29.150 "supported_io_types": { 00:11:29.150 "read": true, 00:11:29.150 "write": true, 00:11:29.150 "unmap": true, 00:11:29.150 "write_zeroes": true, 00:11:29.150 "flush": true, 00:11:29.150 "reset": true, 00:11:29.150 "compare": false, 00:11:29.150 "compare_and_write": false, 00:11:29.150 "abort": true, 00:11:29.150 "nvme_admin": false, 00:11:29.150 "nvme_io": false 00:11:29.150 }, 00:11:29.150 
"memory_domains": [ 00:11:29.150 { 00:11:29.150 "dma_device_id": "system", 00:11:29.150 "dma_device_type": 1 00:11:29.150 }, 00:11:29.150 { 00:11:29.150 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:29.150 "dma_device_type": 2 00:11:29.151 } 00:11:29.151 ], 00:11:29.151 "driver_specific": { 00:11:29.151 "passthru": { 00:11:29.151 "name": "pt2", 00:11:29.151 "base_bdev_name": "malloc2" 00:11:29.151 } 00:11:29.151 } 00:11:29.151 }' 00:11:29.151 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:29.151 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:29.151 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:29.151 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:29.151 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:29.151 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:29.151 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:29.151 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:29.410 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:29.410 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:29.410 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:29.410 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:29.410 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:29.410 15:50:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:11:29.669 [2024-06-10 15:50:35.014576] 
bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:29.669 15:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=b6eee511-235d-4cc9-85e0-33761b061ed2 00:11:29.669 15:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z b6eee511-235d-4cc9-85e0-33761b061ed2 ']' 00:11:29.669 15:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:29.928 [2024-06-10 15:50:35.271035] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:29.928 [2024-06-10 15:50:35.271053] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:29.928 [2024-06-10 15:50:35.271103] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:29.928 [2024-06-10 15:50:35.271155] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:29.928 [2024-06-10 15:50:35.271163] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x148fe60 name raid_bdev1, state offline 00:11:29.928 15:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:29.928 15:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:11:30.187 15:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:11:30.187 15:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:11:30.187 15:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:30.187 15:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt1 00:11:30.446 15:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:11:30.446 15:50:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:30.705 15:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:11:30.705 15:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:11:30.965 15:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:11:30.965 15:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:11:30.965 15:50:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:11:30.965 15:50:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:11:30.965 15:50:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:30.965 15:50:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:11:30.965 15:50:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:30.965 15:50:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:11:30.965 15:50:36 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:30.965 15:50:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:11:30.965 15:50:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:11:30.965 15:50:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:11:30.965 15:50:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:11:31.223 [2024-06-10 15:50:36.558416] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:11:31.223 [2024-06-10 15:50:36.559838] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:11:31.223 [2024-06-10 15:50:36.559892] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:11:31.223 [2024-06-10 15:50:36.559927] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:11:31.223 [2024-06-10 15:50:36.559942] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:31.223 [2024-06-10 15:50:36.559950] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x148d520 name raid_bdev1, state configuring 00:11:31.223 request: 00:11:31.223 { 00:11:31.223 "name": "raid_bdev1", 00:11:31.223 "raid_level": "raid1", 00:11:31.223 "base_bdevs": [ 00:11:31.223 "malloc1", 00:11:31.223 "malloc2" 00:11:31.223 ], 00:11:31.223 "superblock": false, 00:11:31.223 "method": "bdev_raid_create", 00:11:31.223 "req_id": 1 00:11:31.223 } 00:11:31.223 Got JSON-RPC error response 
00:11:31.223 response: 00:11:31.223 { 00:11:31.223 "code": -17, 00:11:31.223 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:11:31.223 } 00:11:31.223 15:50:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:11:31.223 15:50:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:11:31.223 15:50:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:11:31.223 15:50:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:11:31.223 15:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:31.223 15:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:11:31.482 15:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:11:31.482 15:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:11:31.482 15:50:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:31.741 [2024-06-10 15:50:37.063712] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:31.741 [2024-06-10 15:50:37.063750] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:31.741 [2024-06-10 15:50:37.063765] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x148d9b0 00:11:31.741 [2024-06-10 15:50:37.063774] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:31.741 [2024-06-10 15:50:37.065434] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:31.741 [2024-06-10 15:50:37.065467] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: 
created pt_bdev for: pt1 00:11:31.741 [2024-06-10 15:50:37.065528] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:31.741 [2024-06-10 15:50:37.065552] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:31.741 pt1 00:11:31.741 15:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:11:31.741 15:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:31.741 15:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:31.741 15:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:31.741 15:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:31.741 15:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:31.741 15:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:31.741 15:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:31.741 15:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:31.741 15:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:31.741 15:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:31.741 15:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:32.000 15:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:32.000 "name": "raid_bdev1", 00:11:32.000 "uuid": "b6eee511-235d-4cc9-85e0-33761b061ed2", 00:11:32.000 "strip_size_kb": 0, 00:11:32.000 "state": "configuring", 
00:11:32.000 "raid_level": "raid1", 00:11:32.000 "superblock": true, 00:11:32.000 "num_base_bdevs": 2, 00:11:32.000 "num_base_bdevs_discovered": 1, 00:11:32.000 "num_base_bdevs_operational": 2, 00:11:32.000 "base_bdevs_list": [ 00:11:32.000 { 00:11:32.000 "name": "pt1", 00:11:32.000 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:32.000 "is_configured": true, 00:11:32.000 "data_offset": 2048, 00:11:32.000 "data_size": 63488 00:11:32.000 }, 00:11:32.000 { 00:11:32.000 "name": null, 00:11:32.000 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:32.000 "is_configured": false, 00:11:32.000 "data_offset": 2048, 00:11:32.000 "data_size": 63488 00:11:32.000 } 00:11:32.000 ] 00:11:32.000 }' 00:11:32.000 15:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:32.000 15:50:37 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:32.568 15:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:11:32.568 15:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:11:32.568 15:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:32.568 15:50:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:32.829 [2024-06-10 15:50:38.198749] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:32.829 [2024-06-10 15:50:38.198793] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:32.829 [2024-06-10 15:50:38.198809] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12e2550 00:11:32.829 [2024-06-10 15:50:38.198819] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:32.829 [2024-06-10 15:50:38.199167] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:32.829 [2024-06-10 15:50:38.199184] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:32.829 [2024-06-10 15:50:38.199244] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:32.829 [2024-06-10 15:50:38.199262] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:32.829 [2024-06-10 15:50:38.199364] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14944a0 00:11:32.829 [2024-06-10 15:50:38.199374] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:32.829 [2024-06-10 15:50:38.199558] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x148dc80 00:11:32.829 [2024-06-10 15:50:38.199689] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14944a0 00:11:32.829 [2024-06-10 15:50:38.199698] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14944a0 00:11:32.829 [2024-06-10 15:50:38.199799] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:32.829 pt2 00:11:32.829 15:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:11:32.829 15:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:11:32.829 15:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:32.829 15:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:32.829 15:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:32.829 15:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:32.829 15:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:32.829 15:50:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:32.829 15:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:32.829 15:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:32.829 15:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:32.829 15:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:32.829 15:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:32.829 15:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:33.158 15:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:33.158 "name": "raid_bdev1", 00:11:33.158 "uuid": "b6eee511-235d-4cc9-85e0-33761b061ed2", 00:11:33.158 "strip_size_kb": 0, 00:11:33.158 "state": "online", 00:11:33.158 "raid_level": "raid1", 00:11:33.158 "superblock": true, 00:11:33.158 "num_base_bdevs": 2, 00:11:33.158 "num_base_bdevs_discovered": 2, 00:11:33.158 "num_base_bdevs_operational": 2, 00:11:33.158 "base_bdevs_list": [ 00:11:33.158 { 00:11:33.158 "name": "pt1", 00:11:33.158 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:33.158 "is_configured": true, 00:11:33.158 "data_offset": 2048, 00:11:33.158 "data_size": 63488 00:11:33.158 }, 00:11:33.158 { 00:11:33.158 "name": "pt2", 00:11:33.158 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:33.158 "is_configured": true, 00:11:33.158 "data_offset": 2048, 00:11:33.158 "data_size": 63488 00:11:33.158 } 00:11:33.158 ] 00:11:33.158 }' 00:11:33.158 15:50:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:33.158 15:50:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 
00:11:33.726 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:11:33.726 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:11:33.726 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:11:33.726 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:11:33.726 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:11:33.726 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:11:33.726 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:33.726 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:11:33.985 [2024-06-10 15:50:39.342033] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:33.985 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:11:33.985 "name": "raid_bdev1", 00:11:33.985 "aliases": [ 00:11:33.985 "b6eee511-235d-4cc9-85e0-33761b061ed2" 00:11:33.985 ], 00:11:33.985 "product_name": "Raid Volume", 00:11:33.985 "block_size": 512, 00:11:33.985 "num_blocks": 63488, 00:11:33.985 "uuid": "b6eee511-235d-4cc9-85e0-33761b061ed2", 00:11:33.985 "assigned_rate_limits": { 00:11:33.985 "rw_ios_per_sec": 0, 00:11:33.985 "rw_mbytes_per_sec": 0, 00:11:33.985 "r_mbytes_per_sec": 0, 00:11:33.985 "w_mbytes_per_sec": 0 00:11:33.985 }, 00:11:33.985 "claimed": false, 00:11:33.985 "zoned": false, 00:11:33.985 "supported_io_types": { 00:11:33.985 "read": true, 00:11:33.985 "write": true, 00:11:33.985 "unmap": false, 00:11:33.985 "write_zeroes": true, 00:11:33.985 "flush": false, 00:11:33.985 "reset": true, 00:11:33.985 "compare": false, 00:11:33.985 "compare_and_write": 
false, 00:11:33.985 "abort": false, 00:11:33.985 "nvme_admin": false, 00:11:33.985 "nvme_io": false 00:11:33.985 }, 00:11:33.985 "memory_domains": [ 00:11:33.985 { 00:11:33.985 "dma_device_id": "system", 00:11:33.985 "dma_device_type": 1 00:11:33.985 }, 00:11:33.985 { 00:11:33.985 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.985 "dma_device_type": 2 00:11:33.985 }, 00:11:33.985 { 00:11:33.985 "dma_device_id": "system", 00:11:33.985 "dma_device_type": 1 00:11:33.985 }, 00:11:33.985 { 00:11:33.985 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:33.985 "dma_device_type": 2 00:11:33.985 } 00:11:33.985 ], 00:11:33.985 "driver_specific": { 00:11:33.985 "raid": { 00:11:33.985 "uuid": "b6eee511-235d-4cc9-85e0-33761b061ed2", 00:11:33.985 "strip_size_kb": 0, 00:11:33.985 "state": "online", 00:11:33.985 "raid_level": "raid1", 00:11:33.985 "superblock": true, 00:11:33.985 "num_base_bdevs": 2, 00:11:33.985 "num_base_bdevs_discovered": 2, 00:11:33.985 "num_base_bdevs_operational": 2, 00:11:33.985 "base_bdevs_list": [ 00:11:33.985 { 00:11:33.985 "name": "pt1", 00:11:33.985 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:33.985 "is_configured": true, 00:11:33.985 "data_offset": 2048, 00:11:33.985 "data_size": 63488 00:11:33.985 }, 00:11:33.985 { 00:11:33.985 "name": "pt2", 00:11:33.985 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:33.985 "is_configured": true, 00:11:33.985 "data_offset": 2048, 00:11:33.985 "data_size": 63488 00:11:33.985 } 00:11:33.985 ] 00:11:33.985 } 00:11:33.985 } 00:11:33.985 }' 00:11:33.985 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:11:33.985 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:11:33.985 pt2' 00:11:33.985 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:33.985 15:50:39 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:11:33.985 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:34.244 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:34.244 "name": "pt1", 00:11:34.244 "aliases": [ 00:11:34.244 "00000000-0000-0000-0000-000000000001" 00:11:34.244 ], 00:11:34.244 "product_name": "passthru", 00:11:34.244 "block_size": 512, 00:11:34.244 "num_blocks": 65536, 00:11:34.244 "uuid": "00000000-0000-0000-0000-000000000001", 00:11:34.244 "assigned_rate_limits": { 00:11:34.244 "rw_ios_per_sec": 0, 00:11:34.244 "rw_mbytes_per_sec": 0, 00:11:34.244 "r_mbytes_per_sec": 0, 00:11:34.244 "w_mbytes_per_sec": 0 00:11:34.244 }, 00:11:34.244 "claimed": true, 00:11:34.244 "claim_type": "exclusive_write", 00:11:34.244 "zoned": false, 00:11:34.244 "supported_io_types": { 00:11:34.244 "read": true, 00:11:34.244 "write": true, 00:11:34.244 "unmap": true, 00:11:34.244 "write_zeroes": true, 00:11:34.244 "flush": true, 00:11:34.244 "reset": true, 00:11:34.244 "compare": false, 00:11:34.244 "compare_and_write": false, 00:11:34.244 "abort": true, 00:11:34.244 "nvme_admin": false, 00:11:34.244 "nvme_io": false 00:11:34.244 }, 00:11:34.244 "memory_domains": [ 00:11:34.244 { 00:11:34.244 "dma_device_id": "system", 00:11:34.244 "dma_device_type": 1 00:11:34.244 }, 00:11:34.244 { 00:11:34.244 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:34.244 "dma_device_type": 2 00:11:34.244 } 00:11:34.244 ], 00:11:34.244 "driver_specific": { 00:11:34.244 "passthru": { 00:11:34.244 "name": "pt1", 00:11:34.244 "base_bdev_name": "malloc1" 00:11:34.244 } 00:11:34.244 } 00:11:34.244 }' 00:11:34.244 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:34.244 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:34.504 15:50:39 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:34.504 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:34.504 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:34.504 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:34.504 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:34.504 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:34.505 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:34.505 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:34.505 15:50:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:34.764 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:34.764 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:11:34.764 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:11:34.764 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:11:35.023 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:11:35.023 "name": "pt2", 00:11:35.023 "aliases": [ 00:11:35.023 "00000000-0000-0000-0000-000000000002" 00:11:35.023 ], 00:11:35.023 "product_name": "passthru", 00:11:35.023 "block_size": 512, 00:11:35.023 "num_blocks": 65536, 00:11:35.023 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:35.023 "assigned_rate_limits": { 00:11:35.023 "rw_ios_per_sec": 0, 00:11:35.023 "rw_mbytes_per_sec": 0, 00:11:35.023 "r_mbytes_per_sec": 0, 00:11:35.023 "w_mbytes_per_sec": 0 00:11:35.023 }, 00:11:35.023 "claimed": true, 00:11:35.023 
"claim_type": "exclusive_write", 00:11:35.023 "zoned": false, 00:11:35.023 "supported_io_types": { 00:11:35.023 "read": true, 00:11:35.023 "write": true, 00:11:35.023 "unmap": true, 00:11:35.023 "write_zeroes": true, 00:11:35.023 "flush": true, 00:11:35.023 "reset": true, 00:11:35.023 "compare": false, 00:11:35.023 "compare_and_write": false, 00:11:35.023 "abort": true, 00:11:35.023 "nvme_admin": false, 00:11:35.023 "nvme_io": false 00:11:35.023 }, 00:11:35.023 "memory_domains": [ 00:11:35.023 { 00:11:35.023 "dma_device_id": "system", 00:11:35.023 "dma_device_type": 1 00:11:35.023 }, 00:11:35.023 { 00:11:35.023 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:35.023 "dma_device_type": 2 00:11:35.023 } 00:11:35.023 ], 00:11:35.023 "driver_specific": { 00:11:35.023 "passthru": { 00:11:35.023 "name": "pt2", 00:11:35.023 "base_bdev_name": "malloc2" 00:11:35.023 } 00:11:35.023 } 00:11:35.023 }' 00:11:35.023 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:35.023 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:11:35.023 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:11:35.023 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:35.023 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:11:35.023 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:11:35.023 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:35.023 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:11:35.282 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:11:35.282 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:35.282 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:11:35.282 
15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:11:35.282 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:35.282 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:11:35.541 [2024-06-10 15:50:40.890196] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:35.541 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' b6eee511-235d-4cc9-85e0-33761b061ed2 '!=' b6eee511-235d-4cc9-85e0-33761b061ed2 ']' 00:11:35.541 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:11:35.541 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:35.541 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:11:35.541 15:50:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:11:35.800 [2024-06-10 15:50:41.146677] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:11:35.800 15:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:35.800 15:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:35.800 15:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:35.800 15:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:35.800 15:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:35.800 15:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:35.800 15:50:41 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:35.800 15:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:35.800 15:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:35.800 15:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:35.800 15:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:35.800 15:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:36.059 15:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:36.059 "name": "raid_bdev1", 00:11:36.059 "uuid": "b6eee511-235d-4cc9-85e0-33761b061ed2", 00:11:36.059 "strip_size_kb": 0, 00:11:36.059 "state": "online", 00:11:36.059 "raid_level": "raid1", 00:11:36.059 "superblock": true, 00:11:36.059 "num_base_bdevs": 2, 00:11:36.059 "num_base_bdevs_discovered": 1, 00:11:36.059 "num_base_bdevs_operational": 1, 00:11:36.059 "base_bdevs_list": [ 00:11:36.059 { 00:11:36.059 "name": null, 00:11:36.059 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:36.059 "is_configured": false, 00:11:36.059 "data_offset": 2048, 00:11:36.059 "data_size": 63488 00:11:36.059 }, 00:11:36.059 { 00:11:36.059 "name": "pt2", 00:11:36.059 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:36.059 "is_configured": true, 00:11:36.059 "data_offset": 2048, 00:11:36.059 "data_size": 63488 00:11:36.059 } 00:11:36.059 ] 00:11:36.059 }' 00:11:36.059 15:50:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:36.059 15:50:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:36.626 15:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:36.884 [2024-06-10 15:50:42.289723] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:36.884 [2024-06-10 15:50:42.289749] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:36.884 [2024-06-10 15:50:42.289801] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:36.884 [2024-06-10 15:50:42.289842] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:36.884 [2024-06-10 15:50:42.289851] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14944a0 name raid_bdev1, state offline 00:11:36.884 15:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:11:36.884 15:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.143 15:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:11:37.143 15:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:11:37.143 15:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:11:37.143 15:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:11:37.143 15:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:11:37.402 15:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:11:37.402 15:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:11:37.402 15:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:11:37.402 15:50:42 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:11:37.402 15:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=1 00:11:37.402 15:50:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:11:37.661 [2024-06-10 15:50:43.071772] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:11:37.661 [2024-06-10 15:50:43.071815] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:37.661 [2024-06-10 15:50:43.071832] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1494250 00:11:37.661 [2024-06-10 15:50:43.071842] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:37.661 [2024-06-10 15:50:43.073528] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:37.661 [2024-06-10 15:50:43.073555] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:11:37.661 [2024-06-10 15:50:43.073617] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:11:37.661 [2024-06-10 15:50:43.073640] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:11:37.661 [2024-06-10 15:50:43.073722] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12e1b00 00:11:37.661 [2024-06-10 15:50:43.073730] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:37.661 [2024-06-10 15:50:43.073907] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12e2cf0 00:11:37.661 [2024-06-10 15:50:43.074050] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12e1b00 00:11:37.661 [2024-06-10 15:50:43.074059] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created 
with name raid_bdev1, raid_bdev 0x12e1b00 00:11:37.661 [2024-06-10 15:50:43.074161] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:37.661 pt2 00:11:37.661 15:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:37.661 15:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:37.661 15:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:37.661 15:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:37.661 15:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:37.661 15:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:37.661 15:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:37.661 15:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:37.661 15:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:37.661 15:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:37.661 15:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:37.661 15:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:37.920 15:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:37.920 "name": "raid_bdev1", 00:11:37.920 "uuid": "b6eee511-235d-4cc9-85e0-33761b061ed2", 00:11:37.920 "strip_size_kb": 0, 00:11:37.920 "state": "online", 00:11:37.920 "raid_level": "raid1", 00:11:37.920 "superblock": true, 00:11:37.920 "num_base_bdevs": 2, 00:11:37.920 
"num_base_bdevs_discovered": 1, 00:11:37.920 "num_base_bdevs_operational": 1, 00:11:37.920 "base_bdevs_list": [ 00:11:37.920 { 00:11:37.920 "name": null, 00:11:37.920 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:37.920 "is_configured": false, 00:11:37.920 "data_offset": 2048, 00:11:37.920 "data_size": 63488 00:11:37.920 }, 00:11:37.920 { 00:11:37.920 "name": "pt2", 00:11:37.920 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:37.920 "is_configured": true, 00:11:37.920 "data_offset": 2048, 00:11:37.920 "data_size": 63488 00:11:37.920 } 00:11:37.920 ] 00:11:37.920 }' 00:11:37.920 15:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:37.920 15:50:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:38.488 15:50:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:38.747 [2024-06-10 15:50:44.214832] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:38.747 [2024-06-10 15:50:44.214857] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:38.747 [2024-06-10 15:50:44.214914] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:38.747 [2024-06-10 15:50:44.214967] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:38.747 [2024-06-10 15:50:44.214979] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12e1b00 name raid_bdev1, state offline 00:11:38.747 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:38.747 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:11:39.006 15:50:44 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@526 -- # raid_bdev= 00:11:39.006 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:11:39.006 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:11:39.006 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:11:39.265 [2024-06-10 15:50:44.728191] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:11:39.265 [2024-06-10 15:50:44.728240] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:39.265 [2024-06-10 15:50:44.728255] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x148c3a0 00:11:39.265 [2024-06-10 15:50:44.728265] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:39.265 [2024-06-10 15:50:44.729963] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:39.265 [2024-06-10 15:50:44.729991] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:11:39.265 [2024-06-10 15:50:44.730058] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:11:39.265 [2024-06-10 15:50:44.730083] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:11:39.265 [2024-06-10 15:50:44.730187] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:11:39.265 [2024-06-10 15:50:44.730198] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:39.265 [2024-06-10 15:50:44.730209] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14908e0 name raid_bdev1, state configuring 00:11:39.265 [2024-06-10 15:50:44.730230] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
pt2 is claimed 00:11:39.265 [2024-06-10 15:50:44.730289] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1493c40 00:11:39.265 [2024-06-10 15:50:44.730298] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:39.265 [2024-06-10 15:50:44.730471] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12e3c70 00:11:39.265 [2024-06-10 15:50:44.730598] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1493c40 00:11:39.265 [2024-06-10 15:50:44.730607] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1493c40 00:11:39.265 [2024-06-10 15:50:44.730709] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:39.265 pt1 00:11:39.265 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:11:39.265 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:39.265 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:39.265 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:39.265 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:39.265 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:39.265 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:39.266 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:39.266 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:39.266 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:39.266 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:11:39.266 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:39.266 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:39.525 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:39.525 "name": "raid_bdev1", 00:11:39.525 "uuid": "b6eee511-235d-4cc9-85e0-33761b061ed2", 00:11:39.525 "strip_size_kb": 0, 00:11:39.525 "state": "online", 00:11:39.525 "raid_level": "raid1", 00:11:39.525 "superblock": true, 00:11:39.525 "num_base_bdevs": 2, 00:11:39.525 "num_base_bdevs_discovered": 1, 00:11:39.525 "num_base_bdevs_operational": 1, 00:11:39.525 "base_bdevs_list": [ 00:11:39.525 { 00:11:39.525 "name": null, 00:11:39.525 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:39.525 "is_configured": false, 00:11:39.525 "data_offset": 2048, 00:11:39.525 "data_size": 63488 00:11:39.525 }, 00:11:39.525 { 00:11:39.525 "name": "pt2", 00:11:39.525 "uuid": "00000000-0000-0000-0000-000000000002", 00:11:39.525 "is_configured": true, 00:11:39.525 "data_offset": 2048, 00:11:39.525 "data_size": 63488 00:11:39.525 } 00:11:39.525 ] 00:11:39.525 }' 00:11:39.525 15:50:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:39.525 15:50:44 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:40.093 15:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:11:40.351 15:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:11:40.351 15:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:11:40.351 15:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:11:40.351 15:50:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:11:40.609 [2024-06-10 15:50:46.092033] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:11:40.609 15:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' b6eee511-235d-4cc9-85e0-33761b061ed2 '!=' b6eee511-235d-4cc9-85e0-33761b061ed2 ']' 00:11:40.609 15:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2651027 00:11:40.609 15:50:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 2651027 ']' 00:11:40.609 15:50:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 2651027 00:11:40.609 15:50:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:11:40.609 15:50:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:11:40.868 15:50:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2651027 00:11:40.868 15:50:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:11:40.868 15:50:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:11:40.868 15:50:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2651027' 00:11:40.868 killing process with pid 2651027 00:11:40.868 15:50:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 2651027 00:11:40.868 [2024-06-10 15:50:46.156042] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:40.868 [2024-06-10 15:50:46.156092] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:40.868 [2024-06-10 15:50:46.156133] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: 
*DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:40.868 [2024-06-10 15:50:46.156141] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1493c40 name raid_bdev1, state offline 00:11:40.868 15:50:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 2651027 00:11:40.868 [2024-06-10 15:50:46.172495] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:40.868 15:50:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:11:40.868 00:11:40.868 real 0m16.225s 00:11:40.868 user 0m30.195s 00:11:40.868 sys 0m2.279s 00:11:40.868 15:50:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:11:40.868 15:50:46 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:11:40.868 ************************************ 00:11:40.868 END TEST raid_superblock_test 00:11:40.868 ************************************ 00:11:41.127 15:50:46 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 2 read 00:11:41.127 15:50:46 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:11:41.127 15:50:46 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:11:41.127 15:50:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:41.127 ************************************ 00:11:41.127 START TEST raid_read_error_test 00:11:41.127 ************************************ 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 2 read 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i 
= 1 )) 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.FE7FYvJFin 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2654074 00:11:41.127 
15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2654074 /var/tmp/spdk-raid.sock 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 2654074 ']' 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:41.127 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:11:41.127 15:50:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:41.127 [2024-06-10 15:50:46.497720] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:11:41.127 [2024-06-10 15:50:46.497772] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2654074 ] 00:11:41.127 [2024-06-10 15:50:46.593962] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:41.386 [2024-06-10 15:50:46.687975] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:11:41.386 [2024-06-10 15:50:46.748035] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:41.386 [2024-06-10 15:50:46.748072] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:41.954 15:50:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:11:41.954 15:50:47 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:11:41.954 15:50:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:41.954 15:50:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:42.213 BaseBdev1_malloc 00:11:42.213 15:50:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:42.472 true 00:11:42.472 15:50:47 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:42.731 [2024-06-10 15:50:48.182275] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:42.731 [2024-06-10 15:50:48.182316] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:42.731 
[2024-06-10 15:50:48.182334] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb4d150 00:11:42.731 [2024-06-10 15:50:48.182343] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:42.731 [2024-06-10 15:50:48.184152] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:42.731 [2024-06-10 15:50:48.184182] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:42.731 BaseBdev1 00:11:42.731 15:50:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:42.731 15:50:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:42.991 BaseBdev2_malloc 00:11:42.991 15:50:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:43.250 true 00:11:43.250 15:50:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:43.509 [2024-06-10 15:50:48.948885] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:43.509 [2024-06-10 15:50:48.948926] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:43.509 [2024-06-10 15:50:48.948944] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xb51b50 00:11:43.509 [2024-06-10 15:50:48.948954] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:43.509 [2024-06-10 15:50:48.950545] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:43.509 [2024-06-10 15:50:48.950573] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:43.509 BaseBdev2 00:11:43.509 15:50:48 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:43.767 [2024-06-10 15:50:49.201581] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:43.767 [2024-06-10 15:50:49.202944] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:43.767 [2024-06-10 15:50:49.203139] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xb52e30 00:11:43.767 [2024-06-10 15:50:49.203152] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:43.767 [2024-06-10 15:50:49.203347] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb53110 00:11:43.767 [2024-06-10 15:50:49.203501] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xb52e30 00:11:43.767 [2024-06-10 15:50:49.203510] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xb52e30 00:11:43.767 [2024-06-10 15:50:49.203617] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:43.767 15:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:43.767 15:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:43.767 15:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:43.767 15:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:43.767 15:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:43.767 15:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:11:43.767 15:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:43.767 15:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:43.767 15:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:43.767 15:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:43.767 15:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:43.767 15:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:44.025 15:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:44.025 "name": "raid_bdev1", 00:11:44.025 "uuid": "d6738701-7a12-4243-aa5e-ec791380f214", 00:11:44.025 "strip_size_kb": 0, 00:11:44.025 "state": "online", 00:11:44.025 "raid_level": "raid1", 00:11:44.025 "superblock": true, 00:11:44.025 "num_base_bdevs": 2, 00:11:44.025 "num_base_bdevs_discovered": 2, 00:11:44.025 "num_base_bdevs_operational": 2, 00:11:44.025 "base_bdevs_list": [ 00:11:44.025 { 00:11:44.025 "name": "BaseBdev1", 00:11:44.025 "uuid": "a9ece139-6a8d-5e6e-8e43-b545b1ece4c5", 00:11:44.025 "is_configured": true, 00:11:44.025 "data_offset": 2048, 00:11:44.025 "data_size": 63488 00:11:44.025 }, 00:11:44.025 { 00:11:44.025 "name": "BaseBdev2", 00:11:44.025 "uuid": "b01abbf7-9cbb-5066-aea7-de7e912d7dca", 00:11:44.025 "is_configured": true, 00:11:44.025 "data_offset": 2048, 00:11:44.025 "data_size": 63488 00:11:44.025 } 00:11:44.025 ] 00:11:44.025 }' 00:11:44.025 15:50:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:44.025 15:50:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:44.592 15:50:50 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:11:44.592 15:50:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:44.850 [2024-06-10 15:50:50.192483] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb4e380 00:11:45.787 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:11:46.047 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:46.047 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:11:46.047 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:11:46.047 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=2 00:11:46.047 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:46.047 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:46.047 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:46.047 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:46.047 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:46.047 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:46.047 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:46.047 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:46.047 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 
-- # local num_base_bdevs_discovered 00:11:46.047 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:46.047 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:46.047 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:46.307 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:46.307 "name": "raid_bdev1", 00:11:46.307 "uuid": "d6738701-7a12-4243-aa5e-ec791380f214", 00:11:46.307 "strip_size_kb": 0, 00:11:46.307 "state": "online", 00:11:46.307 "raid_level": "raid1", 00:11:46.307 "superblock": true, 00:11:46.307 "num_base_bdevs": 2, 00:11:46.307 "num_base_bdevs_discovered": 2, 00:11:46.307 "num_base_bdevs_operational": 2, 00:11:46.307 "base_bdevs_list": [ 00:11:46.307 { 00:11:46.307 "name": "BaseBdev1", 00:11:46.307 "uuid": "a9ece139-6a8d-5e6e-8e43-b545b1ece4c5", 00:11:46.307 "is_configured": true, 00:11:46.307 "data_offset": 2048, 00:11:46.307 "data_size": 63488 00:11:46.307 }, 00:11:46.307 { 00:11:46.307 "name": "BaseBdev2", 00:11:46.307 "uuid": "b01abbf7-9cbb-5066-aea7-de7e912d7dca", 00:11:46.307 "is_configured": true, 00:11:46.307 "data_offset": 2048, 00:11:46.307 "data_size": 63488 00:11:46.307 } 00:11:46.307 ] 00:11:46.307 }' 00:11:46.307 15:50:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:46.307 15:50:51 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:46.874 15:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:47.131 [2024-06-10 15:50:52.477561] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:47.132 [2024-06-10 15:50:52.477589] 
bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:47.132 [2024-06-10 15:50:52.480975] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:47.132 [2024-06-10 15:50:52.481005] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:47.132 [2024-06-10 15:50:52.481083] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:47.132 [2024-06-10 15:50:52.481091] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xb52e30 name raid_bdev1, state offline 00:11:47.132 0 00:11:47.132 15:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2654074 00:11:47.132 15:50:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 2654074 ']' 00:11:47.132 15:50:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 2654074 00:11:47.132 15:50:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:11:47.132 15:50:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:11:47.132 15:50:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2654074 00:11:47.132 15:50:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:11:47.132 15:50:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:11:47.132 15:50:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2654074' 00:11:47.132 killing process with pid 2654074 00:11:47.132 15:50:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 2654074 00:11:47.132 [2024-06-10 15:50:52.546439] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:47.132 15:50:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 
-- # wait 2654074 00:11:47.132 [2024-06-10 15:50:52.557067] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:47.432 15:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.FE7FYvJFin 00:11:47.432 15:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:47.432 15:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:47.432 15:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:11:47.432 15:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:11:47.432 15:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:47.432 15:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:11:47.432 15:50:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:11:47.432 00:11:47.432 real 0m6.343s 00:11:47.432 user 0m10.181s 00:11:47.432 sys 0m0.919s 00:11:47.432 15:50:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:11:47.432 15:50:52 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:47.432 ************************************ 00:11:47.432 END TEST raid_read_error_test 00:11:47.432 ************************************ 00:11:47.432 15:50:52 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 2 write 00:11:47.432 15:50:52 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:11:47.432 15:50:52 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:11:47.432 15:50:52 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:47.432 ************************************ 00:11:47.432 START TEST raid_write_error_test 00:11:47.432 ************************************ 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # 
raid_io_error_test raid1 2 write 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=2 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # 
'[' raid1 '!=' raid1 ']' 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.26ZI2Xj7C0 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2655303 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2655303 /var/tmp/spdk-raid.sock 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:11:47.432 15:50:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 2655303 ']' 00:11:47.433 15:50:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:47.433 15:50:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:11:47.433 15:50:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:47.433 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:47.433 15:50:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:11:47.433 15:50:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:47.433 [2024-06-10 15:50:52.891791] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:11:47.433 [2024-06-10 15:50:52.891845] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2655303 ] 00:11:47.710 [2024-06-10 15:50:52.991122] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:47.710 [2024-06-10 15:50:53.085735] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:11:47.710 [2024-06-10 15:50:53.140851] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:47.710 [2024-06-10 15:50:53.140878] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:48.647 15:50:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:11:48.647 15:50:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:11:48.647 15:50:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:48.647 15:50:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:11:48.647 BaseBdev1_malloc 00:11:48.647 15:50:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:11:48.906 true 00:11:48.906 15:50:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:11:49.165 [2024-06-10 15:50:54.594300] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:11:49.165 [2024-06-10 15:50:54.594342] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:11:49.165 [2024-06-10 15:50:54.594360] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x247d150 00:11:49.165 [2024-06-10 15:50:54.594370] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:49.165 [2024-06-10 15:50:54.596201] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:49.165 [2024-06-10 15:50:54.596231] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:11:49.165 BaseBdev1 00:11:49.165 15:50:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:11:49.165 15:50:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:11:49.424 BaseBdev2_malloc 00:11:49.424 15:50:54 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:11:49.683 true 00:11:49.683 15:50:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:11:49.942 [2024-06-10 15:50:55.352879] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:11:49.943 [2024-06-10 15:50:55.352919] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:11:49.943 [2024-06-10 15:50:55.352937] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2481b50 00:11:49.943 [2024-06-10 15:50:55.352947] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:11:49.943 [2024-06-10 15:50:55.354534] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:11:49.943 [2024-06-10 15:50:55.354560] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:11:49.943 BaseBdev2 00:11:49.943 15:50:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 -s 00:11:50.201 [2024-06-10 15:50:55.605581] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:50.201 [2024-06-10 15:50:55.606933] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:50.202 [2024-06-10 15:50:55.607134] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2482e30 00:11:50.202 [2024-06-10 15:50:55.607147] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:11:50.202 [2024-06-10 15:50:55.607339] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2483110 00:11:50.202 [2024-06-10 15:50:55.607494] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2482e30 00:11:50.202 [2024-06-10 15:50:55.607503] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2482e30 00:11:50.202 [2024-06-10 15:50:55.607608] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:50.202 15:50:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:11:50.202 15:50:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:50.202 15:50:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:50.202 15:50:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:50.202 15:50:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:50.202 15:50:55 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:11:50.202 15:50:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:50.202 15:50:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:50.202 15:50:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:50.202 15:50:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:50.202 15:50:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:50.202 15:50:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:50.460 15:50:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:50.461 "name": "raid_bdev1", 00:11:50.461 "uuid": "4cb82f7a-b819-4e09-954a-235254332a9f", 00:11:50.461 "strip_size_kb": 0, 00:11:50.461 "state": "online", 00:11:50.461 "raid_level": "raid1", 00:11:50.461 "superblock": true, 00:11:50.461 "num_base_bdevs": 2, 00:11:50.461 "num_base_bdevs_discovered": 2, 00:11:50.461 "num_base_bdevs_operational": 2, 00:11:50.461 "base_bdevs_list": [ 00:11:50.461 { 00:11:50.461 "name": "BaseBdev1", 00:11:50.461 "uuid": "d4115833-f774-512d-9272-e28407520376", 00:11:50.461 "is_configured": true, 00:11:50.461 "data_offset": 2048, 00:11:50.461 "data_size": 63488 00:11:50.461 }, 00:11:50.461 { 00:11:50.461 "name": "BaseBdev2", 00:11:50.461 "uuid": "d780ebc3-e83f-5907-92b9-114dc23b1f1a", 00:11:50.461 "is_configured": true, 00:11:50.461 "data_offset": 2048, 00:11:50.461 "data_size": 63488 00:11:50.461 } 00:11:50.461 ] 00:11:50.461 }' 00:11:50.461 15:50:55 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:50.461 15:50:55 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:51.028 
15:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:11:51.028 15:50:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:11:51.287 [2024-06-10 15:50:56.616515] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x247e380 00:11:52.225 15:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:11:52.483 [2024-06-10 15:50:57.740860] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:11:52.483 [2024-06-10 15:50:57.740917] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:11:52.483 [2024-06-10 15:50:57.741093] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x247e380 00:11:52.483 15:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:11:52.484 15:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:11:52.484 15:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:11:52.484 15:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=1 00:11:52.484 15:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:11:52.484 15:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:11:52.484 15:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:11:52.484 15:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:11:52.484 15:50:57 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:11:52.484 15:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:11:52.484 15:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:52.484 15:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:52.484 15:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:52.484 15:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:52.484 15:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:52.484 15:50:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:11:52.743 15:50:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:52.743 "name": "raid_bdev1", 00:11:52.743 "uuid": "4cb82f7a-b819-4e09-954a-235254332a9f", 00:11:52.743 "strip_size_kb": 0, 00:11:52.743 "state": "online", 00:11:52.743 "raid_level": "raid1", 00:11:52.743 "superblock": true, 00:11:52.743 "num_base_bdevs": 2, 00:11:52.743 "num_base_bdevs_discovered": 1, 00:11:52.743 "num_base_bdevs_operational": 1, 00:11:52.743 "base_bdevs_list": [ 00:11:52.743 { 00:11:52.743 "name": null, 00:11:52.743 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:52.743 "is_configured": false, 00:11:52.743 "data_offset": 2048, 00:11:52.743 "data_size": 63488 00:11:52.743 }, 00:11:52.743 { 00:11:52.743 "name": "BaseBdev2", 00:11:52.743 "uuid": "d780ebc3-e83f-5907-92b9-114dc23b1f1a", 00:11:52.743 "is_configured": true, 00:11:52.743 "data_offset": 2048, 00:11:52.743 "data_size": 63488 00:11:52.743 } 00:11:52.743 ] 00:11:52.743 }' 00:11:52.743 15:50:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- 
# xtrace_disable 00:11:52.743 15:50:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:53.310 15:50:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:11:53.568 [2024-06-10 15:50:58.885184] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:11:53.568 [2024-06-10 15:50:58.885221] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:11:53.568 [2024-06-10 15:50:58.888594] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:11:53.568 [2024-06-10 15:50:58.888622] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:11:53.568 [2024-06-10 15:50:58.888675] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:11:53.568 [2024-06-10 15:50:58.888689] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2482e30 name raid_bdev1, state offline 00:11:53.569 0 00:11:53.569 15:50:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2655303 00:11:53.569 15:50:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 2655303 ']' 00:11:53.569 15:50:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 2655303 00:11:53.569 15:50:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:11:53.569 15:50:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:11:53.569 15:50:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2655303 00:11:53.569 15:50:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:11:53.569 15:50:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo 
']' 00:11:53.569 15:50:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2655303' 00:11:53.569 killing process with pid 2655303 00:11:53.569 15:50:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 2655303 00:11:53.569 [2024-06-10 15:50:58.948399] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:11:53.569 15:50:58 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 2655303 00:11:53.569 [2024-06-10 15:50:58.957896] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:11:53.827 15:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.26ZI2Xj7C0 00:11:53.827 15:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:11:53.827 15:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:11:53.827 15:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:11:53.827 15:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:11:53.827 15:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:11:53.827 15:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:11:53.827 15:50:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:11:53.827 00:11:53.827 real 0m6.343s 00:11:53.827 user 0m10.187s 00:11:53.827 sys 0m0.916s 00:11:53.827 15:50:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:11:53.827 15:50:59 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:11:53.827 ************************************ 00:11:53.827 END TEST raid_write_error_test 00:11:53.827 ************************************ 00:11:53.827 15:50:59 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:11:53.827 15:50:59 bdev_raid -- 
bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:11:53.827 15:50:59 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 3 false 00:11:53.828 15:50:59 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:11:53.828 15:50:59 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:11:53.828 15:50:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:11:53.828 ************************************ 00:11:53.828 START TEST raid_state_function_test 00:11:53.828 ************************************ 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 3 false 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 
00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2656326 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2656326' 00:11:53.828 Process raid pid: 2656326 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r 
/var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2656326 /var/tmp/spdk-raid.sock 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 2656326 ']' 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:11:53.828 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:11:53.828 15:50:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:53.828 [2024-06-10 15:50:59.286319] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:11:53.828 [2024-06-10 15:50:59.286371] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:11:54.087 [2024-06-10 15:50:59.385794] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:11:54.087 [2024-06-10 15:50:59.480347] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:11:54.087 [2024-06-10 15:50:59.543345] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:54.087 [2024-06-10 15:50:59.543377] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:11:55.023 15:51:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:11:55.023 15:51:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:11:55.023 15:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:55.023 [2024-06-10 15:51:00.471395] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:55.023 [2024-06-10 15:51:00.471438] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:55.023 [2024-06-10 15:51:00.471448] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:55.023 [2024-06-10 15:51:00.471457] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:55.023 [2024-06-10 15:51:00.471464] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:55.023 [2024-06-10 15:51:00.471472] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:55.023 15:51:00 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:55.023 15:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:55.023 15:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:55.023 15:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:55.023 15:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:55.023 15:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:55.023 15:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:55.023 15:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:55.023 15:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:55.023 15:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:55.023 15:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:55.023 15:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:55.282 15:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:55.282 "name": "Existed_Raid", 00:11:55.282 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:55.282 "strip_size_kb": 64, 00:11:55.282 "state": "configuring", 00:11:55.282 "raid_level": "raid0", 00:11:55.282 "superblock": false, 00:11:55.282 "num_base_bdevs": 3, 00:11:55.282 "num_base_bdevs_discovered": 0, 00:11:55.282 "num_base_bdevs_operational": 3, 00:11:55.282 "base_bdevs_list": [ 00:11:55.282 { 
00:11:55.282 "name": "BaseBdev1", 00:11:55.282 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:55.282 "is_configured": false, 00:11:55.282 "data_offset": 0, 00:11:55.282 "data_size": 0 00:11:55.282 }, 00:11:55.282 { 00:11:55.282 "name": "BaseBdev2", 00:11:55.282 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:55.282 "is_configured": false, 00:11:55.282 "data_offset": 0, 00:11:55.282 "data_size": 0 00:11:55.282 }, 00:11:55.282 { 00:11:55.282 "name": "BaseBdev3", 00:11:55.282 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:55.282 "is_configured": false, 00:11:55.282 "data_offset": 0, 00:11:55.282 "data_size": 0 00:11:55.282 } 00:11:55.282 ] 00:11:55.282 }' 00:11:55.282 15:51:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:55.282 15:51:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:56.218 15:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:56.218 [2024-06-10 15:51:01.610306] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:56.218 [2024-06-10 15:51:01.610336] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21db120 name Existed_Raid, state configuring 00:11:56.218 15:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:56.476 [2024-06-10 15:51:01.867006] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:11:56.476 [2024-06-10 15:51:01.867036] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:11:56.476 [2024-06-10 15:51:01.867045] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with 
name: BaseBdev2 00:11:56.476 [2024-06-10 15:51:01.867053] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:56.476 [2024-06-10 15:51:01.867060] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:56.476 [2024-06-10 15:51:01.867068] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:56.476 15:51:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:11:56.735 [2024-06-10 15:51:02.133312] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:56.735 BaseBdev1 00:11:56.735 15:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:11:56.735 15:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:11:56.735 15:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:56.735 15:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:11:56.735 15:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:56.735 15:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:56.735 15:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:11:56.993 15:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:11:57.252 [ 00:11:57.252 { 00:11:57.252 "name": "BaseBdev1", 00:11:57.252 "aliases": [ 00:11:57.252 
"89aa35e7-fe3b-4c6c-a114-0b39549813fc" 00:11:57.252 ], 00:11:57.252 "product_name": "Malloc disk", 00:11:57.252 "block_size": 512, 00:11:57.252 "num_blocks": 65536, 00:11:57.252 "uuid": "89aa35e7-fe3b-4c6c-a114-0b39549813fc", 00:11:57.252 "assigned_rate_limits": { 00:11:57.252 "rw_ios_per_sec": 0, 00:11:57.252 "rw_mbytes_per_sec": 0, 00:11:57.252 "r_mbytes_per_sec": 0, 00:11:57.252 "w_mbytes_per_sec": 0 00:11:57.252 }, 00:11:57.252 "claimed": true, 00:11:57.252 "claim_type": "exclusive_write", 00:11:57.252 "zoned": false, 00:11:57.252 "supported_io_types": { 00:11:57.252 "read": true, 00:11:57.252 "write": true, 00:11:57.252 "unmap": true, 00:11:57.252 "write_zeroes": true, 00:11:57.252 "flush": true, 00:11:57.252 "reset": true, 00:11:57.252 "compare": false, 00:11:57.252 "compare_and_write": false, 00:11:57.252 "abort": true, 00:11:57.252 "nvme_admin": false, 00:11:57.252 "nvme_io": false 00:11:57.252 }, 00:11:57.252 "memory_domains": [ 00:11:57.252 { 00:11:57.252 "dma_device_id": "system", 00:11:57.252 "dma_device_type": 1 00:11:57.252 }, 00:11:57.252 { 00:11:57.252 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:11:57.252 "dma_device_type": 2 00:11:57.252 } 00:11:57.252 ], 00:11:57.252 "driver_specific": {} 00:11:57.252 } 00:11:57.252 ] 00:11:57.252 15:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:11:57.252 15:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:11:57.252 15:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:57.252 15:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:57.252 15:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:57.252 15:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:57.252 15:51:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:57.252 15:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:57.252 15:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:57.252 15:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:57.252 15:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:57.252 15:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:57.252 15:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:57.510 15:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:57.510 "name": "Existed_Raid", 00:11:57.510 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:57.511 "strip_size_kb": 64, 00:11:57.511 "state": "configuring", 00:11:57.511 "raid_level": "raid0", 00:11:57.511 "superblock": false, 00:11:57.511 "num_base_bdevs": 3, 00:11:57.511 "num_base_bdevs_discovered": 1, 00:11:57.511 "num_base_bdevs_operational": 3, 00:11:57.511 "base_bdevs_list": [ 00:11:57.511 { 00:11:57.511 "name": "BaseBdev1", 00:11:57.511 "uuid": "89aa35e7-fe3b-4c6c-a114-0b39549813fc", 00:11:57.511 "is_configured": true, 00:11:57.511 "data_offset": 0, 00:11:57.511 "data_size": 65536 00:11:57.511 }, 00:11:57.511 { 00:11:57.511 "name": "BaseBdev2", 00:11:57.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:57.511 "is_configured": false, 00:11:57.511 "data_offset": 0, 00:11:57.511 "data_size": 0 00:11:57.511 }, 00:11:57.511 { 00:11:57.511 "name": "BaseBdev3", 00:11:57.511 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:57.511 "is_configured": false, 00:11:57.511 "data_offset": 0, 
00:11:57.511 "data_size": 0 00:11:57.511 } 00:11:57.511 ] 00:11:57.511 }' 00:11:57.511 15:51:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:57.511 15:51:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:58.078 15:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:11:58.336 [2024-06-10 15:51:03.773706] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:11:58.336 [2024-06-10 15:51:03.773744] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21da9b0 name Existed_Raid, state configuring 00:11:58.336 15:51:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:11:58.595 [2024-06-10 15:51:04.030427] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:11:58.595 [2024-06-10 15:51:04.031953] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:11:58.595 [2024-06-10 15:51:04.031991] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:11:58.595 [2024-06-10 15:51:04.031999] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:11:58.595 [2024-06-10 15:51:04.032007] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:11:58.595 15:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:11:58.595 15:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:11:58.595 15:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid 
configuring raid0 64 3 00:11:58.595 15:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:11:58.595 15:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:11:58.595 15:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:11:58.595 15:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:11:58.595 15:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:11:58.595 15:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:11:58.595 15:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:11:58.595 15:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:11:58.595 15:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:11:58.595 15:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:11:58.595 15:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:11:58.853 15:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:11:58.853 "name": "Existed_Raid", 00:11:58.853 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:58.853 "strip_size_kb": 64, 00:11:58.853 "state": "configuring", 00:11:58.853 "raid_level": "raid0", 00:11:58.853 "superblock": false, 00:11:58.853 "num_base_bdevs": 3, 00:11:58.853 "num_base_bdevs_discovered": 1, 00:11:58.853 "num_base_bdevs_operational": 3, 00:11:58.853 "base_bdevs_list": [ 00:11:58.853 { 00:11:58.853 "name": "BaseBdev1", 00:11:58.853 "uuid": "89aa35e7-fe3b-4c6c-a114-0b39549813fc", 00:11:58.853 
"is_configured": true, 00:11:58.853 "data_offset": 0, 00:11:58.853 "data_size": 65536 00:11:58.853 }, 00:11:58.853 { 00:11:58.853 "name": "BaseBdev2", 00:11:58.853 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:58.853 "is_configured": false, 00:11:58.853 "data_offset": 0, 00:11:58.853 "data_size": 0 00:11:58.853 }, 00:11:58.853 { 00:11:58.853 "name": "BaseBdev3", 00:11:58.853 "uuid": "00000000-0000-0000-0000-000000000000", 00:11:58.853 "is_configured": false, 00:11:58.853 "data_offset": 0, 00:11:58.853 "data_size": 0 00:11:58.853 } 00:11:58.853 ] 00:11:58.853 }' 00:11:58.853 15:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:11:58.853 15:51:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:11:59.420 15:51:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:11:59.679 [2024-06-10 15:51:05.152818] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:11:59.679 BaseBdev2 00:11:59.679 15:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:11:59.679 15:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:11:59.680 15:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:11:59.680 15:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:11:59.680 15:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:11:59.680 15:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:11:59.680 15:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_wait_for_examine 00:11:59.939 15:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:00.198 [ 00:12:00.198 { 00:12:00.198 "name": "BaseBdev2", 00:12:00.198 "aliases": [ 00:12:00.198 "d485ef27-a90e-461c-b571-0b22f6f7c60a" 00:12:00.198 ], 00:12:00.198 "product_name": "Malloc disk", 00:12:00.198 "block_size": 512, 00:12:00.198 "num_blocks": 65536, 00:12:00.198 "uuid": "d485ef27-a90e-461c-b571-0b22f6f7c60a", 00:12:00.198 "assigned_rate_limits": { 00:12:00.198 "rw_ios_per_sec": 0, 00:12:00.198 "rw_mbytes_per_sec": 0, 00:12:00.198 "r_mbytes_per_sec": 0, 00:12:00.198 "w_mbytes_per_sec": 0 00:12:00.198 }, 00:12:00.198 "claimed": true, 00:12:00.198 "claim_type": "exclusive_write", 00:12:00.198 "zoned": false, 00:12:00.198 "supported_io_types": { 00:12:00.198 "read": true, 00:12:00.198 "write": true, 00:12:00.198 "unmap": true, 00:12:00.198 "write_zeroes": true, 00:12:00.198 "flush": true, 00:12:00.198 "reset": true, 00:12:00.198 "compare": false, 00:12:00.198 "compare_and_write": false, 00:12:00.198 "abort": true, 00:12:00.198 "nvme_admin": false, 00:12:00.198 "nvme_io": false 00:12:00.198 }, 00:12:00.198 "memory_domains": [ 00:12:00.198 { 00:12:00.198 "dma_device_id": "system", 00:12:00.198 "dma_device_type": 1 00:12:00.198 }, 00:12:00.198 { 00:12:00.198 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:00.198 "dma_device_type": 2 00:12:00.198 } 00:12:00.198 ], 00:12:00.198 "driver_specific": {} 00:12:00.198 } 00:12:00.198 ] 00:12:00.198 15:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:00.198 15:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:00.198 15:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:00.198 15:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 
-- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:00.198 15:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:00.198 15:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:00.198 15:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:00.198 15:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:00.198 15:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:00.198 15:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:00.198 15:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:00.198 15:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:00.198 15:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:00.198 15:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:00.198 15:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:00.457 15:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:00.457 "name": "Existed_Raid", 00:12:00.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:00.457 "strip_size_kb": 64, 00:12:00.457 "state": "configuring", 00:12:00.457 "raid_level": "raid0", 00:12:00.457 "superblock": false, 00:12:00.457 "num_base_bdevs": 3, 00:12:00.457 "num_base_bdevs_discovered": 2, 00:12:00.457 "num_base_bdevs_operational": 3, 00:12:00.457 "base_bdevs_list": [ 00:12:00.457 { 00:12:00.457 "name": "BaseBdev1", 00:12:00.457 "uuid": 
"89aa35e7-fe3b-4c6c-a114-0b39549813fc", 00:12:00.457 "is_configured": true, 00:12:00.457 "data_offset": 0, 00:12:00.457 "data_size": 65536 00:12:00.457 }, 00:12:00.457 { 00:12:00.457 "name": "BaseBdev2", 00:12:00.457 "uuid": "d485ef27-a90e-461c-b571-0b22f6f7c60a", 00:12:00.457 "is_configured": true, 00:12:00.457 "data_offset": 0, 00:12:00.457 "data_size": 65536 00:12:00.457 }, 00:12:00.457 { 00:12:00.457 "name": "BaseBdev3", 00:12:00.457 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:00.457 "is_configured": false, 00:12:00.457 "data_offset": 0, 00:12:00.457 "data_size": 0 00:12:00.457 } 00:12:00.457 ] 00:12:00.457 }' 00:12:00.457 15:51:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:00.457 15:51:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:01.026 15:51:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:01.285 [2024-06-10 15:51:06.716360] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:01.285 [2024-06-10 15:51:06.716395] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21db8c0 00:12:01.285 [2024-06-10 15:51:06.716401] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:01.285 [2024-06-10 15:51:06.716597] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x21f2830 00:12:01.285 [2024-06-10 15:51:06.716723] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21db8c0 00:12:01.285 [2024-06-10 15:51:06.716731] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x21db8c0 00:12:01.285 [2024-06-10 15:51:06.716898] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:01.285 BaseBdev3 00:12:01.285 15:51:06 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:12:01.285 15:51:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:12:01.285 15:51:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:01.285 15:51:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:01.285 15:51:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:01.285 15:51:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:01.285 15:51:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:01.544 15:51:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:01.803 [ 00:12:01.803 { 00:12:01.803 "name": "BaseBdev3", 00:12:01.803 "aliases": [ 00:12:01.803 "8bf6d709-cca9-4ea1-997d-d77e341ee839" 00:12:01.803 ], 00:12:01.803 "product_name": "Malloc disk", 00:12:01.803 "block_size": 512, 00:12:01.803 "num_blocks": 65536, 00:12:01.803 "uuid": "8bf6d709-cca9-4ea1-997d-d77e341ee839", 00:12:01.803 "assigned_rate_limits": { 00:12:01.803 "rw_ios_per_sec": 0, 00:12:01.803 "rw_mbytes_per_sec": 0, 00:12:01.803 "r_mbytes_per_sec": 0, 00:12:01.803 "w_mbytes_per_sec": 0 00:12:01.803 }, 00:12:01.803 "claimed": true, 00:12:01.803 "claim_type": "exclusive_write", 00:12:01.803 "zoned": false, 00:12:01.803 "supported_io_types": { 00:12:01.803 "read": true, 00:12:01.803 "write": true, 00:12:01.803 "unmap": true, 00:12:01.803 "write_zeroes": true, 00:12:01.803 "flush": true, 00:12:01.803 "reset": true, 00:12:01.803 "compare": false, 00:12:01.803 "compare_and_write": false, 
00:12:01.803 "abort": true, 00:12:01.803 "nvme_admin": false, 00:12:01.803 "nvme_io": false 00:12:01.803 }, 00:12:01.803 "memory_domains": [ 00:12:01.803 { 00:12:01.803 "dma_device_id": "system", 00:12:01.803 "dma_device_type": 1 00:12:01.803 }, 00:12:01.803 { 00:12:01.803 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:01.803 "dma_device_type": 2 00:12:01.803 } 00:12:01.803 ], 00:12:01.803 "driver_specific": {} 00:12:01.803 } 00:12:01.803 ] 00:12:01.804 15:51:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:01.804 15:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:01.804 15:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:01.804 15:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:01.804 15:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:01.804 15:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:01.804 15:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:01.804 15:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:01.804 15:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:01.804 15:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:01.804 15:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:01.804 15:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:01.804 15:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:01.804 15:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:01.804 15:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:02.092 15:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:02.092 "name": "Existed_Raid", 00:12:02.092 "uuid": "8f442748-09ff-4c07-8956-ecb6a5f3d6cf", 00:12:02.092 "strip_size_kb": 64, 00:12:02.092 "state": "online", 00:12:02.092 "raid_level": "raid0", 00:12:02.092 "superblock": false, 00:12:02.092 "num_base_bdevs": 3, 00:12:02.092 "num_base_bdevs_discovered": 3, 00:12:02.092 "num_base_bdevs_operational": 3, 00:12:02.092 "base_bdevs_list": [ 00:12:02.092 { 00:12:02.092 "name": "BaseBdev1", 00:12:02.092 "uuid": "89aa35e7-fe3b-4c6c-a114-0b39549813fc", 00:12:02.092 "is_configured": true, 00:12:02.092 "data_offset": 0, 00:12:02.092 "data_size": 65536 00:12:02.092 }, 00:12:02.092 { 00:12:02.092 "name": "BaseBdev2", 00:12:02.092 "uuid": "d485ef27-a90e-461c-b571-0b22f6f7c60a", 00:12:02.092 "is_configured": true, 00:12:02.092 "data_offset": 0, 00:12:02.092 "data_size": 65536 00:12:02.092 }, 00:12:02.092 { 00:12:02.092 "name": "BaseBdev3", 00:12:02.092 "uuid": "8bf6d709-cca9-4ea1-997d-d77e341ee839", 00:12:02.092 "is_configured": true, 00:12:02.092 "data_offset": 0, 00:12:02.092 "data_size": 65536 00:12:02.092 } 00:12:02.092 ] 00:12:02.092 }' 00:12:02.092 15:51:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:02.092 15:51:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:02.660 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:02.660 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:02.660 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local 
raid_bdev_info 00:12:02.660 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:02.660 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:02.660 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:02.660 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:02.660 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:02.919 [2024-06-10 15:51:08.240744] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:02.919 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:02.919 "name": "Existed_Raid", 00:12:02.919 "aliases": [ 00:12:02.919 "8f442748-09ff-4c07-8956-ecb6a5f3d6cf" 00:12:02.919 ], 00:12:02.919 "product_name": "Raid Volume", 00:12:02.919 "block_size": 512, 00:12:02.919 "num_blocks": 196608, 00:12:02.919 "uuid": "8f442748-09ff-4c07-8956-ecb6a5f3d6cf", 00:12:02.919 "assigned_rate_limits": { 00:12:02.919 "rw_ios_per_sec": 0, 00:12:02.919 "rw_mbytes_per_sec": 0, 00:12:02.919 "r_mbytes_per_sec": 0, 00:12:02.919 "w_mbytes_per_sec": 0 00:12:02.919 }, 00:12:02.919 "claimed": false, 00:12:02.919 "zoned": false, 00:12:02.919 "supported_io_types": { 00:12:02.919 "read": true, 00:12:02.919 "write": true, 00:12:02.919 "unmap": true, 00:12:02.919 "write_zeroes": true, 00:12:02.919 "flush": true, 00:12:02.919 "reset": true, 00:12:02.919 "compare": false, 00:12:02.919 "compare_and_write": false, 00:12:02.919 "abort": false, 00:12:02.919 "nvme_admin": false, 00:12:02.919 "nvme_io": false 00:12:02.919 }, 00:12:02.919 "memory_domains": [ 00:12:02.919 { 00:12:02.919 "dma_device_id": "system", 00:12:02.919 "dma_device_type": 1 00:12:02.919 }, 00:12:02.919 { 00:12:02.919 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:12:02.919 "dma_device_type": 2 00:12:02.919 }, 00:12:02.919 { 00:12:02.919 "dma_device_id": "system", 00:12:02.919 "dma_device_type": 1 00:12:02.919 }, 00:12:02.919 { 00:12:02.919 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:02.919 "dma_device_type": 2 00:12:02.919 }, 00:12:02.919 { 00:12:02.919 "dma_device_id": "system", 00:12:02.919 "dma_device_type": 1 00:12:02.919 }, 00:12:02.919 { 00:12:02.919 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:02.919 "dma_device_type": 2 00:12:02.919 } 00:12:02.919 ], 00:12:02.919 "driver_specific": { 00:12:02.919 "raid": { 00:12:02.919 "uuid": "8f442748-09ff-4c07-8956-ecb6a5f3d6cf", 00:12:02.919 "strip_size_kb": 64, 00:12:02.919 "state": "online", 00:12:02.919 "raid_level": "raid0", 00:12:02.919 "superblock": false, 00:12:02.919 "num_base_bdevs": 3, 00:12:02.919 "num_base_bdevs_discovered": 3, 00:12:02.919 "num_base_bdevs_operational": 3, 00:12:02.919 "base_bdevs_list": [ 00:12:02.919 { 00:12:02.919 "name": "BaseBdev1", 00:12:02.920 "uuid": "89aa35e7-fe3b-4c6c-a114-0b39549813fc", 00:12:02.920 "is_configured": true, 00:12:02.920 "data_offset": 0, 00:12:02.920 "data_size": 65536 00:12:02.920 }, 00:12:02.920 { 00:12:02.920 "name": "BaseBdev2", 00:12:02.920 "uuid": "d485ef27-a90e-461c-b571-0b22f6f7c60a", 00:12:02.920 "is_configured": true, 00:12:02.920 "data_offset": 0, 00:12:02.920 "data_size": 65536 00:12:02.920 }, 00:12:02.920 { 00:12:02.920 "name": "BaseBdev3", 00:12:02.920 "uuid": "8bf6d709-cca9-4ea1-997d-d77e341ee839", 00:12:02.920 "is_configured": true, 00:12:02.920 "data_offset": 0, 00:12:02.920 "data_size": 65536 00:12:02.920 } 00:12:02.920 ] 00:12:02.920 } 00:12:02.920 } 00:12:02.920 }' 00:12:02.920 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:02.920 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:02.920 BaseBdev2 
00:12:02.920 BaseBdev3' 00:12:02.920 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:02.920 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:02.920 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:03.179 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:03.179 "name": "BaseBdev1", 00:12:03.179 "aliases": [ 00:12:03.179 "89aa35e7-fe3b-4c6c-a114-0b39549813fc" 00:12:03.179 ], 00:12:03.179 "product_name": "Malloc disk", 00:12:03.179 "block_size": 512, 00:12:03.179 "num_blocks": 65536, 00:12:03.179 "uuid": "89aa35e7-fe3b-4c6c-a114-0b39549813fc", 00:12:03.179 "assigned_rate_limits": { 00:12:03.179 "rw_ios_per_sec": 0, 00:12:03.179 "rw_mbytes_per_sec": 0, 00:12:03.179 "r_mbytes_per_sec": 0, 00:12:03.179 "w_mbytes_per_sec": 0 00:12:03.179 }, 00:12:03.179 "claimed": true, 00:12:03.179 "claim_type": "exclusive_write", 00:12:03.179 "zoned": false, 00:12:03.179 "supported_io_types": { 00:12:03.179 "read": true, 00:12:03.179 "write": true, 00:12:03.179 "unmap": true, 00:12:03.179 "write_zeroes": true, 00:12:03.179 "flush": true, 00:12:03.179 "reset": true, 00:12:03.179 "compare": false, 00:12:03.179 "compare_and_write": false, 00:12:03.179 "abort": true, 00:12:03.179 "nvme_admin": false, 00:12:03.179 "nvme_io": false 00:12:03.179 }, 00:12:03.179 "memory_domains": [ 00:12:03.179 { 00:12:03.179 "dma_device_id": "system", 00:12:03.179 "dma_device_type": 1 00:12:03.179 }, 00:12:03.179 { 00:12:03.179 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.179 "dma_device_type": 2 00:12:03.179 } 00:12:03.179 ], 00:12:03.179 "driver_specific": {} 00:12:03.179 }' 00:12:03.179 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.179 15:51:08 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.179 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:03.179 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.438 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.438 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:03.438 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.438 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.438 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:03.438 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:03.438 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:03.438 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:03.438 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:03.438 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:03.438 15:51:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:03.697 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:03.697 "name": "BaseBdev2", 00:12:03.697 "aliases": [ 00:12:03.697 "d485ef27-a90e-461c-b571-0b22f6f7c60a" 00:12:03.697 ], 00:12:03.697 "product_name": "Malloc disk", 00:12:03.697 "block_size": 512, 00:12:03.697 "num_blocks": 65536, 00:12:03.697 "uuid": "d485ef27-a90e-461c-b571-0b22f6f7c60a", 00:12:03.697 "assigned_rate_limits": { 00:12:03.697 
"rw_ios_per_sec": 0, 00:12:03.697 "rw_mbytes_per_sec": 0, 00:12:03.697 "r_mbytes_per_sec": 0, 00:12:03.697 "w_mbytes_per_sec": 0 00:12:03.697 }, 00:12:03.697 "claimed": true, 00:12:03.697 "claim_type": "exclusive_write", 00:12:03.697 "zoned": false, 00:12:03.697 "supported_io_types": { 00:12:03.697 "read": true, 00:12:03.697 "write": true, 00:12:03.697 "unmap": true, 00:12:03.697 "write_zeroes": true, 00:12:03.697 "flush": true, 00:12:03.697 "reset": true, 00:12:03.697 "compare": false, 00:12:03.697 "compare_and_write": false, 00:12:03.697 "abort": true, 00:12:03.697 "nvme_admin": false, 00:12:03.697 "nvme_io": false 00:12:03.697 }, 00:12:03.697 "memory_domains": [ 00:12:03.697 { 00:12:03.697 "dma_device_id": "system", 00:12:03.697 "dma_device_type": 1 00:12:03.697 }, 00:12:03.697 { 00:12:03.697 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:03.697 "dma_device_type": 2 00:12:03.697 } 00:12:03.697 ], 00:12:03.697 "driver_specific": {} 00:12:03.697 }' 00:12:03.697 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.956 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:03.956 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:03.956 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.956 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:03.956 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:03.956 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.956 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:03.956 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:03.956 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:12:04.216 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.216 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:04.216 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:04.216 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:04.216 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:04.475 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:04.475 "name": "BaseBdev3", 00:12:04.475 "aliases": [ 00:12:04.475 "8bf6d709-cca9-4ea1-997d-d77e341ee839" 00:12:04.475 ], 00:12:04.475 "product_name": "Malloc disk", 00:12:04.475 "block_size": 512, 00:12:04.475 "num_blocks": 65536, 00:12:04.475 "uuid": "8bf6d709-cca9-4ea1-997d-d77e341ee839", 00:12:04.475 "assigned_rate_limits": { 00:12:04.475 "rw_ios_per_sec": 0, 00:12:04.475 "rw_mbytes_per_sec": 0, 00:12:04.475 "r_mbytes_per_sec": 0, 00:12:04.475 "w_mbytes_per_sec": 0 00:12:04.475 }, 00:12:04.475 "claimed": true, 00:12:04.475 "claim_type": "exclusive_write", 00:12:04.475 "zoned": false, 00:12:04.475 "supported_io_types": { 00:12:04.475 "read": true, 00:12:04.475 "write": true, 00:12:04.475 "unmap": true, 00:12:04.475 "write_zeroes": true, 00:12:04.475 "flush": true, 00:12:04.475 "reset": true, 00:12:04.475 "compare": false, 00:12:04.475 "compare_and_write": false, 00:12:04.475 "abort": true, 00:12:04.475 "nvme_admin": false, 00:12:04.475 "nvme_io": false 00:12:04.475 }, 00:12:04.475 "memory_domains": [ 00:12:04.475 { 00:12:04.475 "dma_device_id": "system", 00:12:04.475 "dma_device_type": 1 00:12:04.475 }, 00:12:04.475 { 00:12:04.475 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:04.475 "dma_device_type": 2 00:12:04.475 } 00:12:04.475 ], 
00:12:04.475 "driver_specific": {} 00:12:04.475 }' 00:12:04.475 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.475 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:04.475 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:04.475 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.475 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:04.734 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:04.734 15:51:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.734 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:04.734 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:04.734 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.734 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:04.734 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:04.734 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:04.993 [2024-06-10 15:51:10.386443] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:04.993 [2024-06-10 15:51:10.386467] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:04.993 [2024-06-10 15:51:10.386508] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:04.993 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:04.993 15:51:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:12:04.993 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:04.993 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:12:04.993 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:04.993 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:12:04.993 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:04.993 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:04.993 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:04.993 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:04.993 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:04.993 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:04.993 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:04.993 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:04.993 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:04.993 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:04.993 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:05.252 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:05.252 "name": 
"Existed_Raid", 00:12:05.252 "uuid": "8f442748-09ff-4c07-8956-ecb6a5f3d6cf", 00:12:05.252 "strip_size_kb": 64, 00:12:05.252 "state": "offline", 00:12:05.252 "raid_level": "raid0", 00:12:05.252 "superblock": false, 00:12:05.252 "num_base_bdevs": 3, 00:12:05.252 "num_base_bdevs_discovered": 2, 00:12:05.252 "num_base_bdevs_operational": 2, 00:12:05.252 "base_bdevs_list": [ 00:12:05.252 { 00:12:05.252 "name": null, 00:12:05.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:05.252 "is_configured": false, 00:12:05.252 "data_offset": 0, 00:12:05.252 "data_size": 65536 00:12:05.252 }, 00:12:05.252 { 00:12:05.252 "name": "BaseBdev2", 00:12:05.252 "uuid": "d485ef27-a90e-461c-b571-0b22f6f7c60a", 00:12:05.252 "is_configured": true, 00:12:05.252 "data_offset": 0, 00:12:05.252 "data_size": 65536 00:12:05.252 }, 00:12:05.252 { 00:12:05.252 "name": "BaseBdev3", 00:12:05.252 "uuid": "8bf6d709-cca9-4ea1-997d-d77e341ee839", 00:12:05.252 "is_configured": true, 00:12:05.252 "data_offset": 0, 00:12:05.252 "data_size": 65536 00:12:05.252 } 00:12:05.252 ] 00:12:05.252 }' 00:12:05.252 15:51:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:05.252 15:51:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:05.819 15:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:05.819 15:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:05.819 15:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:05.819 15:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:06.078 15:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:06.078 15:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' 
Existed_Raid '!=' Existed_Raid ']' 00:12:06.078 15:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:06.337 [2024-06-10 15:51:11.735262] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:06.337 15:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:06.337 15:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:06.337 15:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.337 15:51:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:06.596 15:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:06.596 15:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:06.596 15:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:06.853 [2024-06-10 15:51:12.251238] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:06.853 [2024-06-10 15:51:12.251279] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21db8c0 name Existed_Raid, state offline 00:12:06.853 15:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:06.853 15:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:06.853 15:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:06.853 15:51:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:07.111 15:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:07.111 15:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:07.111 15:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:12:07.111 15:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:12:07.111 15:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:07.111 15:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:07.369 BaseBdev2 00:12:07.369 15:51:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:12:07.369 15:51:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:12:07.369 15:51:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:07.369 15:51:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:07.369 15:51:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:07.369 15:51:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:07.369 15:51:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:07.627 15:51:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:07.886 [ 00:12:07.886 { 00:12:07.886 "name": 
"BaseBdev2", 00:12:07.886 "aliases": [ 00:12:07.886 "f98b1f3c-ac88-4b02-a3ef-b9757da66877" 00:12:07.886 ], 00:12:07.886 "product_name": "Malloc disk", 00:12:07.886 "block_size": 512, 00:12:07.886 "num_blocks": 65536, 00:12:07.886 "uuid": "f98b1f3c-ac88-4b02-a3ef-b9757da66877", 00:12:07.886 "assigned_rate_limits": { 00:12:07.886 "rw_ios_per_sec": 0, 00:12:07.886 "rw_mbytes_per_sec": 0, 00:12:07.886 "r_mbytes_per_sec": 0, 00:12:07.886 "w_mbytes_per_sec": 0 00:12:07.886 }, 00:12:07.886 "claimed": false, 00:12:07.886 "zoned": false, 00:12:07.886 "supported_io_types": { 00:12:07.886 "read": true, 00:12:07.886 "write": true, 00:12:07.886 "unmap": true, 00:12:07.886 "write_zeroes": true, 00:12:07.886 "flush": true, 00:12:07.886 "reset": true, 00:12:07.886 "compare": false, 00:12:07.886 "compare_and_write": false, 00:12:07.886 "abort": true, 00:12:07.886 "nvme_admin": false, 00:12:07.886 "nvme_io": false 00:12:07.886 }, 00:12:07.886 "memory_domains": [ 00:12:07.886 { 00:12:07.886 "dma_device_id": "system", 00:12:07.886 "dma_device_type": 1 00:12:07.886 }, 00:12:07.886 { 00:12:07.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:07.886 "dma_device_type": 2 00:12:07.886 } 00:12:07.886 ], 00:12:07.886 "driver_specific": {} 00:12:07.886 } 00:12:07.886 ] 00:12:07.886 15:51:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:07.886 15:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:07.886 15:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:07.886 15:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:08.145 BaseBdev3 00:12:08.145 15:51:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:12:08.145 15:51:13 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:12:08.145 15:51:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:08.145 15:51:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:08.145 15:51:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:08.145 15:51:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:08.145 15:51:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:08.403 15:51:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:08.662 [ 00:12:08.662 { 00:12:08.662 "name": "BaseBdev3", 00:12:08.662 "aliases": [ 00:12:08.662 "854aee89-b460-4f73-8789-a82432f4cde0" 00:12:08.662 ], 00:12:08.662 "product_name": "Malloc disk", 00:12:08.662 "block_size": 512, 00:12:08.662 "num_blocks": 65536, 00:12:08.662 "uuid": "854aee89-b460-4f73-8789-a82432f4cde0", 00:12:08.662 "assigned_rate_limits": { 00:12:08.662 "rw_ios_per_sec": 0, 00:12:08.662 "rw_mbytes_per_sec": 0, 00:12:08.662 "r_mbytes_per_sec": 0, 00:12:08.662 "w_mbytes_per_sec": 0 00:12:08.662 }, 00:12:08.662 "claimed": false, 00:12:08.662 "zoned": false, 00:12:08.662 "supported_io_types": { 00:12:08.662 "read": true, 00:12:08.662 "write": true, 00:12:08.662 "unmap": true, 00:12:08.662 "write_zeroes": true, 00:12:08.662 "flush": true, 00:12:08.662 "reset": true, 00:12:08.662 "compare": false, 00:12:08.662 "compare_and_write": false, 00:12:08.662 "abort": true, 00:12:08.662 "nvme_admin": false, 00:12:08.662 "nvme_io": false 00:12:08.662 }, 00:12:08.662 "memory_domains": [ 00:12:08.662 { 00:12:08.662 "dma_device_id": "system", 
00:12:08.662 "dma_device_type": 1 00:12:08.662 }, 00:12:08.662 { 00:12:08.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:08.662 "dma_device_type": 2 00:12:08.662 } 00:12:08.662 ], 00:12:08.662 "driver_specific": {} 00:12:08.662 } 00:12:08.662 ] 00:12:08.662 15:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:08.662 15:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:08.662 15:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:08.662 15:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:08.920 [2024-06-10 15:51:14.303789] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:08.920 [2024-06-10 15:51:14.303827] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:08.920 [2024-06-10 15:51:14.303844] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:08.920 [2024-06-10 15:51:14.305237] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:08.920 15:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:08.920 15:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:08.920 15:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:08.920 15:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:08.920 15:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:08.920 15:51:14 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:08.920 15:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:08.920 15:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:08.920 15:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:08.920 15:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:08.920 15:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:08.920 15:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:09.179 15:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:09.179 "name": "Existed_Raid", 00:12:09.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:09.179 "strip_size_kb": 64, 00:12:09.179 "state": "configuring", 00:12:09.179 "raid_level": "raid0", 00:12:09.179 "superblock": false, 00:12:09.179 "num_base_bdevs": 3, 00:12:09.179 "num_base_bdevs_discovered": 2, 00:12:09.179 "num_base_bdevs_operational": 3, 00:12:09.179 "base_bdevs_list": [ 00:12:09.179 { 00:12:09.179 "name": "BaseBdev1", 00:12:09.179 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:09.179 "is_configured": false, 00:12:09.179 "data_offset": 0, 00:12:09.179 "data_size": 0 00:12:09.179 }, 00:12:09.179 { 00:12:09.179 "name": "BaseBdev2", 00:12:09.179 "uuid": "f98b1f3c-ac88-4b02-a3ef-b9757da66877", 00:12:09.179 "is_configured": true, 00:12:09.179 "data_offset": 0, 00:12:09.179 "data_size": 65536 00:12:09.179 }, 00:12:09.179 { 00:12:09.179 "name": "BaseBdev3", 00:12:09.179 "uuid": "854aee89-b460-4f73-8789-a82432f4cde0", 00:12:09.179 "is_configured": true, 00:12:09.179 "data_offset": 0, 00:12:09.179 "data_size": 65536 00:12:09.179 
} 00:12:09.179 ] 00:12:09.179 }' 00:12:09.179 15:51:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:09.179 15:51:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:09.747 15:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:10.006 [2024-06-10 15:51:15.402723] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:10.006 15:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:10.006 15:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:10.006 15:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:10.006 15:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:10.006 15:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:10.006 15:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:10.006 15:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:10.006 15:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:10.006 15:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:10.006 15:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:10.006 15:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:10.007 15:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:10.266 15:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:10.266 "name": "Existed_Raid", 00:12:10.266 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:10.266 "strip_size_kb": 64, 00:12:10.266 "state": "configuring", 00:12:10.266 "raid_level": "raid0", 00:12:10.266 "superblock": false, 00:12:10.266 "num_base_bdevs": 3, 00:12:10.266 "num_base_bdevs_discovered": 1, 00:12:10.266 "num_base_bdevs_operational": 3, 00:12:10.266 "base_bdevs_list": [ 00:12:10.266 { 00:12:10.266 "name": "BaseBdev1", 00:12:10.266 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:10.266 "is_configured": false, 00:12:10.266 "data_offset": 0, 00:12:10.266 "data_size": 0 00:12:10.266 }, 00:12:10.266 { 00:12:10.266 "name": null, 00:12:10.266 "uuid": "f98b1f3c-ac88-4b02-a3ef-b9757da66877", 00:12:10.266 "is_configured": false, 00:12:10.266 "data_offset": 0, 00:12:10.266 "data_size": 65536 00:12:10.266 }, 00:12:10.266 { 00:12:10.266 "name": "BaseBdev3", 00:12:10.266 "uuid": "854aee89-b460-4f73-8789-a82432f4cde0", 00:12:10.266 "is_configured": true, 00:12:10.266 "data_offset": 0, 00:12:10.266 "data_size": 65536 00:12:10.266 } 00:12:10.266 ] 00:12:10.266 }' 00:12:10.266 15:51:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:10.266 15:51:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:10.834 15:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:10.834 15:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:11.093 15:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:12:11.093 15:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:11.351 [2024-06-10 15:51:16.709410] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:11.351 BaseBdev1 00:12:11.351 15:51:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:12:11.351 15:51:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:12:11.351 15:51:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:11.351 15:51:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:11.351 15:51:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:11.351 15:51:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:11.351 15:51:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:11.611 15:51:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:11.870 [ 00:12:11.870 { 00:12:11.870 "name": "BaseBdev1", 00:12:11.870 "aliases": [ 00:12:11.870 "324e6a4e-070c-46c2-b794-bcefaaa7d0e8" 00:12:11.870 ], 00:12:11.870 "product_name": "Malloc disk", 00:12:11.870 "block_size": 512, 00:12:11.870 "num_blocks": 65536, 00:12:11.870 "uuid": "324e6a4e-070c-46c2-b794-bcefaaa7d0e8", 00:12:11.870 "assigned_rate_limits": { 00:12:11.870 "rw_ios_per_sec": 0, 00:12:11.870 "rw_mbytes_per_sec": 0, 00:12:11.870 "r_mbytes_per_sec": 0, 00:12:11.870 "w_mbytes_per_sec": 0 00:12:11.870 }, 00:12:11.870 "claimed": true, 00:12:11.870 "claim_type": "exclusive_write", 00:12:11.870 "zoned": 
false, 00:12:11.870 "supported_io_types": { 00:12:11.870 "read": true, 00:12:11.870 "write": true, 00:12:11.870 "unmap": true, 00:12:11.870 "write_zeroes": true, 00:12:11.870 "flush": true, 00:12:11.870 "reset": true, 00:12:11.870 "compare": false, 00:12:11.870 "compare_and_write": false, 00:12:11.870 "abort": true, 00:12:11.870 "nvme_admin": false, 00:12:11.870 "nvme_io": false 00:12:11.870 }, 00:12:11.870 "memory_domains": [ 00:12:11.870 { 00:12:11.870 "dma_device_id": "system", 00:12:11.870 "dma_device_type": 1 00:12:11.870 }, 00:12:11.870 { 00:12:11.870 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:11.870 "dma_device_type": 2 00:12:11.870 } 00:12:11.870 ], 00:12:11.870 "driver_specific": {} 00:12:11.870 } 00:12:11.870 ] 00:12:11.870 15:51:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:11.870 15:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:11.870 15:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:11.870 15:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:11.870 15:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:11.870 15:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:11.870 15:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:11.870 15:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:11.870 15:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:11.870 15:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:11.870 15:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 
00:12:11.870 15:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:11.870 15:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:12.130 15:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:12.130 "name": "Existed_Raid", 00:12:12.130 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:12.130 "strip_size_kb": 64, 00:12:12.130 "state": "configuring", 00:12:12.130 "raid_level": "raid0", 00:12:12.130 "superblock": false, 00:12:12.130 "num_base_bdevs": 3, 00:12:12.130 "num_base_bdevs_discovered": 2, 00:12:12.130 "num_base_bdevs_operational": 3, 00:12:12.130 "base_bdevs_list": [ 00:12:12.130 { 00:12:12.130 "name": "BaseBdev1", 00:12:12.130 "uuid": "324e6a4e-070c-46c2-b794-bcefaaa7d0e8", 00:12:12.130 "is_configured": true, 00:12:12.130 "data_offset": 0, 00:12:12.130 "data_size": 65536 00:12:12.130 }, 00:12:12.130 { 00:12:12.130 "name": null, 00:12:12.130 "uuid": "f98b1f3c-ac88-4b02-a3ef-b9757da66877", 00:12:12.130 "is_configured": false, 00:12:12.130 "data_offset": 0, 00:12:12.130 "data_size": 65536 00:12:12.130 }, 00:12:12.130 { 00:12:12.130 "name": "BaseBdev3", 00:12:12.130 "uuid": "854aee89-b460-4f73-8789-a82432f4cde0", 00:12:12.130 "is_configured": true, 00:12:12.130 "data_offset": 0, 00:12:12.130 "data_size": 65536 00:12:12.130 } 00:12:12.130 ] 00:12:12.130 }' 00:12:12.130 15:51:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:12.130 15:51:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:12.696 15:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:12.696 15:51:18 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:12.955 15:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:12:12.955 15:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:13.214 [2024-06-10 15:51:18.602766] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:13.214 15:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:13.214 15:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:13.214 15:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:13.214 15:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:13.214 15:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:13.214 15:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:13.214 15:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:13.214 15:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:13.214 15:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:13.214 15:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:13.214 15:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:13.214 15:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:12:13.473 15:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:13.473 "name": "Existed_Raid", 00:12:13.473 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:13.473 "strip_size_kb": 64, 00:12:13.473 "state": "configuring", 00:12:13.473 "raid_level": "raid0", 00:12:13.473 "superblock": false, 00:12:13.473 "num_base_bdevs": 3, 00:12:13.473 "num_base_bdevs_discovered": 1, 00:12:13.473 "num_base_bdevs_operational": 3, 00:12:13.473 "base_bdevs_list": [ 00:12:13.473 { 00:12:13.473 "name": "BaseBdev1", 00:12:13.473 "uuid": "324e6a4e-070c-46c2-b794-bcefaaa7d0e8", 00:12:13.473 "is_configured": true, 00:12:13.473 "data_offset": 0, 00:12:13.473 "data_size": 65536 00:12:13.473 }, 00:12:13.473 { 00:12:13.473 "name": null, 00:12:13.473 "uuid": "f98b1f3c-ac88-4b02-a3ef-b9757da66877", 00:12:13.473 "is_configured": false, 00:12:13.473 "data_offset": 0, 00:12:13.473 "data_size": 65536 00:12:13.473 }, 00:12:13.473 { 00:12:13.473 "name": null, 00:12:13.473 "uuid": "854aee89-b460-4f73-8789-a82432f4cde0", 00:12:13.473 "is_configured": false, 00:12:13.473 "data_offset": 0, 00:12:13.473 "data_size": 65536 00:12:13.473 } 00:12:13.473 ] 00:12:13.473 }' 00:12:13.473 15:51:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:13.473 15:51:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:14.040 15:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:14.040 15:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.298 15:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:12:14.298 15:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:14.556 [2024-06-10 15:51:19.858155] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:14.556 15:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:14.556 15:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:14.556 15:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:14.556 15:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:14.556 15:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:14.556 15:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:14.556 15:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:14.556 15:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:14.556 15:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:14.556 15:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:14.556 15:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:14.556 15:51:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:14.814 15:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:14.814 "name": "Existed_Raid", 00:12:14.814 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:14.814 "strip_size_kb": 64, 00:12:14.814 "state": "configuring", 00:12:14.814 "raid_level": "raid0", 
00:12:14.815 "superblock": false, 00:12:14.815 "num_base_bdevs": 3, 00:12:14.815 "num_base_bdevs_discovered": 2, 00:12:14.815 "num_base_bdevs_operational": 3, 00:12:14.815 "base_bdevs_list": [ 00:12:14.815 { 00:12:14.815 "name": "BaseBdev1", 00:12:14.815 "uuid": "324e6a4e-070c-46c2-b794-bcefaaa7d0e8", 00:12:14.815 "is_configured": true, 00:12:14.815 "data_offset": 0, 00:12:14.815 "data_size": 65536 00:12:14.815 }, 00:12:14.815 { 00:12:14.815 "name": null, 00:12:14.815 "uuid": "f98b1f3c-ac88-4b02-a3ef-b9757da66877", 00:12:14.815 "is_configured": false, 00:12:14.815 "data_offset": 0, 00:12:14.815 "data_size": 65536 00:12:14.815 }, 00:12:14.815 { 00:12:14.815 "name": "BaseBdev3", 00:12:14.815 "uuid": "854aee89-b460-4f73-8789-a82432f4cde0", 00:12:14.815 "is_configured": true, 00:12:14.815 "data_offset": 0, 00:12:14.815 "data_size": 65536 00:12:14.815 } 00:12:14.815 ] 00:12:14.815 }' 00:12:14.815 15:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:14.815 15:51:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:15.381 15:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:15.381 15:51:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:15.640 15:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:12:15.640 15:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:15.899 [2024-06-10 15:51:21.249923] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:15.899 15:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 
00:12:15.899 15:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:15.899 15:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:15.899 15:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:15.899 15:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:15.899 15:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:15.899 15:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:15.899 15:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:15.899 15:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:15.899 15:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:15.899 15:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:15.899 15:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:16.191 15:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:16.191 "name": "Existed_Raid", 00:12:16.191 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:16.191 "strip_size_kb": 64, 00:12:16.191 "state": "configuring", 00:12:16.191 "raid_level": "raid0", 00:12:16.191 "superblock": false, 00:12:16.191 "num_base_bdevs": 3, 00:12:16.191 "num_base_bdevs_discovered": 1, 00:12:16.191 "num_base_bdevs_operational": 3, 00:12:16.191 "base_bdevs_list": [ 00:12:16.191 { 00:12:16.191 "name": null, 00:12:16.191 "uuid": "324e6a4e-070c-46c2-b794-bcefaaa7d0e8", 00:12:16.191 "is_configured": false, 
00:12:16.191 "data_offset": 0, 00:12:16.191 "data_size": 65536 00:12:16.191 }, 00:12:16.191 { 00:12:16.191 "name": null, 00:12:16.191 "uuid": "f98b1f3c-ac88-4b02-a3ef-b9757da66877", 00:12:16.191 "is_configured": false, 00:12:16.191 "data_offset": 0, 00:12:16.191 "data_size": 65536 00:12:16.191 }, 00:12:16.191 { 00:12:16.191 "name": "BaseBdev3", 00:12:16.191 "uuid": "854aee89-b460-4f73-8789-a82432f4cde0", 00:12:16.191 "is_configured": true, 00:12:16.191 "data_offset": 0, 00:12:16.191 "data_size": 65536 00:12:16.191 } 00:12:16.191 ] 00:12:16.191 }' 00:12:16.191 15:51:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:16.191 15:51:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:16.760 15:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:16.760 15:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:17.019 15:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:12:17.019 15:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:17.278 [2024-06-10 15:51:22.648256] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:17.278 15:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:17.278 15:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:17.278 15:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:17.278 15:51:22 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:17.278 15:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:17.278 15:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:17.278 15:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:17.278 15:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:17.278 15:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:17.278 15:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:17.278 15:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:17.278 15:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:17.537 15:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:17.537 "name": "Existed_Raid", 00:12:17.537 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:17.537 "strip_size_kb": 64, 00:12:17.537 "state": "configuring", 00:12:17.537 "raid_level": "raid0", 00:12:17.537 "superblock": false, 00:12:17.537 "num_base_bdevs": 3, 00:12:17.537 "num_base_bdevs_discovered": 2, 00:12:17.537 "num_base_bdevs_operational": 3, 00:12:17.537 "base_bdevs_list": [ 00:12:17.537 { 00:12:17.537 "name": null, 00:12:17.537 "uuid": "324e6a4e-070c-46c2-b794-bcefaaa7d0e8", 00:12:17.537 "is_configured": false, 00:12:17.537 "data_offset": 0, 00:12:17.537 "data_size": 65536 00:12:17.537 }, 00:12:17.537 { 00:12:17.537 "name": "BaseBdev2", 00:12:17.537 "uuid": "f98b1f3c-ac88-4b02-a3ef-b9757da66877", 00:12:17.537 "is_configured": true, 00:12:17.537 "data_offset": 0, 00:12:17.537 "data_size": 65536 00:12:17.537 }, 
00:12:17.537 { 00:12:17.537 "name": "BaseBdev3", 00:12:17.537 "uuid": "854aee89-b460-4f73-8789-a82432f4cde0", 00:12:17.537 "is_configured": true, 00:12:17.537 "data_offset": 0, 00:12:17.537 "data_size": 65536 00:12:17.537 } 00:12:17.537 ] 00:12:17.537 }' 00:12:17.537 15:51:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:17.537 15:51:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:18.104 15:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:18.104 15:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:18.363 15:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:12:18.363 15:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:18.363 15:51:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:18.622 15:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 324e6a4e-070c-46c2-b794-bcefaaa7d0e8 00:12:18.881 [2024-06-10 15:51:24.291984] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:18.881 [2024-06-10 15:51:24.292018] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21dc4c0 00:12:18.881 [2024-06-10 15:51:24.292024] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:12:18.881 [2024-06-10 15:51:24.292220] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2384690 00:12:18.881 
[2024-06-10 15:51:24.292348] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21dc4c0 00:12:18.881 [2024-06-10 15:51:24.292356] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x21dc4c0 00:12:18.881 [2024-06-10 15:51:24.292521] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:18.881 NewBaseBdev 00:12:18.881 15:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:12:18.881 15:51:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:12:18.881 15:51:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:18.881 15:51:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:12:18.881 15:51:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:18.881 15:51:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:18.881 15:51:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:19.140 15:51:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:19.399 [ 00:12:19.399 { 00:12:19.399 "name": "NewBaseBdev", 00:12:19.399 "aliases": [ 00:12:19.399 "324e6a4e-070c-46c2-b794-bcefaaa7d0e8" 00:12:19.399 ], 00:12:19.399 "product_name": "Malloc disk", 00:12:19.399 "block_size": 512, 00:12:19.399 "num_blocks": 65536, 00:12:19.399 "uuid": "324e6a4e-070c-46c2-b794-bcefaaa7d0e8", 00:12:19.399 "assigned_rate_limits": { 00:12:19.399 "rw_ios_per_sec": 0, 00:12:19.399 "rw_mbytes_per_sec": 0, 00:12:19.399 "r_mbytes_per_sec": 0, 00:12:19.399 
"w_mbytes_per_sec": 0 00:12:19.399 }, 00:12:19.399 "claimed": true, 00:12:19.399 "claim_type": "exclusive_write", 00:12:19.399 "zoned": false, 00:12:19.399 "supported_io_types": { 00:12:19.399 "read": true, 00:12:19.399 "write": true, 00:12:19.399 "unmap": true, 00:12:19.399 "write_zeroes": true, 00:12:19.399 "flush": true, 00:12:19.399 "reset": true, 00:12:19.399 "compare": false, 00:12:19.399 "compare_and_write": false, 00:12:19.399 "abort": true, 00:12:19.399 "nvme_admin": false, 00:12:19.399 "nvme_io": false 00:12:19.399 }, 00:12:19.399 "memory_domains": [ 00:12:19.399 { 00:12:19.399 "dma_device_id": "system", 00:12:19.399 "dma_device_type": 1 00:12:19.399 }, 00:12:19.399 { 00:12:19.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:19.399 "dma_device_type": 2 00:12:19.399 } 00:12:19.399 ], 00:12:19.399 "driver_specific": {} 00:12:19.399 } 00:12:19.399 ] 00:12:19.399 15:51:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:12:19.399 15:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:19.399 15:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:19.399 15:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:19.399 15:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:19.400 15:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:19.400 15:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:19.400 15:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:19.400 15:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:19.400 15:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:12:19.400 15:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:19.400 15:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:19.400 15:51:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:19.658 15:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:19.658 "name": "Existed_Raid", 00:12:19.658 "uuid": "3123bbaa-51f5-461d-b749-152e75103a95", 00:12:19.658 "strip_size_kb": 64, 00:12:19.658 "state": "online", 00:12:19.658 "raid_level": "raid0", 00:12:19.658 "superblock": false, 00:12:19.658 "num_base_bdevs": 3, 00:12:19.658 "num_base_bdevs_discovered": 3, 00:12:19.658 "num_base_bdevs_operational": 3, 00:12:19.658 "base_bdevs_list": [ 00:12:19.658 { 00:12:19.658 "name": "NewBaseBdev", 00:12:19.658 "uuid": "324e6a4e-070c-46c2-b794-bcefaaa7d0e8", 00:12:19.658 "is_configured": true, 00:12:19.658 "data_offset": 0, 00:12:19.658 "data_size": 65536 00:12:19.658 }, 00:12:19.658 { 00:12:19.658 "name": "BaseBdev2", 00:12:19.658 "uuid": "f98b1f3c-ac88-4b02-a3ef-b9757da66877", 00:12:19.658 "is_configured": true, 00:12:19.658 "data_offset": 0, 00:12:19.658 "data_size": 65536 00:12:19.658 }, 00:12:19.658 { 00:12:19.658 "name": "BaseBdev3", 00:12:19.658 "uuid": "854aee89-b460-4f73-8789-a82432f4cde0", 00:12:19.658 "is_configured": true, 00:12:19.658 "data_offset": 0, 00:12:19.658 "data_size": 65536 00:12:19.658 } 00:12:19.658 ] 00:12:19.658 }' 00:12:19.658 15:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:19.658 15:51:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:20.225 15:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 
00:12:20.225 15:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:20.225 15:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:20.225 15:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:20.226 15:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:20.226 15:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:20.226 15:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:20.226 15:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:20.485 [2024-06-10 15:51:25.952716] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:20.485 15:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:20.485 "name": "Existed_Raid", 00:12:20.485 "aliases": [ 00:12:20.485 "3123bbaa-51f5-461d-b749-152e75103a95" 00:12:20.485 ], 00:12:20.485 "product_name": "Raid Volume", 00:12:20.485 "block_size": 512, 00:12:20.485 "num_blocks": 196608, 00:12:20.485 "uuid": "3123bbaa-51f5-461d-b749-152e75103a95", 00:12:20.485 "assigned_rate_limits": { 00:12:20.485 "rw_ios_per_sec": 0, 00:12:20.485 "rw_mbytes_per_sec": 0, 00:12:20.485 "r_mbytes_per_sec": 0, 00:12:20.485 "w_mbytes_per_sec": 0 00:12:20.485 }, 00:12:20.485 "claimed": false, 00:12:20.485 "zoned": false, 00:12:20.485 "supported_io_types": { 00:12:20.485 "read": true, 00:12:20.485 "write": true, 00:12:20.485 "unmap": true, 00:12:20.485 "write_zeroes": true, 00:12:20.485 "flush": true, 00:12:20.485 "reset": true, 00:12:20.485 "compare": false, 00:12:20.485 "compare_and_write": false, 00:12:20.485 "abort": false, 00:12:20.485 "nvme_admin": false, 00:12:20.485 
"nvme_io": false 00:12:20.485 }, 00:12:20.485 "memory_domains": [ 00:12:20.485 { 00:12:20.485 "dma_device_id": "system", 00:12:20.485 "dma_device_type": 1 00:12:20.485 }, 00:12:20.485 { 00:12:20.485 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:20.485 "dma_device_type": 2 00:12:20.485 }, 00:12:20.485 { 00:12:20.485 "dma_device_id": "system", 00:12:20.485 "dma_device_type": 1 00:12:20.485 }, 00:12:20.485 { 00:12:20.485 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:20.485 "dma_device_type": 2 00:12:20.485 }, 00:12:20.485 { 00:12:20.485 "dma_device_id": "system", 00:12:20.485 "dma_device_type": 1 00:12:20.485 }, 00:12:20.485 { 00:12:20.485 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:20.485 "dma_device_type": 2 00:12:20.485 } 00:12:20.485 ], 00:12:20.485 "driver_specific": { 00:12:20.485 "raid": { 00:12:20.485 "uuid": "3123bbaa-51f5-461d-b749-152e75103a95", 00:12:20.485 "strip_size_kb": 64, 00:12:20.485 "state": "online", 00:12:20.485 "raid_level": "raid0", 00:12:20.485 "superblock": false, 00:12:20.485 "num_base_bdevs": 3, 00:12:20.485 "num_base_bdevs_discovered": 3, 00:12:20.485 "num_base_bdevs_operational": 3, 00:12:20.485 "base_bdevs_list": [ 00:12:20.485 { 00:12:20.485 "name": "NewBaseBdev", 00:12:20.485 "uuid": "324e6a4e-070c-46c2-b794-bcefaaa7d0e8", 00:12:20.485 "is_configured": true, 00:12:20.485 "data_offset": 0, 00:12:20.485 "data_size": 65536 00:12:20.485 }, 00:12:20.485 { 00:12:20.485 "name": "BaseBdev2", 00:12:20.485 "uuid": "f98b1f3c-ac88-4b02-a3ef-b9757da66877", 00:12:20.485 "is_configured": true, 00:12:20.485 "data_offset": 0, 00:12:20.485 "data_size": 65536 00:12:20.485 }, 00:12:20.485 { 00:12:20.485 "name": "BaseBdev3", 00:12:20.485 "uuid": "854aee89-b460-4f73-8789-a82432f4cde0", 00:12:20.485 "is_configured": true, 00:12:20.485 "data_offset": 0, 00:12:20.485 "data_size": 65536 00:12:20.485 } 00:12:20.485 ] 00:12:20.485 } 00:12:20.485 } 00:12:20.485 }' 00:12:20.485 15:51:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r 
'.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:20.744 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:12:20.744 BaseBdev2 00:12:20.744 BaseBdev3' 00:12:20.744 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:20.744 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:20.744 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:21.004 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:21.004 "name": "NewBaseBdev", 00:12:21.004 "aliases": [ 00:12:21.004 "324e6a4e-070c-46c2-b794-bcefaaa7d0e8" 00:12:21.004 ], 00:12:21.004 "product_name": "Malloc disk", 00:12:21.004 "block_size": 512, 00:12:21.004 "num_blocks": 65536, 00:12:21.004 "uuid": "324e6a4e-070c-46c2-b794-bcefaaa7d0e8", 00:12:21.004 "assigned_rate_limits": { 00:12:21.004 "rw_ios_per_sec": 0, 00:12:21.004 "rw_mbytes_per_sec": 0, 00:12:21.004 "r_mbytes_per_sec": 0, 00:12:21.004 "w_mbytes_per_sec": 0 00:12:21.004 }, 00:12:21.004 "claimed": true, 00:12:21.004 "claim_type": "exclusive_write", 00:12:21.004 "zoned": false, 00:12:21.004 "supported_io_types": { 00:12:21.004 "read": true, 00:12:21.004 "write": true, 00:12:21.004 "unmap": true, 00:12:21.004 "write_zeroes": true, 00:12:21.004 "flush": true, 00:12:21.004 "reset": true, 00:12:21.004 "compare": false, 00:12:21.004 "compare_and_write": false, 00:12:21.004 "abort": true, 00:12:21.004 "nvme_admin": false, 00:12:21.004 "nvme_io": false 00:12:21.004 }, 00:12:21.004 "memory_domains": [ 00:12:21.004 { 00:12:21.004 "dma_device_id": "system", 00:12:21.004 "dma_device_type": 1 00:12:21.004 }, 00:12:21.004 { 00:12:21.004 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:21.004 
"dma_device_type": 2 00:12:21.004 } 00:12:21.004 ], 00:12:21.004 "driver_specific": {} 00:12:21.004 }' 00:12:21.004 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:21.004 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:21.004 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:21.004 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:21.004 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:21.004 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:21.004 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:21.004 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:21.263 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:21.263 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:21.263 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:21.263 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:21.263 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:21.263 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:21.263 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:21.521 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:21.521 "name": "BaseBdev2", 00:12:21.521 "aliases": [ 00:12:21.521 "f98b1f3c-ac88-4b02-a3ef-b9757da66877" 00:12:21.521 ], 00:12:21.521 
"product_name": "Malloc disk", 00:12:21.521 "block_size": 512, 00:12:21.521 "num_blocks": 65536, 00:12:21.521 "uuid": "f98b1f3c-ac88-4b02-a3ef-b9757da66877", 00:12:21.521 "assigned_rate_limits": { 00:12:21.521 "rw_ios_per_sec": 0, 00:12:21.521 "rw_mbytes_per_sec": 0, 00:12:21.521 "r_mbytes_per_sec": 0, 00:12:21.521 "w_mbytes_per_sec": 0 00:12:21.521 }, 00:12:21.521 "claimed": true, 00:12:21.521 "claim_type": "exclusive_write", 00:12:21.521 "zoned": false, 00:12:21.521 "supported_io_types": { 00:12:21.521 "read": true, 00:12:21.521 "write": true, 00:12:21.521 "unmap": true, 00:12:21.521 "write_zeroes": true, 00:12:21.521 "flush": true, 00:12:21.521 "reset": true, 00:12:21.521 "compare": false, 00:12:21.521 "compare_and_write": false, 00:12:21.521 "abort": true, 00:12:21.521 "nvme_admin": false, 00:12:21.521 "nvme_io": false 00:12:21.521 }, 00:12:21.522 "memory_domains": [ 00:12:21.522 { 00:12:21.522 "dma_device_id": "system", 00:12:21.522 "dma_device_type": 1 00:12:21.522 }, 00:12:21.522 { 00:12:21.522 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:21.522 "dma_device_type": 2 00:12:21.522 } 00:12:21.522 ], 00:12:21.522 "driver_specific": {} 00:12:21.522 }' 00:12:21.522 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:21.522 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:21.522 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:21.522 15:51:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:21.522 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:21.780 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:21.780 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:21.780 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:12:21.780 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:21.780 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:21.780 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:21.780 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:21.780 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:21.780 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:21.780 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:22.039 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:22.039 "name": "BaseBdev3", 00:12:22.039 "aliases": [ 00:12:22.039 "854aee89-b460-4f73-8789-a82432f4cde0" 00:12:22.039 ], 00:12:22.039 "product_name": "Malloc disk", 00:12:22.039 "block_size": 512, 00:12:22.039 "num_blocks": 65536, 00:12:22.039 "uuid": "854aee89-b460-4f73-8789-a82432f4cde0", 00:12:22.039 "assigned_rate_limits": { 00:12:22.039 "rw_ios_per_sec": 0, 00:12:22.039 "rw_mbytes_per_sec": 0, 00:12:22.039 "r_mbytes_per_sec": 0, 00:12:22.039 "w_mbytes_per_sec": 0 00:12:22.039 }, 00:12:22.039 "claimed": true, 00:12:22.039 "claim_type": "exclusive_write", 00:12:22.039 "zoned": false, 00:12:22.039 "supported_io_types": { 00:12:22.039 "read": true, 00:12:22.039 "write": true, 00:12:22.039 "unmap": true, 00:12:22.039 "write_zeroes": true, 00:12:22.039 "flush": true, 00:12:22.039 "reset": true, 00:12:22.039 "compare": false, 00:12:22.039 "compare_and_write": false, 00:12:22.039 "abort": true, 00:12:22.039 "nvme_admin": false, 00:12:22.039 "nvme_io": false 00:12:22.039 }, 00:12:22.039 "memory_domains": [ 00:12:22.039 { 00:12:22.039 
"dma_device_id": "system", 00:12:22.039 "dma_device_type": 1 00:12:22.039 }, 00:12:22.039 { 00:12:22.039 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:22.039 "dma_device_type": 2 00:12:22.039 } 00:12:22.039 ], 00:12:22.039 "driver_specific": {} 00:12:22.039 }' 00:12:22.039 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:22.298 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:22.298 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:22.298 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:22.298 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:22.298 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:22.298 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:22.298 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:22.557 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:22.557 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:22.557 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:22.557 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:22.557 15:51:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:22.816 [2024-06-10 15:51:28.138286] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:22.816 [2024-06-10 15:51:28.138308] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:22.816 [2024-06-10 
15:51:28.138357] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:22.816 [2024-06-10 15:51:28.138406] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:22.816 [2024-06-10 15:51:28.138415] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21dc4c0 name Existed_Raid, state offline 00:12:22.816 15:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2656326 00:12:22.816 15:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 2656326 ']' 00:12:22.816 15:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 2656326 00:12:22.816 15:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:12:22.816 15:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:12:22.816 15:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2656326 00:12:22.816 15:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:12:22.816 15:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:12:22.816 15:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2656326' 00:12:22.816 killing process with pid 2656326 00:12:22.816 15:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 2656326 00:12:22.816 [2024-06-10 15:51:28.201268] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:22.816 15:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 2656326 00:12:22.816 [2024-06-10 15:51:28.225619] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:23.075 15:51:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- 
# return 0 00:12:23.075 00:12:23.075 real 0m29.199s 00:12:23.075 user 0m54.797s 00:12:23.075 sys 0m4.038s 00:12:23.075 15:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:12:23.075 15:51:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:12:23.075 ************************************ 00:12:23.075 END TEST raid_state_function_test 00:12:23.075 ************************************ 00:12:23.075 15:51:28 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 3 true 00:12:23.075 15:51:28 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:12:23.075 15:51:28 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:12:23.075 15:51:28 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:23.075 ************************************ 00:12:23.075 START TEST raid_state_function_test_sb 00:12:23.075 ************************************ 00:12:23.075 15:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 3 true 00:12:23.075 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:12:23.075 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:12:23.075 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:12:23.075 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:12:23.075 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( 
i++ )) 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2662281 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2662281' 00:12:23.076 Process raid pid: 2662281 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2662281 /var/tmp/spdk-raid.sock 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 2662281 ']' 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:23.076 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:12:23.076 15:51:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:23.076 [2024-06-10 15:51:28.555245] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:12:23.076 [2024-06-10 15:51:28.555297] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:12:23.335 [2024-06-10 15:51:28.657785] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:23.335 [2024-06-10 15:51:28.752982] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:12:23.335 [2024-06-10 15:51:28.812061] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:23.335 [2024-06-10 15:51:28.812093] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:24.272 15:51:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:12:24.272 15:51:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:12:24.272 15:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:24.272 [2024-06-10 15:51:29.747117] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:24.272 [2024-06-10 15:51:29.747156] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:24.272 [2024-06-10 15:51:29.747165] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:24.272 [2024-06-10 15:51:29.747174] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:24.272 [2024-06-10 15:51:29.747181] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:24.272 [2024-06-10 15:51:29.747190] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:24.272 15:51:29 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:24.272 15:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:24.272 15:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:24.272 15:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:24.272 15:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:24.272 15:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:24.272 15:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:24.272 15:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:24.272 15:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:24.272 15:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:24.272 15:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:24.272 15:51:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:24.531 15:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:24.531 "name": "Existed_Raid", 00:12:24.531 "uuid": "28b58ef3-cc5c-4527-9933-c278bbbb148b", 00:12:24.531 "strip_size_kb": 64, 00:12:24.531 "state": "configuring", 00:12:24.531 "raid_level": "raid0", 00:12:24.531 "superblock": true, 00:12:24.531 "num_base_bdevs": 3, 00:12:24.531 "num_base_bdevs_discovered": 0, 00:12:24.531 "num_base_bdevs_operational": 3, 00:12:24.531 
"base_bdevs_list": [ 00:12:24.531 { 00:12:24.531 "name": "BaseBdev1", 00:12:24.531 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:24.531 "is_configured": false, 00:12:24.531 "data_offset": 0, 00:12:24.531 "data_size": 0 00:12:24.531 }, 00:12:24.531 { 00:12:24.531 "name": "BaseBdev2", 00:12:24.531 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:24.531 "is_configured": false, 00:12:24.531 "data_offset": 0, 00:12:24.531 "data_size": 0 00:12:24.531 }, 00:12:24.531 { 00:12:24.531 "name": "BaseBdev3", 00:12:24.531 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:24.531 "is_configured": false, 00:12:24.531 "data_offset": 0, 00:12:24.531 "data_size": 0 00:12:24.531 } 00:12:24.531 ] 00:12:24.531 }' 00:12:24.531 15:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:24.531 15:51:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:25.468 15:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:25.468 [2024-06-10 15:51:30.894026] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:25.468 [2024-06-10 15:51:30.894057] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b25120 name Existed_Raid, state configuring 00:12:25.468 15:51:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:25.727 [2024-06-10 15:51:31.150724] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:25.727 [2024-06-10 15:51:31.150748] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:25.727 [2024-06-10 15:51:31.150756] 
bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:25.727 [2024-06-10 15:51:31.150764] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:25.727 [2024-06-10 15:51:31.150771] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:25.727 [2024-06-10 15:51:31.150779] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:25.727 15:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:25.986 [2024-06-10 15:51:31.421026] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:25.986 BaseBdev1 00:12:25.986 15:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:12:25.986 15:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:12:25.986 15:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:25.986 15:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:12:25.986 15:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:25.986 15:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:25.986 15:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:26.245 15:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:26.504 [ 00:12:26.504 { 
00:12:26.504 "name": "BaseBdev1", 00:12:26.504 "aliases": [ 00:12:26.504 "b4b469ac-e337-4bbd-8f09-22caaf2681b2" 00:12:26.504 ], 00:12:26.504 "product_name": "Malloc disk", 00:12:26.504 "block_size": 512, 00:12:26.504 "num_blocks": 65536, 00:12:26.504 "uuid": "b4b469ac-e337-4bbd-8f09-22caaf2681b2", 00:12:26.504 "assigned_rate_limits": { 00:12:26.504 "rw_ios_per_sec": 0, 00:12:26.504 "rw_mbytes_per_sec": 0, 00:12:26.504 "r_mbytes_per_sec": 0, 00:12:26.504 "w_mbytes_per_sec": 0 00:12:26.504 }, 00:12:26.504 "claimed": true, 00:12:26.504 "claim_type": "exclusive_write", 00:12:26.504 "zoned": false, 00:12:26.504 "supported_io_types": { 00:12:26.504 "read": true, 00:12:26.504 "write": true, 00:12:26.504 "unmap": true, 00:12:26.504 "write_zeroes": true, 00:12:26.504 "flush": true, 00:12:26.504 "reset": true, 00:12:26.504 "compare": false, 00:12:26.504 "compare_and_write": false, 00:12:26.504 "abort": true, 00:12:26.504 "nvme_admin": false, 00:12:26.504 "nvme_io": false 00:12:26.504 }, 00:12:26.504 "memory_domains": [ 00:12:26.504 { 00:12:26.504 "dma_device_id": "system", 00:12:26.504 "dma_device_type": 1 00:12:26.504 }, 00:12:26.504 { 00:12:26.504 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:26.504 "dma_device_type": 2 00:12:26.504 } 00:12:26.504 ], 00:12:26.504 "driver_specific": {} 00:12:26.504 } 00:12:26.504 ] 00:12:26.504 15:51:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:12:26.504 15:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:26.504 15:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:26.504 15:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:26.504 15:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:26.504 15:51:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:26.504 15:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:26.504 15:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:26.504 15:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:26.504 15:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:26.504 15:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:26.504 15:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:26.504 15:51:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:26.770 15:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:26.770 "name": "Existed_Raid", 00:12:26.770 "uuid": "c5e48e2e-2dfd-4788-bd7d-36c2c533d275", 00:12:26.770 "strip_size_kb": 64, 00:12:26.770 "state": "configuring", 00:12:26.770 "raid_level": "raid0", 00:12:26.770 "superblock": true, 00:12:26.770 "num_base_bdevs": 3, 00:12:26.770 "num_base_bdevs_discovered": 1, 00:12:26.770 "num_base_bdevs_operational": 3, 00:12:26.770 "base_bdevs_list": [ 00:12:26.770 { 00:12:26.770 "name": "BaseBdev1", 00:12:26.770 "uuid": "b4b469ac-e337-4bbd-8f09-22caaf2681b2", 00:12:26.770 "is_configured": true, 00:12:26.770 "data_offset": 2048, 00:12:26.770 "data_size": 63488 00:12:26.770 }, 00:12:26.770 { 00:12:26.770 "name": "BaseBdev2", 00:12:26.770 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:26.771 "is_configured": false, 00:12:26.771 "data_offset": 0, 00:12:26.771 "data_size": 0 00:12:26.771 }, 00:12:26.771 { 00:12:26.771 "name": 
"BaseBdev3", 00:12:26.771 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:26.771 "is_configured": false, 00:12:26.771 "data_offset": 0, 00:12:26.771 "data_size": 0 00:12:26.771 } 00:12:26.771 ] 00:12:26.771 }' 00:12:26.771 15:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:26.771 15:51:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:27.707 15:51:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:27.707 [2024-06-10 15:51:33.093661] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:27.707 [2024-06-10 15:51:33.093699] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b249b0 name Existed_Raid, state configuring 00:12:27.707 15:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:27.966 [2024-06-10 15:51:33.346377] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:27.966 [2024-06-10 15:51:33.347921] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:12:27.966 [2024-06-10 15:51:33.347953] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:12:27.966 [2024-06-10 15:51:33.347974] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:12:27.966 [2024-06-10 15:51:33.347987] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:12:27.966 15:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:12:27.966 15:51:33 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:27.966 15:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:27.966 15:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:27.966 15:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:27.966 15:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:27.966 15:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:27.966 15:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:27.966 15:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:27.966 15:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:27.966 15:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:27.966 15:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:27.967 15:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:27.967 15:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:28.225 15:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:28.225 "name": "Existed_Raid", 00:12:28.225 "uuid": "5f4a397e-43c4-43e6-a8f9-c34c52556432", 00:12:28.225 "strip_size_kb": 64, 00:12:28.225 "state": "configuring", 00:12:28.225 "raid_level": "raid0", 00:12:28.225 "superblock": true, 00:12:28.225 "num_base_bdevs": 3, 00:12:28.225 
"num_base_bdevs_discovered": 1, 00:12:28.225 "num_base_bdevs_operational": 3, 00:12:28.225 "base_bdevs_list": [ 00:12:28.225 { 00:12:28.225 "name": "BaseBdev1", 00:12:28.225 "uuid": "b4b469ac-e337-4bbd-8f09-22caaf2681b2", 00:12:28.225 "is_configured": true, 00:12:28.225 "data_offset": 2048, 00:12:28.225 "data_size": 63488 00:12:28.225 }, 00:12:28.225 { 00:12:28.225 "name": "BaseBdev2", 00:12:28.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:28.225 "is_configured": false, 00:12:28.225 "data_offset": 0, 00:12:28.225 "data_size": 0 00:12:28.225 }, 00:12:28.225 { 00:12:28.225 "name": "BaseBdev3", 00:12:28.225 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:28.225 "is_configured": false, 00:12:28.225 "data_offset": 0, 00:12:28.225 "data_size": 0 00:12:28.225 } 00:12:28.225 ] 00:12:28.225 }' 00:12:28.225 15:51:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:28.225 15:51:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:28.794 15:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:29.053 [2024-06-10 15:51:34.488559] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:29.053 BaseBdev2 00:12:29.053 15:51:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:12:29.053 15:51:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:12:29.053 15:51:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:29.053 15:51:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:12:29.053 15:51:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:29.053 15:51:34 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:29.053 15:51:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:29.312 15:51:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:29.571 [ 00:12:29.571 { 00:12:29.571 "name": "BaseBdev2", 00:12:29.571 "aliases": [ 00:12:29.571 "0d77c7b4-94b7-4802-9b76-378b3151d755" 00:12:29.571 ], 00:12:29.571 "product_name": "Malloc disk", 00:12:29.571 "block_size": 512, 00:12:29.571 "num_blocks": 65536, 00:12:29.571 "uuid": "0d77c7b4-94b7-4802-9b76-378b3151d755", 00:12:29.571 "assigned_rate_limits": { 00:12:29.571 "rw_ios_per_sec": 0, 00:12:29.571 "rw_mbytes_per_sec": 0, 00:12:29.571 "r_mbytes_per_sec": 0, 00:12:29.571 "w_mbytes_per_sec": 0 00:12:29.571 }, 00:12:29.571 "claimed": true, 00:12:29.571 "claim_type": "exclusive_write", 00:12:29.571 "zoned": false, 00:12:29.571 "supported_io_types": { 00:12:29.571 "read": true, 00:12:29.572 "write": true, 00:12:29.572 "unmap": true, 00:12:29.572 "write_zeroes": true, 00:12:29.572 "flush": true, 00:12:29.572 "reset": true, 00:12:29.572 "compare": false, 00:12:29.572 "compare_and_write": false, 00:12:29.572 "abort": true, 00:12:29.572 "nvme_admin": false, 00:12:29.572 "nvme_io": false 00:12:29.572 }, 00:12:29.572 "memory_domains": [ 00:12:29.572 { 00:12:29.572 "dma_device_id": "system", 00:12:29.572 "dma_device_type": 1 00:12:29.572 }, 00:12:29.572 { 00:12:29.572 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:29.572 "dma_device_type": 2 00:12:29.572 } 00:12:29.572 ], 00:12:29.572 "driver_specific": {} 00:12:29.572 } 00:12:29.572 ] 00:12:29.572 15:51:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 
00:12:29.572 15:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:29.572 15:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:29.572 15:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:29.572 15:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:29.572 15:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:29.572 15:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:29.572 15:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:29.572 15:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:29.572 15:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:29.572 15:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:29.572 15:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:29.572 15:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:29.572 15:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:29.572 15:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:29.831 15:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:29.831 "name": "Existed_Raid", 00:12:29.831 "uuid": "5f4a397e-43c4-43e6-a8f9-c34c52556432", 00:12:29.831 "strip_size_kb": 64, 00:12:29.831 
"state": "configuring", 00:12:29.831 "raid_level": "raid0", 00:12:29.831 "superblock": true, 00:12:29.831 "num_base_bdevs": 3, 00:12:29.831 "num_base_bdevs_discovered": 2, 00:12:29.831 "num_base_bdevs_operational": 3, 00:12:29.831 "base_bdevs_list": [ 00:12:29.831 { 00:12:29.831 "name": "BaseBdev1", 00:12:29.831 "uuid": "b4b469ac-e337-4bbd-8f09-22caaf2681b2", 00:12:29.831 "is_configured": true, 00:12:29.831 "data_offset": 2048, 00:12:29.831 "data_size": 63488 00:12:29.831 }, 00:12:29.831 { 00:12:29.831 "name": "BaseBdev2", 00:12:29.831 "uuid": "0d77c7b4-94b7-4802-9b76-378b3151d755", 00:12:29.831 "is_configured": true, 00:12:29.831 "data_offset": 2048, 00:12:29.831 "data_size": 63488 00:12:29.831 }, 00:12:29.831 { 00:12:29.831 "name": "BaseBdev3", 00:12:29.831 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:29.831 "is_configured": false, 00:12:29.831 "data_offset": 0, 00:12:29.831 "data_size": 0 00:12:29.831 } 00:12:29.831 ] 00:12:29.831 }' 00:12:29.831 15:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:29.831 15:51:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:30.398 15:51:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:30.694 [2024-06-10 15:51:36.128280] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:30.694 [2024-06-10 15:51:36.128432] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b258c0 00:12:30.694 [2024-06-10 15:51:36.128445] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:30.694 [2024-06-10 15:51:36.128637] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b3c830 00:12:30.694 [2024-06-10 15:51:36.128759] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 
0x1b258c0 00:12:30.694 [2024-06-10 15:51:36.128767] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b258c0 00:12:30.694 [2024-06-10 15:51:36.128860] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:30.694 BaseBdev3 00:12:30.694 15:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:12:30.694 15:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:12:30.694 15:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:30.694 15:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:12:30.694 15:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:30.694 15:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:30.694 15:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:30.953 15:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:31.212 [ 00:12:31.212 { 00:12:31.212 "name": "BaseBdev3", 00:12:31.212 "aliases": [ 00:12:31.212 "784c030f-22c2-49f0-bce7-ec8b984cd01c" 00:12:31.212 ], 00:12:31.212 "product_name": "Malloc disk", 00:12:31.212 "block_size": 512, 00:12:31.212 "num_blocks": 65536, 00:12:31.212 "uuid": "784c030f-22c2-49f0-bce7-ec8b984cd01c", 00:12:31.212 "assigned_rate_limits": { 00:12:31.212 "rw_ios_per_sec": 0, 00:12:31.212 "rw_mbytes_per_sec": 0, 00:12:31.212 "r_mbytes_per_sec": 0, 00:12:31.212 "w_mbytes_per_sec": 0 00:12:31.212 }, 00:12:31.212 "claimed": true, 00:12:31.212 
"claim_type": "exclusive_write", 00:12:31.212 "zoned": false, 00:12:31.212 "supported_io_types": { 00:12:31.212 "read": true, 00:12:31.212 "write": true, 00:12:31.212 "unmap": true, 00:12:31.212 "write_zeroes": true, 00:12:31.212 "flush": true, 00:12:31.212 "reset": true, 00:12:31.212 "compare": false, 00:12:31.212 "compare_and_write": false, 00:12:31.212 "abort": true, 00:12:31.212 "nvme_admin": false, 00:12:31.212 "nvme_io": false 00:12:31.212 }, 00:12:31.212 "memory_domains": [ 00:12:31.212 { 00:12:31.212 "dma_device_id": "system", 00:12:31.212 "dma_device_type": 1 00:12:31.212 }, 00:12:31.212 { 00:12:31.212 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:31.212 "dma_device_type": 2 00:12:31.212 } 00:12:31.212 ], 00:12:31.212 "driver_specific": {} 00:12:31.212 } 00:12:31.212 ] 00:12:31.212 15:51:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:12:31.212 15:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:12:31.212 15:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:12:31.212 15:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:31.213 15:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:31.213 15:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:31.213 15:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:31.213 15:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:31.213 15:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:31.213 15:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:31.213 15:51:36 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:31.213 15:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:31.213 15:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:31.213 15:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:31.213 15:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:31.472 15:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:31.472 "name": "Existed_Raid", 00:12:31.472 "uuid": "5f4a397e-43c4-43e6-a8f9-c34c52556432", 00:12:31.472 "strip_size_kb": 64, 00:12:31.472 "state": "online", 00:12:31.472 "raid_level": "raid0", 00:12:31.472 "superblock": true, 00:12:31.472 "num_base_bdevs": 3, 00:12:31.472 "num_base_bdevs_discovered": 3, 00:12:31.472 "num_base_bdevs_operational": 3, 00:12:31.472 "base_bdevs_list": [ 00:12:31.472 { 00:12:31.472 "name": "BaseBdev1", 00:12:31.472 "uuid": "b4b469ac-e337-4bbd-8f09-22caaf2681b2", 00:12:31.472 "is_configured": true, 00:12:31.472 "data_offset": 2048, 00:12:31.472 "data_size": 63488 00:12:31.472 }, 00:12:31.472 { 00:12:31.472 "name": "BaseBdev2", 00:12:31.472 "uuid": "0d77c7b4-94b7-4802-9b76-378b3151d755", 00:12:31.472 "is_configured": true, 00:12:31.472 "data_offset": 2048, 00:12:31.472 "data_size": 63488 00:12:31.472 }, 00:12:31.472 { 00:12:31.472 "name": "BaseBdev3", 00:12:31.472 "uuid": "784c030f-22c2-49f0-bce7-ec8b984cd01c", 00:12:31.472 "is_configured": true, 00:12:31.472 "data_offset": 2048, 00:12:31.472 "data_size": 63488 00:12:31.472 } 00:12:31.472 ] 00:12:31.472 }' 00:12:31.472 15:51:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:31.472 15:51:36 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:32.039 15:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:12:32.039 15:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:32.039 15:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:32.039 15:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:32.039 15:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:32.039 15:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:32.039 15:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:32.039 15:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:32.297 [2024-06-10 15:51:37.768999] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:32.297 15:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:32.297 "name": "Existed_Raid", 00:12:32.297 "aliases": [ 00:12:32.297 "5f4a397e-43c4-43e6-a8f9-c34c52556432" 00:12:32.297 ], 00:12:32.297 "product_name": "Raid Volume", 00:12:32.297 "block_size": 512, 00:12:32.297 "num_blocks": 190464, 00:12:32.298 "uuid": "5f4a397e-43c4-43e6-a8f9-c34c52556432", 00:12:32.298 "assigned_rate_limits": { 00:12:32.298 "rw_ios_per_sec": 0, 00:12:32.298 "rw_mbytes_per_sec": 0, 00:12:32.298 "r_mbytes_per_sec": 0, 00:12:32.298 "w_mbytes_per_sec": 0 00:12:32.298 }, 00:12:32.298 "claimed": false, 00:12:32.298 "zoned": false, 00:12:32.298 "supported_io_types": { 00:12:32.298 "read": true, 00:12:32.298 "write": true, 00:12:32.298 "unmap": true, 
00:12:32.298 "write_zeroes": true, 00:12:32.298 "flush": true, 00:12:32.298 "reset": true, 00:12:32.298 "compare": false, 00:12:32.298 "compare_and_write": false, 00:12:32.298 "abort": false, 00:12:32.298 "nvme_admin": false, 00:12:32.298 "nvme_io": false 00:12:32.298 }, 00:12:32.298 "memory_domains": [ 00:12:32.298 { 00:12:32.298 "dma_device_id": "system", 00:12:32.298 "dma_device_type": 1 00:12:32.298 }, 00:12:32.298 { 00:12:32.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:32.298 "dma_device_type": 2 00:12:32.298 }, 00:12:32.298 { 00:12:32.298 "dma_device_id": "system", 00:12:32.298 "dma_device_type": 1 00:12:32.298 }, 00:12:32.298 { 00:12:32.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:32.298 "dma_device_type": 2 00:12:32.298 }, 00:12:32.298 { 00:12:32.298 "dma_device_id": "system", 00:12:32.298 "dma_device_type": 1 00:12:32.298 }, 00:12:32.298 { 00:12:32.298 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:32.298 "dma_device_type": 2 00:12:32.298 } 00:12:32.298 ], 00:12:32.298 "driver_specific": { 00:12:32.298 "raid": { 00:12:32.298 "uuid": "5f4a397e-43c4-43e6-a8f9-c34c52556432", 00:12:32.298 "strip_size_kb": 64, 00:12:32.298 "state": "online", 00:12:32.298 "raid_level": "raid0", 00:12:32.298 "superblock": true, 00:12:32.298 "num_base_bdevs": 3, 00:12:32.298 "num_base_bdevs_discovered": 3, 00:12:32.298 "num_base_bdevs_operational": 3, 00:12:32.298 "base_bdevs_list": [ 00:12:32.298 { 00:12:32.298 "name": "BaseBdev1", 00:12:32.298 "uuid": "b4b469ac-e337-4bbd-8f09-22caaf2681b2", 00:12:32.298 "is_configured": true, 00:12:32.298 "data_offset": 2048, 00:12:32.298 "data_size": 63488 00:12:32.298 }, 00:12:32.298 { 00:12:32.298 "name": "BaseBdev2", 00:12:32.298 "uuid": "0d77c7b4-94b7-4802-9b76-378b3151d755", 00:12:32.298 "is_configured": true, 00:12:32.298 "data_offset": 2048, 00:12:32.298 "data_size": 63488 00:12:32.298 }, 00:12:32.298 { 00:12:32.298 "name": "BaseBdev3", 00:12:32.298 "uuid": "784c030f-22c2-49f0-bce7-ec8b984cd01c", 00:12:32.298 
"is_configured": true, 00:12:32.298 "data_offset": 2048, 00:12:32.298 "data_size": 63488 00:12:32.298 } 00:12:32.298 ] 00:12:32.298 } 00:12:32.298 } 00:12:32.298 }' 00:12:32.298 15:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:32.556 15:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:12:32.556 BaseBdev2 00:12:32.556 BaseBdev3' 00:12:32.556 15:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:32.556 15:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:12:32.556 15:51:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:32.815 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:32.815 "name": "BaseBdev1", 00:12:32.815 "aliases": [ 00:12:32.815 "b4b469ac-e337-4bbd-8f09-22caaf2681b2" 00:12:32.815 ], 00:12:32.815 "product_name": "Malloc disk", 00:12:32.815 "block_size": 512, 00:12:32.815 "num_blocks": 65536, 00:12:32.815 "uuid": "b4b469ac-e337-4bbd-8f09-22caaf2681b2", 00:12:32.815 "assigned_rate_limits": { 00:12:32.815 "rw_ios_per_sec": 0, 00:12:32.815 "rw_mbytes_per_sec": 0, 00:12:32.815 "r_mbytes_per_sec": 0, 00:12:32.815 "w_mbytes_per_sec": 0 00:12:32.815 }, 00:12:32.815 "claimed": true, 00:12:32.815 "claim_type": "exclusive_write", 00:12:32.815 "zoned": false, 00:12:32.815 "supported_io_types": { 00:12:32.815 "read": true, 00:12:32.815 "write": true, 00:12:32.815 "unmap": true, 00:12:32.815 "write_zeroes": true, 00:12:32.815 "flush": true, 00:12:32.815 "reset": true, 00:12:32.815 "compare": false, 00:12:32.815 "compare_and_write": false, 00:12:32.815 "abort": true, 00:12:32.815 "nvme_admin": false, 00:12:32.815 
"nvme_io": false 00:12:32.815 }, 00:12:32.815 "memory_domains": [ 00:12:32.815 { 00:12:32.815 "dma_device_id": "system", 00:12:32.815 "dma_device_type": 1 00:12:32.815 }, 00:12:32.815 { 00:12:32.815 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:32.815 "dma_device_type": 2 00:12:32.815 } 00:12:32.815 ], 00:12:32.815 "driver_specific": {} 00:12:32.815 }' 00:12:32.815 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.815 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:32.815 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:32.815 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:32.815 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:32.815 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:32.815 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:32.815 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:33.074 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:33.074 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:33.074 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:33.074 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:33.074 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:33.074 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:33.074 15:51:38 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:33.333 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:33.333 "name": "BaseBdev2", 00:12:33.333 "aliases": [ 00:12:33.333 "0d77c7b4-94b7-4802-9b76-378b3151d755" 00:12:33.333 ], 00:12:33.333 "product_name": "Malloc disk", 00:12:33.333 "block_size": 512, 00:12:33.333 "num_blocks": 65536, 00:12:33.333 "uuid": "0d77c7b4-94b7-4802-9b76-378b3151d755", 00:12:33.333 "assigned_rate_limits": { 00:12:33.333 "rw_ios_per_sec": 0, 00:12:33.333 "rw_mbytes_per_sec": 0, 00:12:33.333 "r_mbytes_per_sec": 0, 00:12:33.333 "w_mbytes_per_sec": 0 00:12:33.333 }, 00:12:33.333 "claimed": true, 00:12:33.333 "claim_type": "exclusive_write", 00:12:33.333 "zoned": false, 00:12:33.333 "supported_io_types": { 00:12:33.333 "read": true, 00:12:33.333 "write": true, 00:12:33.333 "unmap": true, 00:12:33.333 "write_zeroes": true, 00:12:33.333 "flush": true, 00:12:33.333 "reset": true, 00:12:33.333 "compare": false, 00:12:33.333 "compare_and_write": false, 00:12:33.333 "abort": true, 00:12:33.333 "nvme_admin": false, 00:12:33.333 "nvme_io": false 00:12:33.333 }, 00:12:33.333 "memory_domains": [ 00:12:33.333 { 00:12:33.333 "dma_device_id": "system", 00:12:33.333 "dma_device_type": 1 00:12:33.333 }, 00:12:33.333 { 00:12:33.333 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:33.333 "dma_device_type": 2 00:12:33.333 } 00:12:33.333 ], 00:12:33.333 "driver_specific": {} 00:12:33.333 }' 00:12:33.333 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:33.333 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:33.333 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:33.333 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:33.333 15:51:38 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:33.592 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:33.592 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:33.592 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:33.592 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:33.592 15:51:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:33.592 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:33.592 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:33.592 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:33.592 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:33.592 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:33.850 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:33.850 "name": "BaseBdev3", 00:12:33.850 "aliases": [ 00:12:33.850 "784c030f-22c2-49f0-bce7-ec8b984cd01c" 00:12:33.851 ], 00:12:33.851 "product_name": "Malloc disk", 00:12:33.851 "block_size": 512, 00:12:33.851 "num_blocks": 65536, 00:12:33.851 "uuid": "784c030f-22c2-49f0-bce7-ec8b984cd01c", 00:12:33.851 "assigned_rate_limits": { 00:12:33.851 "rw_ios_per_sec": 0, 00:12:33.851 "rw_mbytes_per_sec": 0, 00:12:33.851 "r_mbytes_per_sec": 0, 00:12:33.851 "w_mbytes_per_sec": 0 00:12:33.851 }, 00:12:33.851 "claimed": true, 00:12:33.851 "claim_type": "exclusive_write", 00:12:33.851 "zoned": false, 00:12:33.851 "supported_io_types": { 00:12:33.851 "read": true, 00:12:33.851 
"write": true, 00:12:33.851 "unmap": true, 00:12:33.851 "write_zeroes": true, 00:12:33.851 "flush": true, 00:12:33.851 "reset": true, 00:12:33.851 "compare": false, 00:12:33.851 "compare_and_write": false, 00:12:33.851 "abort": true, 00:12:33.851 "nvme_admin": false, 00:12:33.851 "nvme_io": false 00:12:33.851 }, 00:12:33.851 "memory_domains": [ 00:12:33.851 { 00:12:33.851 "dma_device_id": "system", 00:12:33.851 "dma_device_type": 1 00:12:33.851 }, 00:12:33.851 { 00:12:33.851 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:33.851 "dma_device_type": 2 00:12:33.851 } 00:12:33.851 ], 00:12:33.851 "driver_specific": {} 00:12:33.851 }' 00:12:33.851 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:34.109 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:34.109 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:34.109 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:34.109 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:34.109 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:34.109 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:34.109 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:34.109 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:34.109 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:34.368 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:34.368 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:34.368 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:34.626 [2024-06-10 15:51:39.926543] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:34.626 [2024-06-10 15:51:39.926568] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:34.626 [2024-06-10 15:51:39.926608] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:34.626 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:12:34.626 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:12:34.627 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:12:34.627 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:12:34.627 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:12:34.627 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 2 00:12:34.627 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:34.627 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:12:34.627 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:34.627 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:34.627 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:12:34.627 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:34.627 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:34.627 15:51:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:34.627 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:34.627 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:34.627 15:51:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:34.886 15:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:34.886 "name": "Existed_Raid", 00:12:34.886 "uuid": "5f4a397e-43c4-43e6-a8f9-c34c52556432", 00:12:34.886 "strip_size_kb": 64, 00:12:34.886 "state": "offline", 00:12:34.886 "raid_level": "raid0", 00:12:34.886 "superblock": true, 00:12:34.886 "num_base_bdevs": 3, 00:12:34.886 "num_base_bdevs_discovered": 2, 00:12:34.886 "num_base_bdevs_operational": 2, 00:12:34.886 "base_bdevs_list": [ 00:12:34.886 { 00:12:34.886 "name": null, 00:12:34.886 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:34.886 "is_configured": false, 00:12:34.886 "data_offset": 2048, 00:12:34.886 "data_size": 63488 00:12:34.886 }, 00:12:34.886 { 00:12:34.886 "name": "BaseBdev2", 00:12:34.886 "uuid": "0d77c7b4-94b7-4802-9b76-378b3151d755", 00:12:34.886 "is_configured": true, 00:12:34.886 "data_offset": 2048, 00:12:34.886 "data_size": 63488 00:12:34.886 }, 00:12:34.886 { 00:12:34.886 "name": "BaseBdev3", 00:12:34.886 "uuid": "784c030f-22c2-49f0-bce7-ec8b984cd01c", 00:12:34.886 "is_configured": true, 00:12:34.886 "data_offset": 2048, 00:12:34.886 "data_size": 63488 00:12:34.886 } 00:12:34.886 ] 00:12:34.886 }' 00:12:34.886 15:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:34.886 15:51:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:35.454 15:51:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:12:35.454 15:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:35.454 15:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:35.454 15:51:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:35.713 15:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:35.713 15:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:35.713 15:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:12:35.971 [2024-06-10 15:51:41.339469] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:35.971 15:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:35.971 15:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:35.972 15:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:35.972 15:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:12:36.230 15:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:12:36.230 15:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:12:36.230 15:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:12:36.489 [2024-06-10 15:51:41.859281] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:36.489 [2024-06-10 15:51:41.859320] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b258c0 name Existed_Raid, state offline 00:12:36.489 15:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:12:36.489 15:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:12:36.489 15:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:36.489 15:51:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:12:36.749 15:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:12:36.749 15:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:12:36.749 15:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:12:36.749 15:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:12:36.749 15:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:36.749 15:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:12:37.008 BaseBdev2 00:12:37.008 15:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:12:37.008 15:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:12:37.008 15:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:37.008 15:51:42 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:12:37.008 15:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:37.008 15:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:37.008 15:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:37.267 15:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:12:37.526 [ 00:12:37.526 { 00:12:37.526 "name": "BaseBdev2", 00:12:37.526 "aliases": [ 00:12:37.526 "907612c7-8045-477a-b823-2167e0843d2f" 00:12:37.526 ], 00:12:37.526 "product_name": "Malloc disk", 00:12:37.526 "block_size": 512, 00:12:37.526 "num_blocks": 65536, 00:12:37.526 "uuid": "907612c7-8045-477a-b823-2167e0843d2f", 00:12:37.526 "assigned_rate_limits": { 00:12:37.526 "rw_ios_per_sec": 0, 00:12:37.526 "rw_mbytes_per_sec": 0, 00:12:37.526 "r_mbytes_per_sec": 0, 00:12:37.526 "w_mbytes_per_sec": 0 00:12:37.526 }, 00:12:37.526 "claimed": false, 00:12:37.526 "zoned": false, 00:12:37.526 "supported_io_types": { 00:12:37.526 "read": true, 00:12:37.526 "write": true, 00:12:37.526 "unmap": true, 00:12:37.526 "write_zeroes": true, 00:12:37.526 "flush": true, 00:12:37.526 "reset": true, 00:12:37.526 "compare": false, 00:12:37.526 "compare_and_write": false, 00:12:37.526 "abort": true, 00:12:37.526 "nvme_admin": false, 00:12:37.526 "nvme_io": false 00:12:37.526 }, 00:12:37.526 "memory_domains": [ 00:12:37.526 { 00:12:37.526 "dma_device_id": "system", 00:12:37.526 "dma_device_type": 1 00:12:37.526 }, 00:12:37.526 { 00:12:37.526 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:37.526 "dma_device_type": 2 00:12:37.526 } 00:12:37.526 ], 
00:12:37.526 "driver_specific": {} 00:12:37.526 } 00:12:37.526 ] 00:12:37.526 15:51:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:12:37.526 15:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:37.526 15:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:37.526 15:51:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:12:37.785 BaseBdev3 00:12:37.785 15:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:12:37.785 15:51:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:12:37.785 15:51:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:37.785 15:51:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:12:37.785 15:51:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:37.785 15:51:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:37.785 15:51:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:38.043 15:51:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:12:38.303 [ 00:12:38.303 { 00:12:38.303 "name": "BaseBdev3", 00:12:38.303 "aliases": [ 00:12:38.303 "fd257a8b-539e-443e-8076-d42413a98818" 00:12:38.303 ], 00:12:38.303 "product_name": "Malloc disk", 00:12:38.303 "block_size": 512, 00:12:38.303 
"num_blocks": 65536, 00:12:38.303 "uuid": "fd257a8b-539e-443e-8076-d42413a98818", 00:12:38.303 "assigned_rate_limits": { 00:12:38.303 "rw_ios_per_sec": 0, 00:12:38.303 "rw_mbytes_per_sec": 0, 00:12:38.303 "r_mbytes_per_sec": 0, 00:12:38.303 "w_mbytes_per_sec": 0 00:12:38.303 }, 00:12:38.303 "claimed": false, 00:12:38.303 "zoned": false, 00:12:38.303 "supported_io_types": { 00:12:38.303 "read": true, 00:12:38.303 "write": true, 00:12:38.303 "unmap": true, 00:12:38.303 "write_zeroes": true, 00:12:38.303 "flush": true, 00:12:38.303 "reset": true, 00:12:38.303 "compare": false, 00:12:38.303 "compare_and_write": false, 00:12:38.303 "abort": true, 00:12:38.303 "nvme_admin": false, 00:12:38.303 "nvme_io": false 00:12:38.303 }, 00:12:38.303 "memory_domains": [ 00:12:38.303 { 00:12:38.303 "dma_device_id": "system", 00:12:38.303 "dma_device_type": 1 00:12:38.303 }, 00:12:38.303 { 00:12:38.303 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:38.303 "dma_device_type": 2 00:12:38.303 } 00:12:38.303 ], 00:12:38.303 "driver_specific": {} 00:12:38.303 } 00:12:38.303 ] 00:12:38.303 15:51:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:12:38.303 15:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:12:38.303 15:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:12:38.303 15:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:12:38.562 [2024-06-10 15:51:43.899569] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:12:38.562 [2024-06-10 15:51:43.899605] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:12:38.562 [2024-06-10 15:51:43.899622] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:38.562 [2024-06-10 15:51:43.901025] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:38.562 15:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:38.562 15:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:38.562 15:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:38.562 15:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:38.562 15:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:38.562 15:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:38.562 15:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:38.562 15:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:38.562 15:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:38.562 15:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:38.562 15:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:38.562 15:51:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:38.821 15:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:38.821 "name": "Existed_Raid", 00:12:38.821 "uuid": "8ba91a1d-1f0c-431e-abfc-3e921a3cafe4", 00:12:38.821 "strip_size_kb": 64, 00:12:38.821 "state": 
"configuring", 00:12:38.821 "raid_level": "raid0", 00:12:38.821 "superblock": true, 00:12:38.821 "num_base_bdevs": 3, 00:12:38.821 "num_base_bdevs_discovered": 2, 00:12:38.821 "num_base_bdevs_operational": 3, 00:12:38.821 "base_bdevs_list": [ 00:12:38.821 { 00:12:38.821 "name": "BaseBdev1", 00:12:38.821 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:38.821 "is_configured": false, 00:12:38.821 "data_offset": 0, 00:12:38.821 "data_size": 0 00:12:38.821 }, 00:12:38.821 { 00:12:38.821 "name": "BaseBdev2", 00:12:38.821 "uuid": "907612c7-8045-477a-b823-2167e0843d2f", 00:12:38.821 "is_configured": true, 00:12:38.821 "data_offset": 2048, 00:12:38.821 "data_size": 63488 00:12:38.821 }, 00:12:38.821 { 00:12:38.821 "name": "BaseBdev3", 00:12:38.821 "uuid": "fd257a8b-539e-443e-8076-d42413a98818", 00:12:38.821 "is_configured": true, 00:12:38.821 "data_offset": 2048, 00:12:38.821 "data_size": 63488 00:12:38.821 } 00:12:38.821 ] 00:12:38.821 }' 00:12:38.821 15:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:38.821 15:51:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:39.388 15:51:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:12:39.647 [2024-06-10 15:51:45.030580] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:12:39.647 15:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:39.647 15:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:39.647 15:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:39.647 15:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 
00:12:39.647 15:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:39.647 15:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:39.647 15:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:39.647 15:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:39.647 15:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:39.647 15:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:39.647 15:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:39.647 15:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:39.906 15:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:39.906 "name": "Existed_Raid", 00:12:39.906 "uuid": "8ba91a1d-1f0c-431e-abfc-3e921a3cafe4", 00:12:39.906 "strip_size_kb": 64, 00:12:39.906 "state": "configuring", 00:12:39.906 "raid_level": "raid0", 00:12:39.906 "superblock": true, 00:12:39.906 "num_base_bdevs": 3, 00:12:39.906 "num_base_bdevs_discovered": 1, 00:12:39.906 "num_base_bdevs_operational": 3, 00:12:39.906 "base_bdevs_list": [ 00:12:39.906 { 00:12:39.906 "name": "BaseBdev1", 00:12:39.906 "uuid": "00000000-0000-0000-0000-000000000000", 00:12:39.906 "is_configured": false, 00:12:39.906 "data_offset": 0, 00:12:39.906 "data_size": 0 00:12:39.906 }, 00:12:39.906 { 00:12:39.906 "name": null, 00:12:39.906 "uuid": "907612c7-8045-477a-b823-2167e0843d2f", 00:12:39.906 "is_configured": false, 00:12:39.906 "data_offset": 2048, 00:12:39.906 "data_size": 63488 00:12:39.906 }, 00:12:39.906 { 00:12:39.906 
"name": "BaseBdev3", 00:12:39.906 "uuid": "fd257a8b-539e-443e-8076-d42413a98818", 00:12:39.906 "is_configured": true, 00:12:39.906 "data_offset": 2048, 00:12:39.906 "data_size": 63488 00:12:39.906 } 00:12:39.906 ] 00:12:39.906 }' 00:12:39.906 15:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:39.906 15:51:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:40.474 15:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:40.474 15:51:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:40.733 15:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:12:40.733 15:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:12:40.992 [2024-06-10 15:51:46.441773] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:12:40.992 BaseBdev1 00:12:40.992 15:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:12:40.992 15:51:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:12:40.992 15:51:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:40.992 15:51:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:12:40.992 15:51:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:40.992 15:51:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:40.992 15:51:46 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:41.251 15:51:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:12:41.509 [ 00:12:41.509 { 00:12:41.509 "name": "BaseBdev1", 00:12:41.509 "aliases": [ 00:12:41.509 "1abab293-eb0f-4717-b233-717d03750317" 00:12:41.509 ], 00:12:41.509 "product_name": "Malloc disk", 00:12:41.509 "block_size": 512, 00:12:41.509 "num_blocks": 65536, 00:12:41.509 "uuid": "1abab293-eb0f-4717-b233-717d03750317", 00:12:41.509 "assigned_rate_limits": { 00:12:41.509 "rw_ios_per_sec": 0, 00:12:41.509 "rw_mbytes_per_sec": 0, 00:12:41.509 "r_mbytes_per_sec": 0, 00:12:41.509 "w_mbytes_per_sec": 0 00:12:41.509 }, 00:12:41.509 "claimed": true, 00:12:41.509 "claim_type": "exclusive_write", 00:12:41.509 "zoned": false, 00:12:41.509 "supported_io_types": { 00:12:41.509 "read": true, 00:12:41.509 "write": true, 00:12:41.509 "unmap": true, 00:12:41.509 "write_zeroes": true, 00:12:41.509 "flush": true, 00:12:41.509 "reset": true, 00:12:41.509 "compare": false, 00:12:41.509 "compare_and_write": false, 00:12:41.510 "abort": true, 00:12:41.510 "nvme_admin": false, 00:12:41.510 "nvme_io": false 00:12:41.510 }, 00:12:41.510 "memory_domains": [ 00:12:41.510 { 00:12:41.510 "dma_device_id": "system", 00:12:41.510 "dma_device_type": 1 00:12:41.510 }, 00:12:41.510 { 00:12:41.510 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:41.510 "dma_device_type": 2 00:12:41.510 } 00:12:41.510 ], 00:12:41.510 "driver_specific": {} 00:12:41.510 } 00:12:41.510 ] 00:12:41.510 15:51:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:12:41.510 15:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state 
Existed_Raid configuring raid0 64 3 00:12:41.510 15:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:41.510 15:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:41.510 15:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:41.510 15:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:41.510 15:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:41.510 15:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:41.510 15:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:41.510 15:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:41.510 15:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:41.510 15:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:41.510 15:51:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:41.769 15:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:41.769 "name": "Existed_Raid", 00:12:41.769 "uuid": "8ba91a1d-1f0c-431e-abfc-3e921a3cafe4", 00:12:41.769 "strip_size_kb": 64, 00:12:41.769 "state": "configuring", 00:12:41.769 "raid_level": "raid0", 00:12:41.769 "superblock": true, 00:12:41.769 "num_base_bdevs": 3, 00:12:41.769 "num_base_bdevs_discovered": 2, 00:12:41.769 "num_base_bdevs_operational": 3, 00:12:41.769 "base_bdevs_list": [ 00:12:41.769 { 00:12:41.769 "name": "BaseBdev1", 00:12:41.769 "uuid": 
"1abab293-eb0f-4717-b233-717d03750317", 00:12:41.769 "is_configured": true, 00:12:41.769 "data_offset": 2048, 00:12:41.769 "data_size": 63488 00:12:41.769 }, 00:12:41.769 { 00:12:41.769 "name": null, 00:12:41.769 "uuid": "907612c7-8045-477a-b823-2167e0843d2f", 00:12:41.769 "is_configured": false, 00:12:41.769 "data_offset": 2048, 00:12:41.769 "data_size": 63488 00:12:41.769 }, 00:12:41.769 { 00:12:41.769 "name": "BaseBdev3", 00:12:41.769 "uuid": "fd257a8b-539e-443e-8076-d42413a98818", 00:12:41.769 "is_configured": true, 00:12:41.769 "data_offset": 2048, 00:12:41.769 "data_size": 63488 00:12:41.769 } 00:12:41.769 ] 00:12:41.769 }' 00:12:41.769 15:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:41.769 15:51:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:42.337 15:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.337 15:51:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:42.596 15:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:12:42.596 15:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:12:42.855 [2024-06-10 15:51:48.326841] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:12:42.855 15:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:42.855 15:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:42.855 15:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:12:42.855 15:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:42.855 15:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:42.855 15:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:42.855 15:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:42.855 15:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:42.855 15:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:42.855 15:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:42.855 15:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:42.855 15:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:43.114 15:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:43.114 "name": "Existed_Raid", 00:12:43.115 "uuid": "8ba91a1d-1f0c-431e-abfc-3e921a3cafe4", 00:12:43.115 "strip_size_kb": 64, 00:12:43.115 "state": "configuring", 00:12:43.115 "raid_level": "raid0", 00:12:43.115 "superblock": true, 00:12:43.115 "num_base_bdevs": 3, 00:12:43.115 "num_base_bdevs_discovered": 1, 00:12:43.115 "num_base_bdevs_operational": 3, 00:12:43.115 "base_bdevs_list": [ 00:12:43.115 { 00:12:43.115 "name": "BaseBdev1", 00:12:43.115 "uuid": "1abab293-eb0f-4717-b233-717d03750317", 00:12:43.115 "is_configured": true, 00:12:43.115 "data_offset": 2048, 00:12:43.115 "data_size": 63488 00:12:43.115 }, 00:12:43.115 { 00:12:43.115 "name": null, 00:12:43.115 "uuid": "907612c7-8045-477a-b823-2167e0843d2f", 
00:12:43.115 "is_configured": false, 00:12:43.115 "data_offset": 2048, 00:12:43.115 "data_size": 63488 00:12:43.115 }, 00:12:43.115 { 00:12:43.115 "name": null, 00:12:43.115 "uuid": "fd257a8b-539e-443e-8076-d42413a98818", 00:12:43.115 "is_configured": false, 00:12:43.115 "data_offset": 2048, 00:12:43.115 "data_size": 63488 00:12:43.115 } 00:12:43.115 ] 00:12:43.115 }' 00:12:43.115 15:51:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:43.115 15:51:48 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:44.051 15:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.051 15:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:44.051 15:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:12:44.051 15:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:12:44.311 [2024-06-10 15:51:49.730620] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:12:44.311 15:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:44.311 15:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:44.311 15:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:44.311 15:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:44.311 15:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local 
strip_size=64 00:12:44.311 15:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:44.311 15:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:44.311 15:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:44.311 15:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:44.311 15:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:44.311 15:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:44.311 15:51:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:44.570 15:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:44.570 "name": "Existed_Raid", 00:12:44.570 "uuid": "8ba91a1d-1f0c-431e-abfc-3e921a3cafe4", 00:12:44.570 "strip_size_kb": 64, 00:12:44.570 "state": "configuring", 00:12:44.570 "raid_level": "raid0", 00:12:44.570 "superblock": true, 00:12:44.570 "num_base_bdevs": 3, 00:12:44.570 "num_base_bdevs_discovered": 2, 00:12:44.570 "num_base_bdevs_operational": 3, 00:12:44.570 "base_bdevs_list": [ 00:12:44.570 { 00:12:44.570 "name": "BaseBdev1", 00:12:44.570 "uuid": "1abab293-eb0f-4717-b233-717d03750317", 00:12:44.570 "is_configured": true, 00:12:44.570 "data_offset": 2048, 00:12:44.570 "data_size": 63488 00:12:44.570 }, 00:12:44.570 { 00:12:44.570 "name": null, 00:12:44.570 "uuid": "907612c7-8045-477a-b823-2167e0843d2f", 00:12:44.570 "is_configured": false, 00:12:44.570 "data_offset": 2048, 00:12:44.570 "data_size": 63488 00:12:44.570 }, 00:12:44.570 { 00:12:44.570 "name": "BaseBdev3", 00:12:44.570 "uuid": "fd257a8b-539e-443e-8076-d42413a98818", 
00:12:44.570 "is_configured": true, 00:12:44.570 "data_offset": 2048, 00:12:44.570 "data_size": 63488 00:12:44.570 } 00:12:44.570 ] 00:12:44.570 }' 00:12:44.570 15:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:44.570 15:51:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:45.208 15:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.208 15:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:12:45.466 15:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:12:45.466 15:51:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:12:45.725 [2024-06-10 15:51:51.118359] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:12:45.725 15:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:45.725 15:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:45.725 15:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:45.725 15:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:45.725 15:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:45.725 15:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:45.725 15:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:45.725 15:51:51 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:45.725 15:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:45.725 15:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:45.725 15:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:45.725 15:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:45.985 15:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:45.985 "name": "Existed_Raid", 00:12:45.985 "uuid": "8ba91a1d-1f0c-431e-abfc-3e921a3cafe4", 00:12:45.985 "strip_size_kb": 64, 00:12:45.985 "state": "configuring", 00:12:45.985 "raid_level": "raid0", 00:12:45.985 "superblock": true, 00:12:45.985 "num_base_bdevs": 3, 00:12:45.985 "num_base_bdevs_discovered": 1, 00:12:45.985 "num_base_bdevs_operational": 3, 00:12:45.985 "base_bdevs_list": [ 00:12:45.985 { 00:12:45.985 "name": null, 00:12:45.985 "uuid": "1abab293-eb0f-4717-b233-717d03750317", 00:12:45.985 "is_configured": false, 00:12:45.985 "data_offset": 2048, 00:12:45.985 "data_size": 63488 00:12:45.985 }, 00:12:45.985 { 00:12:45.985 "name": null, 00:12:45.985 "uuid": "907612c7-8045-477a-b823-2167e0843d2f", 00:12:45.985 "is_configured": false, 00:12:45.985 "data_offset": 2048, 00:12:45.985 "data_size": 63488 00:12:45.985 }, 00:12:45.985 { 00:12:45.985 "name": "BaseBdev3", 00:12:45.985 "uuid": "fd257a8b-539e-443e-8076-d42413a98818", 00:12:45.985 "is_configured": true, 00:12:45.985 "data_offset": 2048, 00:12:45.985 "data_size": 63488 00:12:45.985 } 00:12:45.985 ] 00:12:45.985 }' 00:12:45.985 15:51:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:45.985 15:51:51 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:46.553 15:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:46.553 15:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:12:46.811 15:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:12:46.811 15:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:12:47.070 [2024-06-10 15:51:52.520673] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:12:47.070 15:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 3 00:12:47.070 15:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:47.070 15:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:12:47.070 15:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:47.070 15:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:47.070 15:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:47.070 15:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:47.070 15:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:47.070 15:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:47.070 15:51:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:47.070 15:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:47.070 15:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:47.329 15:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:47.329 "name": "Existed_Raid", 00:12:47.329 "uuid": "8ba91a1d-1f0c-431e-abfc-3e921a3cafe4", 00:12:47.329 "strip_size_kb": 64, 00:12:47.329 "state": "configuring", 00:12:47.329 "raid_level": "raid0", 00:12:47.329 "superblock": true, 00:12:47.329 "num_base_bdevs": 3, 00:12:47.329 "num_base_bdevs_discovered": 2, 00:12:47.329 "num_base_bdevs_operational": 3, 00:12:47.329 "base_bdevs_list": [ 00:12:47.329 { 00:12:47.329 "name": null, 00:12:47.329 "uuid": "1abab293-eb0f-4717-b233-717d03750317", 00:12:47.329 "is_configured": false, 00:12:47.329 "data_offset": 2048, 00:12:47.329 "data_size": 63488 00:12:47.329 }, 00:12:47.329 { 00:12:47.329 "name": "BaseBdev2", 00:12:47.329 "uuid": "907612c7-8045-477a-b823-2167e0843d2f", 00:12:47.329 "is_configured": true, 00:12:47.329 "data_offset": 2048, 00:12:47.329 "data_size": 63488 00:12:47.329 }, 00:12:47.329 { 00:12:47.329 "name": "BaseBdev3", 00:12:47.329 "uuid": "fd257a8b-539e-443e-8076-d42413a98818", 00:12:47.329 "is_configured": true, 00:12:47.329 "data_offset": 2048, 00:12:47.329 "data_size": 63488 00:12:47.329 } 00:12:47.329 ] 00:12:47.329 }' 00:12:47.329 15:51:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:47.329 15:51:52 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:48.265 15:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.265 15:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:12:48.265 15:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:12:48.265 15:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:48.265 15:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:12:48.524 15:51:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 1abab293-eb0f-4717-b233-717d03750317 00:12:48.783 [2024-06-10 15:51:54.180295] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:12:48.783 [2024-06-10 15:51:54.180438] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1b266f0 00:12:48.783 [2024-06-10 15:51:54.180449] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:48.783 [2024-06-10 15:51:54.180636] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1b246d0 00:12:48.783 [2024-06-10 15:51:54.180753] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1b266f0 00:12:48.783 [2024-06-10 15:51:54.180761] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1b266f0 00:12:48.783 [2024-06-10 15:51:54.180853] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:48.783 NewBaseBdev 00:12:48.783 15:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:12:48.783 15:51:54 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:12:48.783 15:51:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:12:48.783 15:51:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:12:48.783 15:51:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:12:48.783 15:51:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:12:48.783 15:51:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:12:49.043 15:51:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:12:49.302 [ 00:12:49.302 { 00:12:49.302 "name": "NewBaseBdev", 00:12:49.302 "aliases": [ 00:12:49.302 "1abab293-eb0f-4717-b233-717d03750317" 00:12:49.302 ], 00:12:49.302 "product_name": "Malloc disk", 00:12:49.302 "block_size": 512, 00:12:49.302 "num_blocks": 65536, 00:12:49.302 "uuid": "1abab293-eb0f-4717-b233-717d03750317", 00:12:49.302 "assigned_rate_limits": { 00:12:49.302 "rw_ios_per_sec": 0, 00:12:49.302 "rw_mbytes_per_sec": 0, 00:12:49.302 "r_mbytes_per_sec": 0, 00:12:49.302 "w_mbytes_per_sec": 0 00:12:49.302 }, 00:12:49.302 "claimed": true, 00:12:49.302 "claim_type": "exclusive_write", 00:12:49.302 "zoned": false, 00:12:49.302 "supported_io_types": { 00:12:49.302 "read": true, 00:12:49.302 "write": true, 00:12:49.302 "unmap": true, 00:12:49.302 "write_zeroes": true, 00:12:49.302 "flush": true, 00:12:49.302 "reset": true, 00:12:49.302 "compare": false, 00:12:49.302 "compare_and_write": false, 00:12:49.302 "abort": true, 00:12:49.302 "nvme_admin": false, 00:12:49.302 "nvme_io": false 00:12:49.302 }, 00:12:49.302 
"memory_domains": [ 00:12:49.302 { 00:12:49.302 "dma_device_id": "system", 00:12:49.302 "dma_device_type": 1 00:12:49.302 }, 00:12:49.302 { 00:12:49.302 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:49.302 "dma_device_type": 2 00:12:49.302 } 00:12:49.302 ], 00:12:49.302 "driver_specific": {} 00:12:49.302 } 00:12:49.302 ] 00:12:49.302 15:51:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:12:49.302 15:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 3 00:12:49.302 15:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:12:49.302 15:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:49.302 15:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:49.302 15:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:49.302 15:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:49.302 15:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:49.302 15:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:49.302 15:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:49.302 15:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:49.302 15:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:49.302 15:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:12:49.561 15:51:54 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:49.561 "name": "Existed_Raid", 00:12:49.561 "uuid": "8ba91a1d-1f0c-431e-abfc-3e921a3cafe4", 00:12:49.561 "strip_size_kb": 64, 00:12:49.562 "state": "online", 00:12:49.562 "raid_level": "raid0", 00:12:49.562 "superblock": true, 00:12:49.562 "num_base_bdevs": 3, 00:12:49.562 "num_base_bdevs_discovered": 3, 00:12:49.562 "num_base_bdevs_operational": 3, 00:12:49.562 "base_bdevs_list": [ 00:12:49.562 { 00:12:49.562 "name": "NewBaseBdev", 00:12:49.562 "uuid": "1abab293-eb0f-4717-b233-717d03750317", 00:12:49.562 "is_configured": true, 00:12:49.562 "data_offset": 2048, 00:12:49.562 "data_size": 63488 00:12:49.562 }, 00:12:49.562 { 00:12:49.562 "name": "BaseBdev2", 00:12:49.562 "uuid": "907612c7-8045-477a-b823-2167e0843d2f", 00:12:49.562 "is_configured": true, 00:12:49.562 "data_offset": 2048, 00:12:49.562 "data_size": 63488 00:12:49.562 }, 00:12:49.562 { 00:12:49.562 "name": "BaseBdev3", 00:12:49.562 "uuid": "fd257a8b-539e-443e-8076-d42413a98818", 00:12:49.562 "is_configured": true, 00:12:49.562 "data_offset": 2048, 00:12:49.562 "data_size": 63488 00:12:49.562 } 00:12:49.562 ] 00:12:49.562 }' 00:12:49.562 15:51:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:49.562 15:51:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:50.130 15:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:12:50.130 15:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:12:50.130 15:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:12:50.130 15:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:50.130 15:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 
00:12:50.130 15:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:12:50.130 15:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:12:50.130 15:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:50.389 [2024-06-10 15:51:55.788907] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:50.389 15:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:50.389 "name": "Existed_Raid", 00:12:50.389 "aliases": [ 00:12:50.389 "8ba91a1d-1f0c-431e-abfc-3e921a3cafe4" 00:12:50.389 ], 00:12:50.389 "product_name": "Raid Volume", 00:12:50.389 "block_size": 512, 00:12:50.389 "num_blocks": 190464, 00:12:50.389 "uuid": "8ba91a1d-1f0c-431e-abfc-3e921a3cafe4", 00:12:50.389 "assigned_rate_limits": { 00:12:50.389 "rw_ios_per_sec": 0, 00:12:50.389 "rw_mbytes_per_sec": 0, 00:12:50.389 "r_mbytes_per_sec": 0, 00:12:50.389 "w_mbytes_per_sec": 0 00:12:50.389 }, 00:12:50.389 "claimed": false, 00:12:50.389 "zoned": false, 00:12:50.389 "supported_io_types": { 00:12:50.389 "read": true, 00:12:50.389 "write": true, 00:12:50.389 "unmap": true, 00:12:50.389 "write_zeroes": true, 00:12:50.389 "flush": true, 00:12:50.389 "reset": true, 00:12:50.389 "compare": false, 00:12:50.389 "compare_and_write": false, 00:12:50.389 "abort": false, 00:12:50.389 "nvme_admin": false, 00:12:50.389 "nvme_io": false 00:12:50.389 }, 00:12:50.389 "memory_domains": [ 00:12:50.389 { 00:12:50.389 "dma_device_id": "system", 00:12:50.389 "dma_device_type": 1 00:12:50.389 }, 00:12:50.389 { 00:12:50.389 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:50.389 "dma_device_type": 2 00:12:50.389 }, 00:12:50.389 { 00:12:50.389 "dma_device_id": "system", 00:12:50.389 "dma_device_type": 1 00:12:50.389 }, 00:12:50.389 { 00:12:50.389 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:12:50.389 "dma_device_type": 2 00:12:50.389 }, 00:12:50.389 { 00:12:50.389 "dma_device_id": "system", 00:12:50.389 "dma_device_type": 1 00:12:50.389 }, 00:12:50.389 { 00:12:50.389 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:50.389 "dma_device_type": 2 00:12:50.389 } 00:12:50.389 ], 00:12:50.389 "driver_specific": { 00:12:50.389 "raid": { 00:12:50.389 "uuid": "8ba91a1d-1f0c-431e-abfc-3e921a3cafe4", 00:12:50.389 "strip_size_kb": 64, 00:12:50.389 "state": "online", 00:12:50.389 "raid_level": "raid0", 00:12:50.389 "superblock": true, 00:12:50.389 "num_base_bdevs": 3, 00:12:50.389 "num_base_bdevs_discovered": 3, 00:12:50.389 "num_base_bdevs_operational": 3, 00:12:50.389 "base_bdevs_list": [ 00:12:50.389 { 00:12:50.389 "name": "NewBaseBdev", 00:12:50.389 "uuid": "1abab293-eb0f-4717-b233-717d03750317", 00:12:50.389 "is_configured": true, 00:12:50.389 "data_offset": 2048, 00:12:50.389 "data_size": 63488 00:12:50.389 }, 00:12:50.389 { 00:12:50.389 "name": "BaseBdev2", 00:12:50.389 "uuid": "907612c7-8045-477a-b823-2167e0843d2f", 00:12:50.389 "is_configured": true, 00:12:50.389 "data_offset": 2048, 00:12:50.389 "data_size": 63488 00:12:50.389 }, 00:12:50.389 { 00:12:50.389 "name": "BaseBdev3", 00:12:50.389 "uuid": "fd257a8b-539e-443e-8076-d42413a98818", 00:12:50.389 "is_configured": true, 00:12:50.389 "data_offset": 2048, 00:12:50.389 "data_size": 63488 00:12:50.389 } 00:12:50.389 ] 00:12:50.389 } 00:12:50.389 } 00:12:50.389 }' 00:12:50.389 15:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:50.389 15:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:12:50.389 BaseBdev2 00:12:50.389 BaseBdev3' 00:12:50.389 15:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:50.389 15:51:55 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:12:50.389 15:51:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:50.648 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:50.648 "name": "NewBaseBdev", 00:12:50.648 "aliases": [ 00:12:50.648 "1abab293-eb0f-4717-b233-717d03750317" 00:12:50.648 ], 00:12:50.648 "product_name": "Malloc disk", 00:12:50.648 "block_size": 512, 00:12:50.648 "num_blocks": 65536, 00:12:50.648 "uuid": "1abab293-eb0f-4717-b233-717d03750317", 00:12:50.648 "assigned_rate_limits": { 00:12:50.648 "rw_ios_per_sec": 0, 00:12:50.648 "rw_mbytes_per_sec": 0, 00:12:50.648 "r_mbytes_per_sec": 0, 00:12:50.648 "w_mbytes_per_sec": 0 00:12:50.648 }, 00:12:50.648 "claimed": true, 00:12:50.648 "claim_type": "exclusive_write", 00:12:50.648 "zoned": false, 00:12:50.648 "supported_io_types": { 00:12:50.648 "read": true, 00:12:50.648 "write": true, 00:12:50.648 "unmap": true, 00:12:50.648 "write_zeroes": true, 00:12:50.648 "flush": true, 00:12:50.648 "reset": true, 00:12:50.648 "compare": false, 00:12:50.648 "compare_and_write": false, 00:12:50.648 "abort": true, 00:12:50.648 "nvme_admin": false, 00:12:50.648 "nvme_io": false 00:12:50.648 }, 00:12:50.648 "memory_domains": [ 00:12:50.648 { 00:12:50.648 "dma_device_id": "system", 00:12:50.648 "dma_device_type": 1 00:12:50.648 }, 00:12:50.648 { 00:12:50.648 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:50.648 "dma_device_type": 2 00:12:50.648 } 00:12:50.648 ], 00:12:50.648 "driver_specific": {} 00:12:50.648 }' 00:12:50.648 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:50.907 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:50.907 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 
00:12:50.907 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:50.907 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:50.907 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:50.907 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:50.907 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:50.907 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:50.907 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:51.166 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:51.166 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:51.166 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:51.166 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:12:51.166 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:51.425 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:51.425 "name": "BaseBdev2", 00:12:51.425 "aliases": [ 00:12:51.425 "907612c7-8045-477a-b823-2167e0843d2f" 00:12:51.425 ], 00:12:51.425 "product_name": "Malloc disk", 00:12:51.425 "block_size": 512, 00:12:51.425 "num_blocks": 65536, 00:12:51.425 "uuid": "907612c7-8045-477a-b823-2167e0843d2f", 00:12:51.425 "assigned_rate_limits": { 00:12:51.425 "rw_ios_per_sec": 0, 00:12:51.425 "rw_mbytes_per_sec": 0, 00:12:51.425 "r_mbytes_per_sec": 0, 00:12:51.425 "w_mbytes_per_sec": 0 00:12:51.425 }, 00:12:51.425 
"claimed": true, 00:12:51.425 "claim_type": "exclusive_write", 00:12:51.425 "zoned": false, 00:12:51.425 "supported_io_types": { 00:12:51.425 "read": true, 00:12:51.425 "write": true, 00:12:51.425 "unmap": true, 00:12:51.425 "write_zeroes": true, 00:12:51.425 "flush": true, 00:12:51.425 "reset": true, 00:12:51.425 "compare": false, 00:12:51.425 "compare_and_write": false, 00:12:51.425 "abort": true, 00:12:51.425 "nvme_admin": false, 00:12:51.425 "nvme_io": false 00:12:51.425 }, 00:12:51.425 "memory_domains": [ 00:12:51.425 { 00:12:51.425 "dma_device_id": "system", 00:12:51.425 "dma_device_type": 1 00:12:51.425 }, 00:12:51.425 { 00:12:51.425 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:51.425 "dma_device_type": 2 00:12:51.425 } 00:12:51.425 ], 00:12:51.425 "driver_specific": {} 00:12:51.425 }' 00:12:51.425 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:51.425 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:51.425 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:51.425 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:51.425 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:51.425 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:51.425 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:51.684 15:51:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:51.684 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:51.684 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:51.684 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:51.684 15:51:57 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:51.684 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:51.684 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:12:51.684 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:51.943 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:51.943 "name": "BaseBdev3", 00:12:51.943 "aliases": [ 00:12:51.943 "fd257a8b-539e-443e-8076-d42413a98818" 00:12:51.943 ], 00:12:51.943 "product_name": "Malloc disk", 00:12:51.943 "block_size": 512, 00:12:51.943 "num_blocks": 65536, 00:12:51.943 "uuid": "fd257a8b-539e-443e-8076-d42413a98818", 00:12:51.943 "assigned_rate_limits": { 00:12:51.943 "rw_ios_per_sec": 0, 00:12:51.943 "rw_mbytes_per_sec": 0, 00:12:51.943 "r_mbytes_per_sec": 0, 00:12:51.943 "w_mbytes_per_sec": 0 00:12:51.943 }, 00:12:51.943 "claimed": true, 00:12:51.943 "claim_type": "exclusive_write", 00:12:51.943 "zoned": false, 00:12:51.943 "supported_io_types": { 00:12:51.943 "read": true, 00:12:51.943 "write": true, 00:12:51.943 "unmap": true, 00:12:51.943 "write_zeroes": true, 00:12:51.943 "flush": true, 00:12:51.943 "reset": true, 00:12:51.943 "compare": false, 00:12:51.943 "compare_and_write": false, 00:12:51.943 "abort": true, 00:12:51.943 "nvme_admin": false, 00:12:51.943 "nvme_io": false 00:12:51.943 }, 00:12:51.943 "memory_domains": [ 00:12:51.943 { 00:12:51.943 "dma_device_id": "system", 00:12:51.943 "dma_device_type": 1 00:12:51.943 }, 00:12:51.943 { 00:12:51.943 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:51.943 "dma_device_type": 2 00:12:51.943 } 00:12:51.943 ], 00:12:51.943 "driver_specific": {} 00:12:51.943 }' 00:12:51.943 15:51:57 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:51.943 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:52.202 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:52.202 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:52.202 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:52.202 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:52.202 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:52.202 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:52.202 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:52.202 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:52.202 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:52.460 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:52.460 15:51:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:12:52.719 [2024-06-10 15:51:57.978523] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:12:52.719 [2024-06-10 15:51:57.978546] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:52.719 [2024-06-10 15:51:57.978596] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:52.719 [2024-06-10 15:51:57.978646] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:52.719 [2024-06-10 15:51:57.978655] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1b266f0 name Existed_Raid, state offline 00:12:52.719 15:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2662281 00:12:52.719 15:51:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 2662281 ']' 00:12:52.719 15:51:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 2662281 00:12:52.719 15:51:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:12:52.719 15:51:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:12:52.720 15:51:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2662281 00:12:52.720 15:51:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:12:52.720 15:51:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:12:52.720 15:51:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2662281' 00:12:52.720 killing process with pid 2662281 00:12:52.720 15:51:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 2662281 00:12:52.720 [2024-06-10 15:51:58.046306] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:12:52.720 15:51:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 2662281 00:12:52.720 [2024-06-10 15:51:58.071036] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:12:52.979 15:51:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:12:52.979 00:12:52.979 real 0m29.777s 00:12:52.979 user 0m55.758s 00:12:52.979 sys 0m4.233s 00:12:52.979 15:51:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:12:52.979 15:51:58 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:12:52.979 ************************************ 00:12:52.979 END TEST raid_state_function_test_sb 00:12:52.979 ************************************ 00:12:52.979 15:51:58 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 3 00:12:52.979 15:51:58 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:12:52.979 15:51:58 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:12:52.979 15:51:58 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:12:52.979 ************************************ 00:12:52.979 START TEST raid_superblock_test 00:12:52.979 ************************************ 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid0 3 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 
-- # local strip_size_create_arg 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2667793 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2667793 /var/tmp/spdk-raid.sock 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 2667793 ']' 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:12:52.979 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:12:52.979 15:51:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:52.979 [2024-06-10 15:51:58.399330] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:12:52.979 [2024-06-10 15:51:58.399385] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2667793 ] 00:12:53.238 [2024-06-10 15:51:58.498801] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:12:53.238 [2024-06-10 15:51:58.588604] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:12:53.238 [2024-06-10 15:51:58.647344] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:53.238 [2024-06-10 15:51:58.647372] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:12:53.806 15:51:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:12:53.806 15:51:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:12:53.806 15:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:12:53.806 15:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:53.806 15:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:12:53.806 15:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:12:53.806 15:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:12:53.806 15:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:53.806 15:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:53.806 15:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:53.806 15:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:12:54.065 malloc1 00:12:54.065 15:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:12:54.324 [2024-06-10 15:51:59.765787] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:12:54.324 [2024-06-10 15:51:59.765834] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:54.324 [2024-06-10 15:51:59.765850] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe710f0 00:12:54.324 [2024-06-10 15:51:59.765865] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:54.324 [2024-06-10 15:51:59.767523] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:54.324 [2024-06-10 15:51:59.767552] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:12:54.324 pt1 00:12:54.324 15:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:54.324 15:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:54.324 15:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:12:54.324 15:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:12:54.324 15:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:12:54.324 15:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:54.324 15:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:54.324 15:51:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:54.324 15:51:59 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:12:54.583 malloc2 00:12:54.583 15:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:12:54.841 [2024-06-10 15:52:00.291995] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:12:54.841 [2024-06-10 15:52:00.292038] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:54.841 [2024-06-10 15:52:00.292052] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe72400 00:12:54.841 [2024-06-10 15:52:00.292061] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:54.842 [2024-06-10 15:52:00.293576] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:54.842 [2024-06-10 15:52:00.293602] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:12:54.842 pt2 00:12:54.842 15:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:54.842 15:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:54.842 15:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:12:54.842 15:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:12:54.842 15:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:12:54.842 15:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:12:54.842 15:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:12:54.842 15:52:00 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:12:54.842 15:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:12:55.101 malloc3 00:12:55.101 15:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:12:55.360 [2024-06-10 15:52:00.813723] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:12:55.360 [2024-06-10 15:52:00.813764] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:12:55.360 [2024-06-10 15:52:00.813777] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x101e200 00:12:55.360 [2024-06-10 15:52:00.813787] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:12:55.360 [2024-06-10 15:52:00.815266] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:12:55.360 [2024-06-10 15:52:00.815293] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:12:55.360 pt3 00:12:55.360 15:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:12:55.360 15:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:12:55.360 15:52:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:12:55.619 [2024-06-10 15:52:01.074422] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:12:55.619 [2024-06-10 15:52:01.075648] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:12:55.619 [2024-06-10 
15:52:01.075702] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:12:55.619 [2024-06-10 15:52:01.075858] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x101c9f0 00:12:55.619 [2024-06-10 15:52:01.075869] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:12:55.619 [2024-06-10 15:52:01.076058] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x101d830 00:12:55.619 [2024-06-10 15:52:01.076198] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x101c9f0 00:12:55.619 [2024-06-10 15:52:01.076206] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x101c9f0 00:12:55.619 [2024-06-10 15:52:01.076297] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:12:55.619 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:12:55.619 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:12:55.619 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:12:55.619 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:12:55.619 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:12:55.619 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:12:55.619 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:12:55.619 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:12:55.619 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:12:55.619 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:12:55.619 15:52:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:55.619 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:12:55.878 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:12:55.878 "name": "raid_bdev1", 00:12:55.878 "uuid": "95eec627-628c-43ed-a5f1-fe7a51e5caa2", 00:12:55.878 "strip_size_kb": 64, 00:12:55.878 "state": "online", 00:12:55.878 "raid_level": "raid0", 00:12:55.878 "superblock": true, 00:12:55.878 "num_base_bdevs": 3, 00:12:55.878 "num_base_bdevs_discovered": 3, 00:12:55.878 "num_base_bdevs_operational": 3, 00:12:55.878 "base_bdevs_list": [ 00:12:55.878 { 00:12:55.878 "name": "pt1", 00:12:55.878 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:55.878 "is_configured": true, 00:12:55.878 "data_offset": 2048, 00:12:55.878 "data_size": 63488 00:12:55.878 }, 00:12:55.878 { 00:12:55.878 "name": "pt2", 00:12:55.878 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:55.878 "is_configured": true, 00:12:55.878 "data_offset": 2048, 00:12:55.878 "data_size": 63488 00:12:55.878 }, 00:12:55.878 { 00:12:55.878 "name": "pt3", 00:12:55.878 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:55.878 "is_configured": true, 00:12:55.878 "data_offset": 2048, 00:12:55.878 "data_size": 63488 00:12:55.878 } 00:12:55.878 ] 00:12:55.878 }' 00:12:55.878 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:12:55.878 15:52:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:12:56.816 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:12:56.816 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:12:56.816 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 
-- # local raid_bdev_info 00:12:56.816 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:12:56.816 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:12:56.816 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:12:56.816 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:56.816 15:52:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:12:56.816 [2024-06-10 15:52:02.217726] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:56.816 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:12:56.816 "name": "raid_bdev1", 00:12:56.816 "aliases": [ 00:12:56.816 "95eec627-628c-43ed-a5f1-fe7a51e5caa2" 00:12:56.816 ], 00:12:56.816 "product_name": "Raid Volume", 00:12:56.816 "block_size": 512, 00:12:56.816 "num_blocks": 190464, 00:12:56.816 "uuid": "95eec627-628c-43ed-a5f1-fe7a51e5caa2", 00:12:56.816 "assigned_rate_limits": { 00:12:56.816 "rw_ios_per_sec": 0, 00:12:56.816 "rw_mbytes_per_sec": 0, 00:12:56.816 "r_mbytes_per_sec": 0, 00:12:56.816 "w_mbytes_per_sec": 0 00:12:56.816 }, 00:12:56.816 "claimed": false, 00:12:56.816 "zoned": false, 00:12:56.816 "supported_io_types": { 00:12:56.816 "read": true, 00:12:56.816 "write": true, 00:12:56.816 "unmap": true, 00:12:56.816 "write_zeroes": true, 00:12:56.816 "flush": true, 00:12:56.816 "reset": true, 00:12:56.816 "compare": false, 00:12:56.816 "compare_and_write": false, 00:12:56.816 "abort": false, 00:12:56.816 "nvme_admin": false, 00:12:56.816 "nvme_io": false 00:12:56.816 }, 00:12:56.816 "memory_domains": [ 00:12:56.816 { 00:12:56.816 "dma_device_id": "system", 00:12:56.816 "dma_device_type": 1 00:12:56.816 }, 00:12:56.816 { 00:12:56.816 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:12:56.816 "dma_device_type": 2 00:12:56.816 }, 00:12:56.816 { 00:12:56.816 "dma_device_id": "system", 00:12:56.816 "dma_device_type": 1 00:12:56.816 }, 00:12:56.816 { 00:12:56.816 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:56.816 "dma_device_type": 2 00:12:56.816 }, 00:12:56.816 { 00:12:56.816 "dma_device_id": "system", 00:12:56.816 "dma_device_type": 1 00:12:56.816 }, 00:12:56.816 { 00:12:56.816 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:56.816 "dma_device_type": 2 00:12:56.816 } 00:12:56.816 ], 00:12:56.816 "driver_specific": { 00:12:56.816 "raid": { 00:12:56.816 "uuid": "95eec627-628c-43ed-a5f1-fe7a51e5caa2", 00:12:56.816 "strip_size_kb": 64, 00:12:56.816 "state": "online", 00:12:56.816 "raid_level": "raid0", 00:12:56.816 "superblock": true, 00:12:56.816 "num_base_bdevs": 3, 00:12:56.816 "num_base_bdevs_discovered": 3, 00:12:56.816 "num_base_bdevs_operational": 3, 00:12:56.816 "base_bdevs_list": [ 00:12:56.816 { 00:12:56.816 "name": "pt1", 00:12:56.816 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:56.816 "is_configured": true, 00:12:56.816 "data_offset": 2048, 00:12:56.816 "data_size": 63488 00:12:56.816 }, 00:12:56.816 { 00:12:56.816 "name": "pt2", 00:12:56.816 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:56.816 "is_configured": true, 00:12:56.816 "data_offset": 2048, 00:12:56.816 "data_size": 63488 00:12:56.816 }, 00:12:56.816 { 00:12:56.816 "name": "pt3", 00:12:56.816 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:56.816 "is_configured": true, 00:12:56.816 "data_offset": 2048, 00:12:56.816 "data_size": 63488 00:12:56.816 } 00:12:56.816 ] 00:12:56.816 } 00:12:56.816 } 00:12:56.816 }' 00:12:56.816 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:12:56.816 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:12:56.816 pt2 00:12:56.816 pt3' 00:12:56.816 
15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:56.816 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:12:56.816 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:57.075 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:57.075 "name": "pt1", 00:12:57.075 "aliases": [ 00:12:57.075 "00000000-0000-0000-0000-000000000001" 00:12:57.075 ], 00:12:57.075 "product_name": "passthru", 00:12:57.075 "block_size": 512, 00:12:57.075 "num_blocks": 65536, 00:12:57.075 "uuid": "00000000-0000-0000-0000-000000000001", 00:12:57.075 "assigned_rate_limits": { 00:12:57.075 "rw_ios_per_sec": 0, 00:12:57.075 "rw_mbytes_per_sec": 0, 00:12:57.075 "r_mbytes_per_sec": 0, 00:12:57.075 "w_mbytes_per_sec": 0 00:12:57.075 }, 00:12:57.075 "claimed": true, 00:12:57.075 "claim_type": "exclusive_write", 00:12:57.075 "zoned": false, 00:12:57.075 "supported_io_types": { 00:12:57.075 "read": true, 00:12:57.075 "write": true, 00:12:57.075 "unmap": true, 00:12:57.075 "write_zeroes": true, 00:12:57.075 "flush": true, 00:12:57.075 "reset": true, 00:12:57.075 "compare": false, 00:12:57.075 "compare_and_write": false, 00:12:57.075 "abort": true, 00:12:57.075 "nvme_admin": false, 00:12:57.075 "nvme_io": false 00:12:57.075 }, 00:12:57.075 "memory_domains": [ 00:12:57.075 { 00:12:57.075 "dma_device_id": "system", 00:12:57.075 "dma_device_type": 1 00:12:57.075 }, 00:12:57.075 { 00:12:57.075 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.075 "dma_device_type": 2 00:12:57.075 } 00:12:57.075 ], 00:12:57.075 "driver_specific": { 00:12:57.075 "passthru": { 00:12:57.075 "name": "pt1", 00:12:57.075 "base_bdev_name": "malloc1" 00:12:57.075 } 00:12:57.075 } 00:12:57.075 }' 00:12:57.075 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- 
# jq .block_size 00:12:57.334 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:57.334 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:57.334 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:57.334 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:57.334 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:57.334 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:57.334 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:57.334 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:57.334 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:57.593 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:57.593 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:57.593 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:57.593 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:12:57.593 15:52:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:57.852 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:57.852 "name": "pt2", 00:12:57.852 "aliases": [ 00:12:57.852 "00000000-0000-0000-0000-000000000002" 00:12:57.852 ], 00:12:57.852 "product_name": "passthru", 00:12:57.852 "block_size": 512, 00:12:57.852 "num_blocks": 65536, 00:12:57.852 "uuid": "00000000-0000-0000-0000-000000000002", 00:12:57.852 "assigned_rate_limits": { 00:12:57.852 "rw_ios_per_sec": 0, 00:12:57.852 
"rw_mbytes_per_sec": 0, 00:12:57.852 "r_mbytes_per_sec": 0, 00:12:57.852 "w_mbytes_per_sec": 0 00:12:57.852 }, 00:12:57.852 "claimed": true, 00:12:57.852 "claim_type": "exclusive_write", 00:12:57.852 "zoned": false, 00:12:57.852 "supported_io_types": { 00:12:57.852 "read": true, 00:12:57.852 "write": true, 00:12:57.852 "unmap": true, 00:12:57.852 "write_zeroes": true, 00:12:57.852 "flush": true, 00:12:57.852 "reset": true, 00:12:57.852 "compare": false, 00:12:57.852 "compare_and_write": false, 00:12:57.852 "abort": true, 00:12:57.852 "nvme_admin": false, 00:12:57.852 "nvme_io": false 00:12:57.852 }, 00:12:57.852 "memory_domains": [ 00:12:57.852 { 00:12:57.852 "dma_device_id": "system", 00:12:57.852 "dma_device_type": 1 00:12:57.852 }, 00:12:57.852 { 00:12:57.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:57.852 "dma_device_type": 2 00:12:57.852 } 00:12:57.852 ], 00:12:57.852 "driver_specific": { 00:12:57.852 "passthru": { 00:12:57.852 "name": "pt2", 00:12:57.852 "base_bdev_name": "malloc2" 00:12:57.852 } 00:12:57.852 } 00:12:57.852 }' 00:12:57.852 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:57.852 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:57.852 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:57.852 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:57.852 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:57.852 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:57.852 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:58.111 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:58.111 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:58.111 15:52:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:58.111 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:58.111 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:58.111 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:12:58.111 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:12:58.111 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:12:58.370 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:12:58.370 "name": "pt3", 00:12:58.370 "aliases": [ 00:12:58.370 "00000000-0000-0000-0000-000000000003" 00:12:58.370 ], 00:12:58.370 "product_name": "passthru", 00:12:58.370 "block_size": 512, 00:12:58.370 "num_blocks": 65536, 00:12:58.370 "uuid": "00000000-0000-0000-0000-000000000003", 00:12:58.370 "assigned_rate_limits": { 00:12:58.370 "rw_ios_per_sec": 0, 00:12:58.370 "rw_mbytes_per_sec": 0, 00:12:58.370 "r_mbytes_per_sec": 0, 00:12:58.370 "w_mbytes_per_sec": 0 00:12:58.370 }, 00:12:58.370 "claimed": true, 00:12:58.370 "claim_type": "exclusive_write", 00:12:58.370 "zoned": false, 00:12:58.370 "supported_io_types": { 00:12:58.370 "read": true, 00:12:58.370 "write": true, 00:12:58.370 "unmap": true, 00:12:58.370 "write_zeroes": true, 00:12:58.370 "flush": true, 00:12:58.370 "reset": true, 00:12:58.370 "compare": false, 00:12:58.370 "compare_and_write": false, 00:12:58.370 "abort": true, 00:12:58.370 "nvme_admin": false, 00:12:58.370 "nvme_io": false 00:12:58.370 }, 00:12:58.370 "memory_domains": [ 00:12:58.370 { 00:12:58.370 "dma_device_id": "system", 00:12:58.370 "dma_device_type": 1 00:12:58.370 }, 00:12:58.370 { 00:12:58.370 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:12:58.370 "dma_device_type": 2 
00:12:58.370 } 00:12:58.370 ], 00:12:58.370 "driver_specific": { 00:12:58.370 "passthru": { 00:12:58.370 "name": "pt3", 00:12:58.370 "base_bdev_name": "malloc3" 00:12:58.370 } 00:12:58.370 } 00:12:58.370 }' 00:12:58.370 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:58.370 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:12:58.629 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:12:58.629 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:58.629 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:12:58.629 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:12:58.629 15:52:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:58.629 15:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:12:58.629 15:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:12:58.629 15:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:58.629 15:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:12:58.889 15:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:12:58.889 15:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:12:58.889 15:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:12:58.889 [2024-06-10 15:52:04.387495] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:12:59.180 15:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=95eec627-628c-43ed-a5f1-fe7a51e5caa2 00:12:59.180 15:52:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 95eec627-628c-43ed-a5f1-fe7a51e5caa2 ']' 00:12:59.180 15:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:12:59.180 [2024-06-10 15:52:04.639910] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:12:59.180 [2024-06-10 15:52:04.639931] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:12:59.180 [2024-06-10 15:52:04.639991] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:12:59.180 [2024-06-10 15:52:04.640045] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:12:59.180 [2024-06-10 15:52:04.640054] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x101c9f0 name raid_bdev1, state offline 00:12:59.180 15:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:12:59.180 15:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:12:59.438 15:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:12:59.438 15:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:12:59.438 15:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:59.438 15:52:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:12:59.697 15:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:59.697 15:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:12:59.955 15:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:12:59.955 15:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:13:00.214 15:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:13:00.214 15:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:13:00.473 15:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:13:00.473 15:52:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:00.473 15:52:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:13:00.473 15:52:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:00.473 15:52:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:00.473 15:52:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:13:00.473 15:52:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:00.473 15:52:05 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:13:00.473 15:52:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:00.473 15:52:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:13:00.473 15:52:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:13:00.473 15:52:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:13:00.473 15:52:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:13:00.733 [2024-06-10 15:52:06.163910] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:13:00.733 [2024-06-10 15:52:06.165342] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:13:00.733 [2024-06-10 15:52:06.165387] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:13:00.733 [2024-06-10 15:52:06.165434] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:13:00.733 [2024-06-10 15:52:06.165472] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:13:00.733 [2024-06-10 15:52:06.165492] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:13:00.733 [2024-06-10 15:52:06.165513] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:00.733 [2024-06-10 15:52:06.165522] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: 
raid_bdev_cleanup, 0x101a3a0 name raid_bdev1, state configuring 00:13:00.733 request: 00:13:00.733 { 00:13:00.733 "name": "raid_bdev1", 00:13:00.733 "raid_level": "raid0", 00:13:00.733 "base_bdevs": [ 00:13:00.733 "malloc1", 00:13:00.733 "malloc2", 00:13:00.733 "malloc3" 00:13:00.733 ], 00:13:00.733 "superblock": false, 00:13:00.733 "strip_size_kb": 64, 00:13:00.733 "method": "bdev_raid_create", 00:13:00.733 "req_id": 1 00:13:00.733 } 00:13:00.733 Got JSON-RPC error response 00:13:00.733 response: 00:13:00.733 { 00:13:00.733 "code": -17, 00:13:00.733 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:13:00.733 } 00:13:00.733 15:52:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:13:00.733 15:52:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:13:00.733 15:52:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:13:00.733 15:52:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:13:00.733 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:00.733 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:13:00.992 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:13:00.992 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:13:00.992 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:13:01.250 [2024-06-10 15:52:06.661175] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:13:01.250 [2024-06-10 15:52:06.661218] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: 
base bdev opened 00:13:01.250 [2024-06-10 15:52:06.661233] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1020320 00:13:01.250 [2024-06-10 15:52:06.661243] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:01.250 [2024-06-10 15:52:06.662930] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:01.250 [2024-06-10 15:52:06.662964] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:13:01.250 [2024-06-10 15:52:06.663028] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:13:01.250 [2024-06-10 15:52:06.663054] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:13:01.250 pt1 00:13:01.250 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:13:01.250 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:01.250 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:01.250 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:01.250 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:01.250 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:01.250 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:01.250 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:01.250 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:01.250 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:01.250 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:01.250 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:01.509 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:01.509 "name": "raid_bdev1", 00:13:01.509 "uuid": "95eec627-628c-43ed-a5f1-fe7a51e5caa2", 00:13:01.509 "strip_size_kb": 64, 00:13:01.509 "state": "configuring", 00:13:01.509 "raid_level": "raid0", 00:13:01.509 "superblock": true, 00:13:01.509 "num_base_bdevs": 3, 00:13:01.509 "num_base_bdevs_discovered": 1, 00:13:01.509 "num_base_bdevs_operational": 3, 00:13:01.509 "base_bdevs_list": [ 00:13:01.509 { 00:13:01.509 "name": "pt1", 00:13:01.509 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:01.509 "is_configured": true, 00:13:01.509 "data_offset": 2048, 00:13:01.509 "data_size": 63488 00:13:01.509 }, 00:13:01.509 { 00:13:01.509 "name": null, 00:13:01.509 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:01.509 "is_configured": false, 00:13:01.509 "data_offset": 2048, 00:13:01.509 "data_size": 63488 00:13:01.509 }, 00:13:01.509 { 00:13:01.509 "name": null, 00:13:01.509 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:01.509 "is_configured": false, 00:13:01.509 "data_offset": 2048, 00:13:01.509 "data_size": 63488 00:13:01.509 } 00:13:01.509 ] 00:13:01.509 }' 00:13:01.509 15:52:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:01.509 15:52:06 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:02.077 15:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:13:02.077 15:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:02.336 
[2024-06-10 15:52:07.776331] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:02.336 [2024-06-10 15:52:07.776377] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:02.336 [2024-06-10 15:52:07.776391] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x101bb20 00:13:02.336 [2024-06-10 15:52:07.776400] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:02.336 [2024-06-10 15:52:07.776747] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:02.336 [2024-06-10 15:52:07.776763] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:02.336 [2024-06-10 15:52:07.776821] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:02.336 [2024-06-10 15:52:07.776839] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:02.336 pt2 00:13:02.336 15:52:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:13:02.594 [2024-06-10 15:52:08.029017] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:13:02.594 15:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 3 00:13:02.594 15:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:02.594 15:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:02.594 15:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:02.594 15:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:02.594 15:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:02.594 15:52:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:02.594 15:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:02.594 15:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:02.594 15:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:02.595 15:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:02.595 15:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:02.853 15:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:02.853 "name": "raid_bdev1", 00:13:02.853 "uuid": "95eec627-628c-43ed-a5f1-fe7a51e5caa2", 00:13:02.853 "strip_size_kb": 64, 00:13:02.853 "state": "configuring", 00:13:02.853 "raid_level": "raid0", 00:13:02.853 "superblock": true, 00:13:02.853 "num_base_bdevs": 3, 00:13:02.853 "num_base_bdevs_discovered": 1, 00:13:02.853 "num_base_bdevs_operational": 3, 00:13:02.853 "base_bdevs_list": [ 00:13:02.853 { 00:13:02.853 "name": "pt1", 00:13:02.853 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:02.853 "is_configured": true, 00:13:02.853 "data_offset": 2048, 00:13:02.853 "data_size": 63488 00:13:02.853 }, 00:13:02.853 { 00:13:02.853 "name": null, 00:13:02.853 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:02.853 "is_configured": false, 00:13:02.853 "data_offset": 2048, 00:13:02.853 "data_size": 63488 00:13:02.853 }, 00:13:02.853 { 00:13:02.853 "name": null, 00:13:02.853 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:02.853 "is_configured": false, 00:13:02.853 "data_offset": 2048, 00:13:02.854 "data_size": 63488 00:13:02.854 } 00:13:02.854 ] 00:13:02.854 }' 00:13:02.854 15:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:13:02.854 15:52:08 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:03.789 15:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:13:03.789 15:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:03.789 15:52:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:13:03.789 [2024-06-10 15:52:09.180082] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:13:03.789 [2024-06-10 15:52:09.180129] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:03.789 [2024-06-10 15:52:09.180145] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1022a20 00:13:03.789 [2024-06-10 15:52:09.180155] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:03.789 [2024-06-10 15:52:09.180504] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:03.789 [2024-06-10 15:52:09.180520] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:13:03.789 [2024-06-10 15:52:09.180577] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:13:03.789 [2024-06-10 15:52:09.180595] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:13:03.789 pt2 00:13:03.789 15:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:03.789 15:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:03.789 15:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 
00000000-0000-0000-0000-000000000003 00:13:04.046 [2024-06-10 15:52:09.436782] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:13:04.046 [2024-06-10 15:52:09.436818] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:04.047 [2024-06-10 15:52:09.436831] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1020d30 00:13:04.047 [2024-06-10 15:52:09.436840] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:04.047 [2024-06-10 15:52:09.437152] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:04.047 [2024-06-10 15:52:09.437169] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:13:04.047 [2024-06-10 15:52:09.437217] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:13:04.047 [2024-06-10 15:52:09.437233] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:13:04.047 [2024-06-10 15:52:09.437339] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x101f560 00:13:04.047 [2024-06-10 15:52:09.437348] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:04.047 [2024-06-10 15:52:09.437520] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1022470 00:13:04.047 [2024-06-10 15:52:09.437649] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x101f560 00:13:04.047 [2024-06-10 15:52:09.437657] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x101f560 00:13:04.047 [2024-06-10 15:52:09.437754] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:04.047 pt3 00:13:04.047 15:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:13:04.047 15:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:13:04.047 15:52:09 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:04.047 15:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:04.047 15:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:04.047 15:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:04.047 15:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:04.047 15:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:04.047 15:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:04.047 15:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:04.047 15:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:04.047 15:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:04.047 15:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:04.047 15:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:04.304 15:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:04.304 "name": "raid_bdev1", 00:13:04.304 "uuid": "95eec627-628c-43ed-a5f1-fe7a51e5caa2", 00:13:04.304 "strip_size_kb": 64, 00:13:04.304 "state": "online", 00:13:04.304 "raid_level": "raid0", 00:13:04.304 "superblock": true, 00:13:04.304 "num_base_bdevs": 3, 00:13:04.304 "num_base_bdevs_discovered": 3, 00:13:04.304 "num_base_bdevs_operational": 3, 00:13:04.304 "base_bdevs_list": [ 00:13:04.304 { 00:13:04.304 "name": "pt1", 00:13:04.304 "uuid": 
"00000000-0000-0000-0000-000000000001", 00:13:04.304 "is_configured": true, 00:13:04.304 "data_offset": 2048, 00:13:04.304 "data_size": 63488 00:13:04.304 }, 00:13:04.304 { 00:13:04.304 "name": "pt2", 00:13:04.304 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:04.304 "is_configured": true, 00:13:04.304 "data_offset": 2048, 00:13:04.304 "data_size": 63488 00:13:04.304 }, 00:13:04.304 { 00:13:04.304 "name": "pt3", 00:13:04.304 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:04.304 "is_configured": true, 00:13:04.304 "data_offset": 2048, 00:13:04.304 "data_size": 63488 00:13:04.304 } 00:13:04.304 ] 00:13:04.304 }' 00:13:04.304 15:52:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:04.304 15:52:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:04.870 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:13:04.870 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:13:04.870 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:04.870 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:04.870 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:04.870 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:04.870 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:04.870 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:05.128 [2024-06-10 15:52:10.507914] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:05.128 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:05.128 
"name": "raid_bdev1", 00:13:05.128 "aliases": [ 00:13:05.128 "95eec627-628c-43ed-a5f1-fe7a51e5caa2" 00:13:05.128 ], 00:13:05.128 "product_name": "Raid Volume", 00:13:05.128 "block_size": 512, 00:13:05.128 "num_blocks": 190464, 00:13:05.128 "uuid": "95eec627-628c-43ed-a5f1-fe7a51e5caa2", 00:13:05.128 "assigned_rate_limits": { 00:13:05.128 "rw_ios_per_sec": 0, 00:13:05.128 "rw_mbytes_per_sec": 0, 00:13:05.128 "r_mbytes_per_sec": 0, 00:13:05.128 "w_mbytes_per_sec": 0 00:13:05.128 }, 00:13:05.128 "claimed": false, 00:13:05.128 "zoned": false, 00:13:05.128 "supported_io_types": { 00:13:05.128 "read": true, 00:13:05.128 "write": true, 00:13:05.128 "unmap": true, 00:13:05.128 "write_zeroes": true, 00:13:05.128 "flush": true, 00:13:05.128 "reset": true, 00:13:05.128 "compare": false, 00:13:05.128 "compare_and_write": false, 00:13:05.128 "abort": false, 00:13:05.128 "nvme_admin": false, 00:13:05.128 "nvme_io": false 00:13:05.128 }, 00:13:05.128 "memory_domains": [ 00:13:05.128 { 00:13:05.128 "dma_device_id": "system", 00:13:05.128 "dma_device_type": 1 00:13:05.128 }, 00:13:05.128 { 00:13:05.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:05.128 "dma_device_type": 2 00:13:05.128 }, 00:13:05.128 { 00:13:05.128 "dma_device_id": "system", 00:13:05.128 "dma_device_type": 1 00:13:05.128 }, 00:13:05.128 { 00:13:05.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:05.128 "dma_device_type": 2 00:13:05.128 }, 00:13:05.128 { 00:13:05.128 "dma_device_id": "system", 00:13:05.128 "dma_device_type": 1 00:13:05.128 }, 00:13:05.128 { 00:13:05.128 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:05.128 "dma_device_type": 2 00:13:05.128 } 00:13:05.128 ], 00:13:05.128 "driver_specific": { 00:13:05.128 "raid": { 00:13:05.128 "uuid": "95eec627-628c-43ed-a5f1-fe7a51e5caa2", 00:13:05.128 "strip_size_kb": 64, 00:13:05.128 "state": "online", 00:13:05.128 "raid_level": "raid0", 00:13:05.128 "superblock": true, 00:13:05.128 "num_base_bdevs": 3, 00:13:05.128 "num_base_bdevs_discovered": 3, 
00:13:05.128 "num_base_bdevs_operational": 3, 00:13:05.128 "base_bdevs_list": [ 00:13:05.128 { 00:13:05.128 "name": "pt1", 00:13:05.128 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:05.128 "is_configured": true, 00:13:05.128 "data_offset": 2048, 00:13:05.128 "data_size": 63488 00:13:05.128 }, 00:13:05.128 { 00:13:05.128 "name": "pt2", 00:13:05.128 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:05.128 "is_configured": true, 00:13:05.128 "data_offset": 2048, 00:13:05.128 "data_size": 63488 00:13:05.128 }, 00:13:05.128 { 00:13:05.128 "name": "pt3", 00:13:05.128 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:05.128 "is_configured": true, 00:13:05.128 "data_offset": 2048, 00:13:05.128 "data_size": 63488 00:13:05.128 } 00:13:05.128 ] 00:13:05.128 } 00:13:05.128 } 00:13:05.128 }' 00:13:05.128 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:05.128 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:13:05.128 pt2 00:13:05.128 pt3' 00:13:05.128 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:05.128 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:13:05.128 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:05.386 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:05.386 "name": "pt1", 00:13:05.386 "aliases": [ 00:13:05.386 "00000000-0000-0000-0000-000000000001" 00:13:05.386 ], 00:13:05.386 "product_name": "passthru", 00:13:05.386 "block_size": 512, 00:13:05.386 "num_blocks": 65536, 00:13:05.386 "uuid": "00000000-0000-0000-0000-000000000001", 00:13:05.386 "assigned_rate_limits": { 00:13:05.386 "rw_ios_per_sec": 0, 00:13:05.386 
"rw_mbytes_per_sec": 0, 00:13:05.386 "r_mbytes_per_sec": 0, 00:13:05.386 "w_mbytes_per_sec": 0 00:13:05.386 }, 00:13:05.386 "claimed": true, 00:13:05.386 "claim_type": "exclusive_write", 00:13:05.386 "zoned": false, 00:13:05.386 "supported_io_types": { 00:13:05.386 "read": true, 00:13:05.386 "write": true, 00:13:05.386 "unmap": true, 00:13:05.386 "write_zeroes": true, 00:13:05.386 "flush": true, 00:13:05.386 "reset": true, 00:13:05.386 "compare": false, 00:13:05.386 "compare_and_write": false, 00:13:05.386 "abort": true, 00:13:05.386 "nvme_admin": false, 00:13:05.386 "nvme_io": false 00:13:05.386 }, 00:13:05.386 "memory_domains": [ 00:13:05.386 { 00:13:05.386 "dma_device_id": "system", 00:13:05.386 "dma_device_type": 1 00:13:05.386 }, 00:13:05.386 { 00:13:05.386 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:05.386 "dma_device_type": 2 00:13:05.386 } 00:13:05.386 ], 00:13:05.386 "driver_specific": { 00:13:05.386 "passthru": { 00:13:05.386 "name": "pt1", 00:13:05.386 "base_bdev_name": "malloc1" 00:13:05.386 } 00:13:05.386 } 00:13:05.386 }' 00:13:05.386 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:05.386 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:05.644 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:05.644 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:05.644 15:52:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:05.644 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:05.644 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:05.644 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:05.644 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:05.644 15:52:11 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:05.644 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:05.902 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:05.902 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:05.903 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:13:05.903 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:06.161 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:06.161 "name": "pt2", 00:13:06.161 "aliases": [ 00:13:06.161 "00000000-0000-0000-0000-000000000002" 00:13:06.161 ], 00:13:06.161 "product_name": "passthru", 00:13:06.161 "block_size": 512, 00:13:06.161 "num_blocks": 65536, 00:13:06.161 "uuid": "00000000-0000-0000-0000-000000000002", 00:13:06.161 "assigned_rate_limits": { 00:13:06.161 "rw_ios_per_sec": 0, 00:13:06.161 "rw_mbytes_per_sec": 0, 00:13:06.161 "r_mbytes_per_sec": 0, 00:13:06.161 "w_mbytes_per_sec": 0 00:13:06.161 }, 00:13:06.161 "claimed": true, 00:13:06.161 "claim_type": "exclusive_write", 00:13:06.161 "zoned": false, 00:13:06.161 "supported_io_types": { 00:13:06.161 "read": true, 00:13:06.161 "write": true, 00:13:06.161 "unmap": true, 00:13:06.161 "write_zeroes": true, 00:13:06.161 "flush": true, 00:13:06.161 "reset": true, 00:13:06.161 "compare": false, 00:13:06.161 "compare_and_write": false, 00:13:06.161 "abort": true, 00:13:06.161 "nvme_admin": false, 00:13:06.161 "nvme_io": false 00:13:06.161 }, 00:13:06.161 "memory_domains": [ 00:13:06.161 { 00:13:06.161 "dma_device_id": "system", 00:13:06.161 "dma_device_type": 1 00:13:06.161 }, 00:13:06.161 { 00:13:06.161 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:06.161 "dma_device_type": 2 
00:13:06.161 } 00:13:06.161 ], 00:13:06.161 "driver_specific": { 00:13:06.161 "passthru": { 00:13:06.161 "name": "pt2", 00:13:06.161 "base_bdev_name": "malloc2" 00:13:06.161 } 00:13:06.161 } 00:13:06.161 }' 00:13:06.161 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:06.161 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:06.161 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:06.161 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:06.161 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:06.161 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:06.161 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:06.420 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:06.420 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:06.420 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:06.420 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:06.420 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:06.420 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:06.420 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:13:06.420 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:06.678 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:06.678 "name": "pt3", 00:13:06.678 "aliases": [ 00:13:06.678 "00000000-0000-0000-0000-000000000003" 
00:13:06.678 ], 00:13:06.678 "product_name": "passthru", 00:13:06.678 "block_size": 512, 00:13:06.678 "num_blocks": 65536, 00:13:06.678 "uuid": "00000000-0000-0000-0000-000000000003", 00:13:06.678 "assigned_rate_limits": { 00:13:06.678 "rw_ios_per_sec": 0, 00:13:06.678 "rw_mbytes_per_sec": 0, 00:13:06.678 "r_mbytes_per_sec": 0, 00:13:06.678 "w_mbytes_per_sec": 0 00:13:06.678 }, 00:13:06.678 "claimed": true, 00:13:06.678 "claim_type": "exclusive_write", 00:13:06.678 "zoned": false, 00:13:06.678 "supported_io_types": { 00:13:06.678 "read": true, 00:13:06.678 "write": true, 00:13:06.678 "unmap": true, 00:13:06.678 "write_zeroes": true, 00:13:06.678 "flush": true, 00:13:06.678 "reset": true, 00:13:06.678 "compare": false, 00:13:06.679 "compare_and_write": false, 00:13:06.679 "abort": true, 00:13:06.679 "nvme_admin": false, 00:13:06.679 "nvme_io": false 00:13:06.679 }, 00:13:06.679 "memory_domains": [ 00:13:06.679 { 00:13:06.679 "dma_device_id": "system", 00:13:06.679 "dma_device_type": 1 00:13:06.679 }, 00:13:06.679 { 00:13:06.679 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:06.679 "dma_device_type": 2 00:13:06.679 } 00:13:06.679 ], 00:13:06.679 "driver_specific": { 00:13:06.679 "passthru": { 00:13:06.679 "name": "pt3", 00:13:06.679 "base_bdev_name": "malloc3" 00:13:06.679 } 00:13:06.679 } 00:13:06.679 }' 00:13:06.679 15:52:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:06.679 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:06.679 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:06.679 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:06.679 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:06.679 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:06.679 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq 
.md_interleave 00:13:06.937 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:06.937 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:06.937 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:06.937 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:06.937 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:06.937 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:13:06.937 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:13:07.197 [2024-06-10 15:52:12.589529] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:07.197 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 95eec627-628c-43ed-a5f1-fe7a51e5caa2 '!=' 95eec627-628c-43ed-a5f1-fe7a51e5caa2 ']' 00:13:07.197 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:13:07.197 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:07.197 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:07.197 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2667793 00:13:07.197 15:52:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 2667793 ']' 00:13:07.197 15:52:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 2667793 00:13:07.197 15:52:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:13:07.197 15:52:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:13:07.197 15:52:12 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2667793 00:13:07.197 15:52:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:13:07.197 15:52:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:13:07.197 15:52:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2667793' 00:13:07.197 killing process with pid 2667793 00:13:07.197 15:52:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 2667793 00:13:07.197 [2024-06-10 15:52:12.657951] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:07.197 [2024-06-10 15:52:12.658018] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:07.197 [2024-06-10 15:52:12.658075] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:07.197 [2024-06-10 15:52:12.658084] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x101f560 name raid_bdev1, state offline 00:13:07.197 15:52:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 2667793 00:13:07.197 [2024-06-10 15:52:12.682991] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:07.457 15:52:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:13:07.457 00:13:07.457 real 0m14.541s 00:13:07.457 user 0m26.761s 00:13:07.457 sys 0m2.107s 00:13:07.457 15:52:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:13:07.457 15:52:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:13:07.457 ************************************ 00:13:07.457 END TEST raid_superblock_test 00:13:07.457 ************************************ 00:13:07.457 15:52:12 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 3 read 00:13:07.457 15:52:12 bdev_raid 
-- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:13:07.457 15:52:12 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:13:07.457 15:52:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:07.457 ************************************ 00:13:07.457 START TEST raid_read_error_test 00:13:07.457 ************************************ 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 3 read 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # 
base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.HluC6Fd28X 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2670588 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2670588 /var/tmp/spdk-raid.sock 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 2670588 ']' 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:07.457 15:52:12 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@835 -- # local max_retries=100 00:13:07.458 15:52:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:07.458 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:07.458 15:52:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:13:07.458 15:52:12 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:07.716 [2024-06-10 15:52:13.019844] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:13:07.716 [2024-06-10 15:52:13.019899] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2670588 ] 00:13:07.716 [2024-06-10 15:52:13.115951] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:07.716 [2024-06-10 15:52:13.210494] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:13:07.975 [2024-06-10 15:52:13.268224] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:07.975 [2024-06-10 15:52:13.268260] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:08.543 15:52:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:13:08.544 15:52:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:13:08.544 15:52:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:08.544 15:52:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:08.802 BaseBdev1_malloc 00:13:08.802 15:52:14 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:09.060 true 00:13:09.060 15:52:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:09.319 [2024-06-10 15:52:14.726760] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:09.319 [2024-06-10 15:52:14.726800] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:09.319 [2024-06-10 15:52:14.726816] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ac4150 00:13:09.319 [2024-06-10 15:52:14.726826] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:09.319 [2024-06-10 15:52:14.728641] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:09.319 [2024-06-10 15:52:14.728670] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:09.319 BaseBdev1 00:13:09.319 15:52:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:09.319 15:52:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:09.578 BaseBdev2_malloc 00:13:09.578 15:52:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:09.837 true 00:13:09.837 15:52:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc 
-p BaseBdev2 00:13:10.095 [2024-06-10 15:52:15.493412] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:10.095 [2024-06-10 15:52:15.493450] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:10.095 [2024-06-10 15:52:15.493467] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ac8b50 00:13:10.095 [2024-06-10 15:52:15.493476] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:10.095 [2024-06-10 15:52:15.495057] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:10.095 [2024-06-10 15:52:15.495085] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:10.095 BaseBdev2 00:13:10.095 15:52:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:10.096 15:52:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:13:10.354 BaseBdev3_malloc 00:13:10.354 15:52:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:13:10.613 true 00:13:10.613 15:52:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:13:10.871 [2024-06-10 15:52:16.263887] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:13:10.871 [2024-06-10 15:52:16.263927] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:10.871 [2024-06-10 15:52:16.263943] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ac9780 00:13:10.872 [2024-06-10 
15:52:16.263952] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:10.872 [2024-06-10 15:52:16.265539] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:10.872 [2024-06-10 15:52:16.265566] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:13:10.872 BaseBdev3 00:13:10.872 15:52:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:13:11.130 [2024-06-10 15:52:16.508559] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:11.130 [2024-06-10 15:52:16.509859] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:11.130 [2024-06-10 15:52:16.509930] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:11.130 [2024-06-10 15:52:16.510143] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1acc5a0 00:13:11.130 [2024-06-10 15:52:16.510154] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:11.130 [2024-06-10 15:52:16.510345] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ac5b20 00:13:11.130 [2024-06-10 15:52:16.510498] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1acc5a0 00:13:11.131 [2024-06-10 15:52:16.510507] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1acc5a0 00:13:11.131 [2024-06-10 15:52:16.510610] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:11.131 15:52:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:11.131 15:52:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:13:11.131 15:52:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:11.131 15:52:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:11.131 15:52:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:11.131 15:52:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:11.131 15:52:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:11.131 15:52:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:11.131 15:52:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:11.131 15:52:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:11.131 15:52:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:11.131 15:52:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:11.390 15:52:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:11.390 "name": "raid_bdev1", 00:13:11.390 "uuid": "372c11f3-a41f-4960-b34a-bab2c2d2cc0a", 00:13:11.390 "strip_size_kb": 64, 00:13:11.390 "state": "online", 00:13:11.390 "raid_level": "raid0", 00:13:11.390 "superblock": true, 00:13:11.390 "num_base_bdevs": 3, 00:13:11.390 "num_base_bdevs_discovered": 3, 00:13:11.390 "num_base_bdevs_operational": 3, 00:13:11.390 "base_bdevs_list": [ 00:13:11.390 { 00:13:11.390 "name": "BaseBdev1", 00:13:11.390 "uuid": "814f1209-ed1a-5325-b251-c89af31f475f", 00:13:11.390 "is_configured": true, 00:13:11.390 "data_offset": 2048, 00:13:11.390 "data_size": 63488 00:13:11.390 }, 00:13:11.390 { 00:13:11.390 "name": "BaseBdev2", 00:13:11.390 "uuid": 
"0c55888e-6106-5fad-ae6e-5166468f8504", 00:13:11.390 "is_configured": true, 00:13:11.390 "data_offset": 2048, 00:13:11.390 "data_size": 63488 00:13:11.390 }, 00:13:11.390 { 00:13:11.390 "name": "BaseBdev3", 00:13:11.390 "uuid": "aa3c6180-a07b-5d02-b75c-536b64cbf556", 00:13:11.390 "is_configured": true, 00:13:11.390 "data_offset": 2048, 00:13:11.390 "data_size": 63488 00:13:11.390 } 00:13:11.390 ] 00:13:11.390 }' 00:13:11.390 15:52:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:11.390 15:52:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:11.958 15:52:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:13:11.958 15:52:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:11.958 [2024-06-10 15:52:17.447314] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1621d50 00:13:12.894 15:52:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:13:13.186 15:52:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:13.186 15:52:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:13:13.186 15:52:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:13:13.186 15:52:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:13.186 15:52:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:13.186 15:52:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:13.186 15:52:18 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:13.186 15:52:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:13.186 15:52:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:13.186 15:52:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:13.186 15:52:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:13.186 15:52:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:13.186 15:52:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:13.186 15:52:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:13.186 15:52:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:13.445 15:52:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:13.445 "name": "raid_bdev1", 00:13:13.445 "uuid": "372c11f3-a41f-4960-b34a-bab2c2d2cc0a", 00:13:13.445 "strip_size_kb": 64, 00:13:13.445 "state": "online", 00:13:13.445 "raid_level": "raid0", 00:13:13.445 "superblock": true, 00:13:13.445 "num_base_bdevs": 3, 00:13:13.445 "num_base_bdevs_discovered": 3, 00:13:13.445 "num_base_bdevs_operational": 3, 00:13:13.445 "base_bdevs_list": [ 00:13:13.445 { 00:13:13.445 "name": "BaseBdev1", 00:13:13.445 "uuid": "814f1209-ed1a-5325-b251-c89af31f475f", 00:13:13.445 "is_configured": true, 00:13:13.445 "data_offset": 2048, 00:13:13.445 "data_size": 63488 00:13:13.445 }, 00:13:13.445 { 00:13:13.445 "name": "BaseBdev2", 00:13:13.445 "uuid": "0c55888e-6106-5fad-ae6e-5166468f8504", 00:13:13.445 "is_configured": true, 00:13:13.445 "data_offset": 2048, 00:13:13.445 "data_size": 63488 00:13:13.445 }, 
00:13:13.445 { 00:13:13.445 "name": "BaseBdev3", 00:13:13.445 "uuid": "aa3c6180-a07b-5d02-b75c-536b64cbf556", 00:13:13.445 "is_configured": true, 00:13:13.445 "data_offset": 2048, 00:13:13.445 "data_size": 63488 00:13:13.445 } 00:13:13.445 ] 00:13:13.445 }' 00:13:13.445 15:52:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:13.445 15:52:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:14.011 15:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:13:14.270 [2024-06-10 15:52:19.729873] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:14.270 [2024-06-10 15:52:19.729901] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:14.270 [2024-06-10 15:52:19.733308] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:14.270 [2024-06-10 15:52:19.733343] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:14.270 [2024-06-10 15:52:19.733378] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:14.270 [2024-06-10 15:52:19.733392] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1acc5a0 name raid_bdev1, state offline 00:13:14.270 0 00:13:14.270 15:52:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2670588 00:13:14.270 15:52:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 2670588 ']' 00:13:14.270 15:52:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 2670588 00:13:14.270 15:52:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:13:14.270 15:52:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:13:14.270 
15:52:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2670588 00:13:14.529 15:52:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:13:14.529 15:52:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:13:14.529 15:52:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2670588' 00:13:14.529 killing process with pid 2670588 00:13:14.529 15:52:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 2670588 00:13:14.529 [2024-06-10 15:52:19.799768] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:14.529 15:52:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 2670588 00:13:14.529 [2024-06-10 15:52:19.819310] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:14.529 15:52:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.HluC6Fd28X 00:13:14.529 15:52:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:14.529 15:52:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:14.529 15:52:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.44 00:13:14.529 15:52:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:13:14.529 15:52:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:14.529 15:52:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:14.529 15:52:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.44 != \0\.\0\0 ]] 00:13:14.529 00:13:14.529 real 0m7.086s 00:13:14.529 user 0m11.516s 00:13:14.529 sys 0m1.011s 00:13:14.529 15:52:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:13:14.529 15:52:20 
bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:14.529 ************************************ 00:13:14.529 END TEST raid_read_error_test 00:13:14.529 ************************************ 00:13:14.789 15:52:20 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 3 write 00:13:14.789 15:52:20 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:13:14.789 15:52:20 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:13:14.789 15:52:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:14.789 ************************************ 00:13:14.789 START TEST raid_write_error_test 00:13:14.789 ************************************ 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 3 write 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs 
)) 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.08GNZiZsaJ 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2671827 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2671827 /var/tmp/spdk-raid.sock 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r 
/var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 2671827 ']' 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:14.789 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:13:14.789 15:52:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:14.789 [2024-06-10 15:52:20.170503] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:13:14.789 [2024-06-10 15:52:20.170557] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2671827 ] 00:13:14.789 [2024-06-10 15:52:20.259118] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:15.048 [2024-06-10 15:52:20.352747] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:13:15.048 [2024-06-10 15:52:20.409565] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:15.048 [2024-06-10 15:52:20.409595] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:15.984 15:52:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:13:15.984 15:52:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:13:15.984 15:52:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:15.984 15:52:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:13:15.984 BaseBdev1_malloc 00:13:15.984 15:52:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:13:16.243 true 00:13:16.243 15:52:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:13:16.502 [2024-06-10 15:52:21.883289] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:13:16.502 [2024-06-10 15:52:21.883331] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:13:16.502 [2024-06-10 15:52:21.883351] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc78150 00:13:16.502 [2024-06-10 15:52:21.883361] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:16.502 [2024-06-10 15:52:21.885189] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:16.502 [2024-06-10 15:52:21.885218] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:13:16.502 BaseBdev1 00:13:16.502 15:52:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:16.502 15:52:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:13:16.761 BaseBdev2_malloc 00:13:16.761 15:52:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:13:17.019 true 00:13:17.019 15:52:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:13:17.278 [2024-06-10 15:52:22.641916] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:13:17.278 [2024-06-10 15:52:22.641962] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:17.278 [2024-06-10 15:52:22.641981] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc7cb50 00:13:17.278 [2024-06-10 15:52:22.641991] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:17.278 [2024-06-10 15:52:22.643542] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:17.278 [2024-06-10 15:52:22.643572] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:13:17.278 BaseBdev2 00:13:17.279 15:52:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:13:17.279 15:52:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:13:17.538 BaseBdev3_malloc 00:13:17.538 15:52:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:13:17.796 true 00:13:17.796 15:52:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:13:18.055 [2024-06-10 15:52:23.396470] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:13:18.055 [2024-06-10 15:52:23.396509] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:13:18.055 [2024-06-10 15:52:23.396528] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc7d780 00:13:18.055 [2024-06-10 15:52:23.396537] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:13:18.055 [2024-06-10 15:52:23.398117] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:13:18.055 [2024-06-10 15:52:23.398144] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:13:18.055 BaseBdev3 00:13:18.055 15:52:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:13:18.314 [2024-06-10 15:52:23.641148] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:18.314 [2024-06-10 15:52:23.642461] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:18.314 [2024-06-10 15:52:23.642533] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:18.314 [2024-06-10 15:52:23.642739] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc805a0 00:13:18.314 [2024-06-10 15:52:23.642749] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:18.314 [2024-06-10 15:52:23.642943] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc79b20 00:13:18.314 [2024-06-10 15:52:23.643110] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc805a0 00:13:18.314 [2024-06-10 15:52:23.643119] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc805a0 00:13:18.314 [2024-06-10 15:52:23.643222] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:18.314 15:52:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:18.314 15:52:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:18.314 15:52:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:18.314 15:52:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:18.314 15:52:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:18.314 15:52:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:18.314 15:52:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:18.314 15:52:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:13:18.314 15:52:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:18.314 15:52:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:18.314 15:52:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:18.314 15:52:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:18.573 15:52:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:18.573 "name": "raid_bdev1", 00:13:18.573 "uuid": "9d655071-a63f-4a80-b79f-9a25cf1060b3", 00:13:18.573 "strip_size_kb": 64, 00:13:18.573 "state": "online", 00:13:18.573 "raid_level": "raid0", 00:13:18.573 "superblock": true, 00:13:18.573 "num_base_bdevs": 3, 00:13:18.573 "num_base_bdevs_discovered": 3, 00:13:18.573 "num_base_bdevs_operational": 3, 00:13:18.573 "base_bdevs_list": [ 00:13:18.573 { 00:13:18.573 "name": "BaseBdev1", 00:13:18.573 "uuid": "912f73e6-8579-5f19-befe-fc1ffe835dbe", 00:13:18.573 "is_configured": true, 00:13:18.573 "data_offset": 2048, 00:13:18.573 "data_size": 63488 00:13:18.573 }, 00:13:18.573 { 00:13:18.573 "name": "BaseBdev2", 00:13:18.573 "uuid": "7158a01e-549a-5b27-8738-2fac4e502a09", 00:13:18.573 "is_configured": true, 00:13:18.573 "data_offset": 2048, 00:13:18.573 "data_size": 63488 00:13:18.573 }, 00:13:18.573 { 00:13:18.573 "name": "BaseBdev3", 00:13:18.573 "uuid": "08701d3c-c9b7-54ac-b395-b9318105f80d", 00:13:18.573 "is_configured": true, 00:13:18.573 "data_offset": 2048, 00:13:18.573 "data_size": 63488 00:13:18.573 } 00:13:18.573 ] 00:13:18.573 }' 00:13:18.573 15:52:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:18.573 15:52:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:19.141 15:52:24 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:13:19.141 15:52:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:13:19.141 [2024-06-10 15:52:24.543973] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x7d5d50 00:13:20.078 15:52:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:13:20.337 15:52:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:13:20.337 15:52:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:13:20.337 15:52:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:13:20.337 15:52:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 3 00:13:20.337 15:52:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:13:20.337 15:52:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:20.337 15:52:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:13:20.337 15:52:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:20.337 15:52:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:20.337 15:52:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:20.337 15:52:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:20.337 15:52:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:20.337 15:52:25 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:20.337 15:52:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:20.337 15:52:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:13:20.596 15:52:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:20.596 "name": "raid_bdev1", 00:13:20.596 "uuid": "9d655071-a63f-4a80-b79f-9a25cf1060b3", 00:13:20.596 "strip_size_kb": 64, 00:13:20.596 "state": "online", 00:13:20.596 "raid_level": "raid0", 00:13:20.596 "superblock": true, 00:13:20.596 "num_base_bdevs": 3, 00:13:20.596 "num_base_bdevs_discovered": 3, 00:13:20.597 "num_base_bdevs_operational": 3, 00:13:20.597 "base_bdevs_list": [ 00:13:20.597 { 00:13:20.597 "name": "BaseBdev1", 00:13:20.597 "uuid": "912f73e6-8579-5f19-befe-fc1ffe835dbe", 00:13:20.597 "is_configured": true, 00:13:20.597 "data_offset": 2048, 00:13:20.597 "data_size": 63488 00:13:20.597 }, 00:13:20.597 { 00:13:20.597 "name": "BaseBdev2", 00:13:20.597 "uuid": "7158a01e-549a-5b27-8738-2fac4e502a09", 00:13:20.597 "is_configured": true, 00:13:20.597 "data_offset": 2048, 00:13:20.597 "data_size": 63488 00:13:20.597 }, 00:13:20.597 { 00:13:20.597 "name": "BaseBdev3", 00:13:20.597 "uuid": "08701d3c-c9b7-54ac-b395-b9318105f80d", 00:13:20.597 "is_configured": true, 00:13:20.597 "data_offset": 2048, 00:13:20.597 "data_size": 63488 00:13:20.597 } 00:13:20.597 ] 00:13:20.597 }' 00:13:20.597 15:52:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:20.597 15:52:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:21.165 15:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete 
raid_bdev1 00:13:21.424 [2024-06-10 15:52:26.730417] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:13:21.424 [2024-06-10 15:52:26.730449] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:21.424 [2024-06-10 15:52:26.733852] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:21.425 [2024-06-10 15:52:26.733887] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:21.425 [2024-06-10 15:52:26.733924] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:21.425 [2024-06-10 15:52:26.733932] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc805a0 name raid_bdev1, state offline 00:13:21.425 0 00:13:21.425 15:52:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2671827 00:13:21.425 15:52:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 2671827 ']' 00:13:21.425 15:52:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 2671827 00:13:21.425 15:52:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:13:21.425 15:52:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:13:21.425 15:52:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2671827 00:13:21.425 15:52:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:13:21.425 15:52:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:13:21.425 15:52:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2671827' 00:13:21.425 killing process with pid 2671827 00:13:21.425 15:52:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 2671827 00:13:21.425 
[2024-06-10 15:52:26.792174] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:21.425 15:52:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 2671827 00:13:21.425 [2024-06-10 15:52:26.811126] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:21.684 15:52:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.08GNZiZsaJ 00:13:21.684 15:52:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:13:21.684 15:52:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:13:21.684 15:52:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:13:21.685 15:52:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:13:21.685 15:52:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:21.685 15:52:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:21.685 15:52:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:13:21.685 00:13:21.685 real 0m6.921s 00:13:21.685 user 0m11.232s 00:13:21.685 sys 0m0.950s 00:13:21.685 15:52:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:13:21.685 15:52:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:13:21.685 ************************************ 00:13:21.685 END TEST raid_write_error_test 00:13:21.685 ************************************ 00:13:21.685 15:52:27 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:13:21.685 15:52:27 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 3 false 00:13:21.685 15:52:27 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:13:21.685 15:52:27 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:13:21.685 15:52:27 
bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:21.685 ************************************ 00:13:21.685 START TEST raid_state_function_test 00:13:21.685 ************************************ 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 3 false 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2673060 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2673060' 00:13:21.685 Process raid pid: 2673060 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2673060 /var/tmp/spdk-raid.sock 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 2673060 ']' 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # 
local rpc_addr=/var/tmp/spdk-raid.sock 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:21.685 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:13:21.685 15:52:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:21.685 [2024-06-10 15:52:27.154635] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:13:21.685 [2024-06-10 15:52:27.154690] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:21.944 [2024-06-10 15:52:27.253980] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:21.944 [2024-06-10 15:52:27.348708] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:13:21.944 [2024-06-10 15:52:27.407580] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:21.944 [2024-06-10 15:52:27.407612] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:22.882 15:52:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:13:22.882 15:52:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:13:22.882 15:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:22.882 [2024-06-10 15:52:28.342705] 
bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:22.882 [2024-06-10 15:52:28.342744] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:22.882 [2024-06-10 15:52:28.342753] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:22.882 [2024-06-10 15:52:28.342762] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:22.882 [2024-06-10 15:52:28.342769] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:22.882 [2024-06-10 15:52:28.342778] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:22.882 15:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:22.882 15:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:22.882 15:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:22.882 15:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:22.882 15:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:22.882 15:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:22.882 15:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:22.882 15:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:22.882 15:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:22.882 15:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:22.882 15:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:22.882 15:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:23.141 15:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:23.141 "name": "Existed_Raid", 00:13:23.141 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:23.141 "strip_size_kb": 64, 00:13:23.141 "state": "configuring", 00:13:23.141 "raid_level": "concat", 00:13:23.141 "superblock": false, 00:13:23.141 "num_base_bdevs": 3, 00:13:23.141 "num_base_bdevs_discovered": 0, 00:13:23.141 "num_base_bdevs_operational": 3, 00:13:23.141 "base_bdevs_list": [ 00:13:23.141 { 00:13:23.141 "name": "BaseBdev1", 00:13:23.141 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:23.141 "is_configured": false, 00:13:23.141 "data_offset": 0, 00:13:23.141 "data_size": 0 00:13:23.141 }, 00:13:23.141 { 00:13:23.141 "name": "BaseBdev2", 00:13:23.141 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:23.141 "is_configured": false, 00:13:23.141 "data_offset": 0, 00:13:23.141 "data_size": 0 00:13:23.141 }, 00:13:23.141 { 00:13:23.141 "name": "BaseBdev3", 00:13:23.141 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:23.141 "is_configured": false, 00:13:23.141 "data_offset": 0, 00:13:23.141 "data_size": 0 00:13:23.141 } 00:13:23.141 ] 00:13:23.141 }' 00:13:23.141 15:52:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:23.141 15:52:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:23.709 15:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:23.969 [2024-06-10 15:52:29.437491] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:23.970 [2024-06-10 
15:52:29.437519] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x888120 name Existed_Raid, state configuring 00:13:23.970 15:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:24.229 [2024-06-10 15:52:29.698196] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:24.229 [2024-06-10 15:52:29.698223] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:24.229 [2024-06-10 15:52:29.698230] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:24.229 [2024-06-10 15:52:29.698239] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:24.229 [2024-06-10 15:52:29.698246] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:24.229 [2024-06-10 15:52:29.698254] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:24.229 15:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:24.489 [2024-06-10 15:52:29.964267] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:24.489 BaseBdev1 00:13:24.489 15:52:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:24.489 15:52:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:13:24.489 15:52:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:24.489 15:52:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:24.489 
15:52:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:24.489 15:52:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:24.489 15:52:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:24.748 15:52:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:25.007 [ 00:13:25.007 { 00:13:25.007 "name": "BaseBdev1", 00:13:25.007 "aliases": [ 00:13:25.007 "bb954b4e-9013-480d-b91e-be0065a80df9" 00:13:25.007 ], 00:13:25.007 "product_name": "Malloc disk", 00:13:25.007 "block_size": 512, 00:13:25.007 "num_blocks": 65536, 00:13:25.007 "uuid": "bb954b4e-9013-480d-b91e-be0065a80df9", 00:13:25.007 "assigned_rate_limits": { 00:13:25.007 "rw_ios_per_sec": 0, 00:13:25.007 "rw_mbytes_per_sec": 0, 00:13:25.007 "r_mbytes_per_sec": 0, 00:13:25.007 "w_mbytes_per_sec": 0 00:13:25.007 }, 00:13:25.007 "claimed": true, 00:13:25.007 "claim_type": "exclusive_write", 00:13:25.007 "zoned": false, 00:13:25.007 "supported_io_types": { 00:13:25.007 "read": true, 00:13:25.007 "write": true, 00:13:25.007 "unmap": true, 00:13:25.007 "write_zeroes": true, 00:13:25.007 "flush": true, 00:13:25.007 "reset": true, 00:13:25.007 "compare": false, 00:13:25.007 "compare_and_write": false, 00:13:25.007 "abort": true, 00:13:25.007 "nvme_admin": false, 00:13:25.007 "nvme_io": false 00:13:25.007 }, 00:13:25.007 "memory_domains": [ 00:13:25.007 { 00:13:25.007 "dma_device_id": "system", 00:13:25.007 "dma_device_type": 1 00:13:25.007 }, 00:13:25.007 { 00:13:25.007 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:25.007 "dma_device_type": 2 00:13:25.007 } 00:13:25.008 ], 00:13:25.008 "driver_specific": {} 00:13:25.008 } 00:13:25.008 ] 
00:13:25.008 15:52:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:25.008 15:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:25.008 15:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:25.008 15:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:25.008 15:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:25.008 15:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:25.008 15:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:25.008 15:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:25.008 15:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:25.008 15:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:25.008 15:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:25.008 15:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:25.008 15:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:25.267 15:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:25.267 "name": "Existed_Raid", 00:13:25.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:25.267 "strip_size_kb": 64, 00:13:25.267 "state": "configuring", 00:13:25.267 "raid_level": "concat", 00:13:25.267 "superblock": false, 00:13:25.267 "num_base_bdevs": 3, 00:13:25.267 
"num_base_bdevs_discovered": 1, 00:13:25.267 "num_base_bdevs_operational": 3, 00:13:25.267 "base_bdevs_list": [ 00:13:25.267 { 00:13:25.267 "name": "BaseBdev1", 00:13:25.267 "uuid": "bb954b4e-9013-480d-b91e-be0065a80df9", 00:13:25.267 "is_configured": true, 00:13:25.267 "data_offset": 0, 00:13:25.267 "data_size": 65536 00:13:25.267 }, 00:13:25.267 { 00:13:25.267 "name": "BaseBdev2", 00:13:25.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:25.267 "is_configured": false, 00:13:25.267 "data_offset": 0, 00:13:25.267 "data_size": 0 00:13:25.267 }, 00:13:25.267 { 00:13:25.267 "name": "BaseBdev3", 00:13:25.267 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:25.267 "is_configured": false, 00:13:25.267 "data_offset": 0, 00:13:25.267 "data_size": 0 00:13:25.267 } 00:13:25.267 ] 00:13:25.267 }' 00:13:25.267 15:52:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:25.267 15:52:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:25.836 15:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:26.096 [2024-06-10 15:52:31.556514] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:26.096 [2024-06-10 15:52:31.556550] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8879b0 name Existed_Raid, state configuring 00:13:26.096 15:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:26.356 [2024-06-10 15:52:31.809210] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:26.356 [2024-06-10 15:52:31.810726] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: 
BaseBdev2 00:13:26.356 [2024-06-10 15:52:31.810756] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:26.356 [2024-06-10 15:52:31.810764] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:26.356 [2024-06-10 15:52:31.810772] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:26.356 15:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:26.356 15:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:26.356 15:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:26.356 15:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:26.356 15:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:26.356 15:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:26.356 15:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:26.356 15:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:26.356 15:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:26.356 15:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:26.356 15:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:26.356 15:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:26.356 15:52:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:26.356 15:52:31 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:26.615 15:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:26.615 "name": "Existed_Raid", 00:13:26.615 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.615 "strip_size_kb": 64, 00:13:26.615 "state": "configuring", 00:13:26.615 "raid_level": "concat", 00:13:26.615 "superblock": false, 00:13:26.615 "num_base_bdevs": 3, 00:13:26.615 "num_base_bdevs_discovered": 1, 00:13:26.615 "num_base_bdevs_operational": 3, 00:13:26.615 "base_bdevs_list": [ 00:13:26.615 { 00:13:26.615 "name": "BaseBdev1", 00:13:26.615 "uuid": "bb954b4e-9013-480d-b91e-be0065a80df9", 00:13:26.615 "is_configured": true, 00:13:26.615 "data_offset": 0, 00:13:26.615 "data_size": 65536 00:13:26.615 }, 00:13:26.615 { 00:13:26.615 "name": "BaseBdev2", 00:13:26.615 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.615 "is_configured": false, 00:13:26.615 "data_offset": 0, 00:13:26.615 "data_size": 0 00:13:26.615 }, 00:13:26.615 { 00:13:26.615 "name": "BaseBdev3", 00:13:26.615 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:26.615 "is_configured": false, 00:13:26.615 "data_offset": 0, 00:13:26.615 "data_size": 0 00:13:26.615 } 00:13:26.615 ] 00:13:26.615 }' 00:13:26.615 15:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:26.615 15:52:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:27.550 15:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:27.550 [2024-06-10 15:52:32.975590] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:27.550 BaseBdev2 00:13:27.550 15:52:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 
00:13:27.550 15:52:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:13:27.550 15:52:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:27.550 15:52:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:27.550 15:52:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:27.550 15:52:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:27.550 15:52:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:27.848 15:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:28.127 [ 00:13:28.127 { 00:13:28.127 "name": "BaseBdev2", 00:13:28.127 "aliases": [ 00:13:28.127 "42a917d2-c417-47cf-84b8-46b08721046f" 00:13:28.127 ], 00:13:28.127 "product_name": "Malloc disk", 00:13:28.127 "block_size": 512, 00:13:28.127 "num_blocks": 65536, 00:13:28.127 "uuid": "42a917d2-c417-47cf-84b8-46b08721046f", 00:13:28.127 "assigned_rate_limits": { 00:13:28.127 "rw_ios_per_sec": 0, 00:13:28.127 "rw_mbytes_per_sec": 0, 00:13:28.127 "r_mbytes_per_sec": 0, 00:13:28.127 "w_mbytes_per_sec": 0 00:13:28.127 }, 00:13:28.127 "claimed": true, 00:13:28.127 "claim_type": "exclusive_write", 00:13:28.127 "zoned": false, 00:13:28.127 "supported_io_types": { 00:13:28.127 "read": true, 00:13:28.127 "write": true, 00:13:28.127 "unmap": true, 00:13:28.127 "write_zeroes": true, 00:13:28.127 "flush": true, 00:13:28.127 "reset": true, 00:13:28.127 "compare": false, 00:13:28.127 "compare_and_write": false, 00:13:28.127 "abort": true, 00:13:28.127 "nvme_admin": false, 00:13:28.127 "nvme_io": false 
00:13:28.127 }, 00:13:28.127 "memory_domains": [ 00:13:28.127 { 00:13:28.127 "dma_device_id": "system", 00:13:28.127 "dma_device_type": 1 00:13:28.127 }, 00:13:28.127 { 00:13:28.127 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:28.127 "dma_device_type": 2 00:13:28.127 } 00:13:28.127 ], 00:13:28.127 "driver_specific": {} 00:13:28.127 } 00:13:28.127 ] 00:13:28.127 15:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:28.127 15:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:28.127 15:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:28.127 15:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:28.127 15:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:28.127 15:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:28.127 15:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:28.128 15:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:28.128 15:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:28.128 15:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:28.128 15:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:28.128 15:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:28.128 15:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:28.128 15:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:28.128 15:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:28.386 15:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:28.386 "name": "Existed_Raid", 00:13:28.386 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.386 "strip_size_kb": 64, 00:13:28.386 "state": "configuring", 00:13:28.386 "raid_level": "concat", 00:13:28.386 "superblock": false, 00:13:28.386 "num_base_bdevs": 3, 00:13:28.386 "num_base_bdevs_discovered": 2, 00:13:28.386 "num_base_bdevs_operational": 3, 00:13:28.386 "base_bdevs_list": [ 00:13:28.386 { 00:13:28.386 "name": "BaseBdev1", 00:13:28.386 "uuid": "bb954b4e-9013-480d-b91e-be0065a80df9", 00:13:28.386 "is_configured": true, 00:13:28.386 "data_offset": 0, 00:13:28.386 "data_size": 65536 00:13:28.386 }, 00:13:28.386 { 00:13:28.386 "name": "BaseBdev2", 00:13:28.386 "uuid": "42a917d2-c417-47cf-84b8-46b08721046f", 00:13:28.386 "is_configured": true, 00:13:28.386 "data_offset": 0, 00:13:28.386 "data_size": 65536 00:13:28.386 }, 00:13:28.386 { 00:13:28.386 "name": "BaseBdev3", 00:13:28.387 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:28.387 "is_configured": false, 00:13:28.387 "data_offset": 0, 00:13:28.387 "data_size": 0 00:13:28.387 } 00:13:28.387 ] 00:13:28.387 }' 00:13:28.387 15:52:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:28.387 15:52:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:28.953 15:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:29.211 [2024-06-10 15:52:34.543164] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:29.211 [2024-06-10 15:52:34.543197] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x8888c0 00:13:29.211 [2024-06-10 15:52:34.543203] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:29.212 [2024-06-10 15:52:34.543403] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x89f830 00:13:29.212 [2024-06-10 15:52:34.543526] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8888c0 00:13:29.212 [2024-06-10 15:52:34.543534] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x8888c0 00:13:29.212 [2024-06-10 15:52:34.543695] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:29.212 BaseBdev3 00:13:29.212 15:52:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:29.212 15:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:13:29.212 15:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:29.212 15:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:29.212 15:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:29.212 15:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:29.212 15:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:29.470 15:52:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:29.729 [ 00:13:29.729 { 00:13:29.729 "name": "BaseBdev3", 00:13:29.729 "aliases": [ 00:13:29.729 "055963c2-4cf0-4dab-87b1-413ee00e3d44" 00:13:29.729 ], 
00:13:29.729 "product_name": "Malloc disk", 00:13:29.729 "block_size": 512, 00:13:29.729 "num_blocks": 65536, 00:13:29.729 "uuid": "055963c2-4cf0-4dab-87b1-413ee00e3d44", 00:13:29.729 "assigned_rate_limits": { 00:13:29.729 "rw_ios_per_sec": 0, 00:13:29.729 "rw_mbytes_per_sec": 0, 00:13:29.729 "r_mbytes_per_sec": 0, 00:13:29.729 "w_mbytes_per_sec": 0 00:13:29.729 }, 00:13:29.729 "claimed": true, 00:13:29.729 "claim_type": "exclusive_write", 00:13:29.729 "zoned": false, 00:13:29.729 "supported_io_types": { 00:13:29.729 "read": true, 00:13:29.729 "write": true, 00:13:29.729 "unmap": true, 00:13:29.729 "write_zeroes": true, 00:13:29.729 "flush": true, 00:13:29.729 "reset": true, 00:13:29.729 "compare": false, 00:13:29.729 "compare_and_write": false, 00:13:29.729 "abort": true, 00:13:29.729 "nvme_admin": false, 00:13:29.729 "nvme_io": false 00:13:29.729 }, 00:13:29.729 "memory_domains": [ 00:13:29.729 { 00:13:29.729 "dma_device_id": "system", 00:13:29.729 "dma_device_type": 1 00:13:29.729 }, 00:13:29.729 { 00:13:29.729 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:29.729 "dma_device_type": 2 00:13:29.729 } 00:13:29.729 ], 00:13:29.729 "driver_specific": {} 00:13:29.729 } 00:13:29.729 ] 00:13:29.729 15:52:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:29.729 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:29.729 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:29.729 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:13:29.729 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:29.729 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:29.729 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:13:29.729 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:29.729 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:29.729 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:29.729 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:29.729 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:29.729 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:29.729 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:29.729 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:29.988 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:29.988 "name": "Existed_Raid", 00:13:29.988 "uuid": "c6203387-25ee-4fc6-9690-cdf893a8718f", 00:13:29.988 "strip_size_kb": 64, 00:13:29.988 "state": "online", 00:13:29.988 "raid_level": "concat", 00:13:29.988 "superblock": false, 00:13:29.988 "num_base_bdevs": 3, 00:13:29.988 "num_base_bdevs_discovered": 3, 00:13:29.989 "num_base_bdevs_operational": 3, 00:13:29.989 "base_bdevs_list": [ 00:13:29.989 { 00:13:29.989 "name": "BaseBdev1", 00:13:29.989 "uuid": "bb954b4e-9013-480d-b91e-be0065a80df9", 00:13:29.989 "is_configured": true, 00:13:29.989 "data_offset": 0, 00:13:29.989 "data_size": 65536 00:13:29.989 }, 00:13:29.989 { 00:13:29.989 "name": "BaseBdev2", 00:13:29.989 "uuid": "42a917d2-c417-47cf-84b8-46b08721046f", 00:13:29.989 "is_configured": true, 00:13:29.989 "data_offset": 0, 00:13:29.989 "data_size": 65536 00:13:29.989 }, 00:13:29.989 { 00:13:29.989 "name": 
"BaseBdev3", 00:13:29.989 "uuid": "055963c2-4cf0-4dab-87b1-413ee00e3d44", 00:13:29.989 "is_configured": true, 00:13:29.989 "data_offset": 0, 00:13:29.989 "data_size": 65536 00:13:29.989 } 00:13:29.989 ] 00:13:29.989 }' 00:13:29.989 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:29.989 15:52:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:30.557 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:30.557 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:30.557 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:30.557 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:30.557 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:30.557 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:30.557 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:30.557 15:52:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:30.816 [2024-06-10 15:52:36.183841] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:30.816 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:30.816 "name": "Existed_Raid", 00:13:30.816 "aliases": [ 00:13:30.816 "c6203387-25ee-4fc6-9690-cdf893a8718f" 00:13:30.816 ], 00:13:30.816 "product_name": "Raid Volume", 00:13:30.816 "block_size": 512, 00:13:30.816 "num_blocks": 196608, 00:13:30.816 "uuid": "c6203387-25ee-4fc6-9690-cdf893a8718f", 00:13:30.816 "assigned_rate_limits": { 00:13:30.816 "rw_ios_per_sec": 0, 
00:13:30.816 "rw_mbytes_per_sec": 0, 00:13:30.816 "r_mbytes_per_sec": 0, 00:13:30.816 "w_mbytes_per_sec": 0 00:13:30.816 }, 00:13:30.816 "claimed": false, 00:13:30.816 "zoned": false, 00:13:30.816 "supported_io_types": { 00:13:30.816 "read": true, 00:13:30.816 "write": true, 00:13:30.816 "unmap": true, 00:13:30.816 "write_zeroes": true, 00:13:30.816 "flush": true, 00:13:30.816 "reset": true, 00:13:30.816 "compare": false, 00:13:30.816 "compare_and_write": false, 00:13:30.816 "abort": false, 00:13:30.816 "nvme_admin": false, 00:13:30.816 "nvme_io": false 00:13:30.816 }, 00:13:30.816 "memory_domains": [ 00:13:30.816 { 00:13:30.816 "dma_device_id": "system", 00:13:30.816 "dma_device_type": 1 00:13:30.816 }, 00:13:30.816 { 00:13:30.816 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.816 "dma_device_type": 2 00:13:30.816 }, 00:13:30.816 { 00:13:30.816 "dma_device_id": "system", 00:13:30.816 "dma_device_type": 1 00:13:30.816 }, 00:13:30.816 { 00:13:30.816 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.816 "dma_device_type": 2 00:13:30.816 }, 00:13:30.816 { 00:13:30.816 "dma_device_id": "system", 00:13:30.816 "dma_device_type": 1 00:13:30.816 }, 00:13:30.816 { 00:13:30.816 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:30.817 "dma_device_type": 2 00:13:30.817 } 00:13:30.817 ], 00:13:30.817 "driver_specific": { 00:13:30.817 "raid": { 00:13:30.817 "uuid": "c6203387-25ee-4fc6-9690-cdf893a8718f", 00:13:30.817 "strip_size_kb": 64, 00:13:30.817 "state": "online", 00:13:30.817 "raid_level": "concat", 00:13:30.817 "superblock": false, 00:13:30.817 "num_base_bdevs": 3, 00:13:30.817 "num_base_bdevs_discovered": 3, 00:13:30.817 "num_base_bdevs_operational": 3, 00:13:30.817 "base_bdevs_list": [ 00:13:30.817 { 00:13:30.817 "name": "BaseBdev1", 00:13:30.817 "uuid": "bb954b4e-9013-480d-b91e-be0065a80df9", 00:13:30.817 "is_configured": true, 00:13:30.817 "data_offset": 0, 00:13:30.817 "data_size": 65536 00:13:30.817 }, 00:13:30.817 { 00:13:30.817 "name": "BaseBdev2", 00:13:30.817 
"uuid": "42a917d2-c417-47cf-84b8-46b08721046f", 00:13:30.817 "is_configured": true, 00:13:30.817 "data_offset": 0, 00:13:30.817 "data_size": 65536 00:13:30.817 }, 00:13:30.817 { 00:13:30.817 "name": "BaseBdev3", 00:13:30.817 "uuid": "055963c2-4cf0-4dab-87b1-413ee00e3d44", 00:13:30.817 "is_configured": true, 00:13:30.817 "data_offset": 0, 00:13:30.817 "data_size": 65536 00:13:30.817 } 00:13:30.817 ] 00:13:30.817 } 00:13:30.817 } 00:13:30.817 }' 00:13:30.817 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:30.817 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:30.817 BaseBdev2 00:13:30.817 BaseBdev3' 00:13:30.817 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:30.817 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:30.817 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:31.077 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:31.077 "name": "BaseBdev1", 00:13:31.077 "aliases": [ 00:13:31.077 "bb954b4e-9013-480d-b91e-be0065a80df9" 00:13:31.077 ], 00:13:31.077 "product_name": "Malloc disk", 00:13:31.077 "block_size": 512, 00:13:31.077 "num_blocks": 65536, 00:13:31.077 "uuid": "bb954b4e-9013-480d-b91e-be0065a80df9", 00:13:31.077 "assigned_rate_limits": { 00:13:31.077 "rw_ios_per_sec": 0, 00:13:31.077 "rw_mbytes_per_sec": 0, 00:13:31.077 "r_mbytes_per_sec": 0, 00:13:31.077 "w_mbytes_per_sec": 0 00:13:31.077 }, 00:13:31.077 "claimed": true, 00:13:31.077 "claim_type": "exclusive_write", 00:13:31.077 "zoned": false, 00:13:31.077 "supported_io_types": { 00:13:31.077 "read": true, 00:13:31.077 "write": true, 
00:13:31.077 "unmap": true, 00:13:31.077 "write_zeroes": true, 00:13:31.077 "flush": true, 00:13:31.077 "reset": true, 00:13:31.077 "compare": false, 00:13:31.077 "compare_and_write": false, 00:13:31.077 "abort": true, 00:13:31.077 "nvme_admin": false, 00:13:31.077 "nvme_io": false 00:13:31.077 }, 00:13:31.077 "memory_domains": [ 00:13:31.077 { 00:13:31.077 "dma_device_id": "system", 00:13:31.077 "dma_device_type": 1 00:13:31.077 }, 00:13:31.077 { 00:13:31.077 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:31.077 "dma_device_type": 2 00:13:31.077 } 00:13:31.077 ], 00:13:31.077 "driver_specific": {} 00:13:31.077 }' 00:13:31.077 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:31.077 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:31.337 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:31.337 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:31.337 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:31.337 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:31.337 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:31.337 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:31.337 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:31.337 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:31.337 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:31.596 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:31.596 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:31.596 15:52:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:31.596 15:52:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:31.855 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:31.855 "name": "BaseBdev2", 00:13:31.855 "aliases": [ 00:13:31.855 "42a917d2-c417-47cf-84b8-46b08721046f" 00:13:31.855 ], 00:13:31.855 "product_name": "Malloc disk", 00:13:31.855 "block_size": 512, 00:13:31.855 "num_blocks": 65536, 00:13:31.855 "uuid": "42a917d2-c417-47cf-84b8-46b08721046f", 00:13:31.855 "assigned_rate_limits": { 00:13:31.855 "rw_ios_per_sec": 0, 00:13:31.855 "rw_mbytes_per_sec": 0, 00:13:31.855 "r_mbytes_per_sec": 0, 00:13:31.855 "w_mbytes_per_sec": 0 00:13:31.855 }, 00:13:31.855 "claimed": true, 00:13:31.855 "claim_type": "exclusive_write", 00:13:31.855 "zoned": false, 00:13:31.855 "supported_io_types": { 00:13:31.855 "read": true, 00:13:31.855 "write": true, 00:13:31.855 "unmap": true, 00:13:31.855 "write_zeroes": true, 00:13:31.855 "flush": true, 00:13:31.855 "reset": true, 00:13:31.855 "compare": false, 00:13:31.855 "compare_and_write": false, 00:13:31.855 "abort": true, 00:13:31.855 "nvme_admin": false, 00:13:31.855 "nvme_io": false 00:13:31.855 }, 00:13:31.855 "memory_domains": [ 00:13:31.855 { 00:13:31.855 "dma_device_id": "system", 00:13:31.855 "dma_device_type": 1 00:13:31.855 }, 00:13:31.855 { 00:13:31.855 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:31.855 "dma_device_type": 2 00:13:31.855 } 00:13:31.855 ], 00:13:31.855 "driver_specific": {} 00:13:31.855 }' 00:13:31.855 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:31.855 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:31.855 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 
== 512 ]] 00:13:31.855 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:31.855 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:31.855 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:31.855 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:31.855 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.115 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:32.115 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.115 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.115 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:32.115 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:32.115 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:32.115 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:32.374 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:32.374 "name": "BaseBdev3", 00:13:32.374 "aliases": [ 00:13:32.374 "055963c2-4cf0-4dab-87b1-413ee00e3d44" 00:13:32.374 ], 00:13:32.374 "product_name": "Malloc disk", 00:13:32.374 "block_size": 512, 00:13:32.374 "num_blocks": 65536, 00:13:32.375 "uuid": "055963c2-4cf0-4dab-87b1-413ee00e3d44", 00:13:32.375 "assigned_rate_limits": { 00:13:32.375 "rw_ios_per_sec": 0, 00:13:32.375 "rw_mbytes_per_sec": 0, 00:13:32.375 "r_mbytes_per_sec": 0, 00:13:32.375 "w_mbytes_per_sec": 0 00:13:32.375 }, 00:13:32.375 "claimed": true, 00:13:32.375 
"claim_type": "exclusive_write", 00:13:32.375 "zoned": false, 00:13:32.375 "supported_io_types": { 00:13:32.375 "read": true, 00:13:32.375 "write": true, 00:13:32.375 "unmap": true, 00:13:32.375 "write_zeroes": true, 00:13:32.375 "flush": true, 00:13:32.375 "reset": true, 00:13:32.375 "compare": false, 00:13:32.375 "compare_and_write": false, 00:13:32.375 "abort": true, 00:13:32.375 "nvme_admin": false, 00:13:32.375 "nvme_io": false 00:13:32.375 }, 00:13:32.375 "memory_domains": [ 00:13:32.375 { 00:13:32.375 "dma_device_id": "system", 00:13:32.375 "dma_device_type": 1 00:13:32.375 }, 00:13:32.375 { 00:13:32.375 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:32.375 "dma_device_type": 2 00:13:32.375 } 00:13:32.375 ], 00:13:32.375 "driver_specific": {} 00:13:32.375 }' 00:13:32.375 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.375 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:32.375 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:32.375 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:32.375 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:32.634 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:32.634 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.634 15:52:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:32.634 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:32.634 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.634 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:32.634 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null 
== null ]] 00:13:32.634 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:32.894 [2024-06-10 15:52:38.333399] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:32.894 [2024-06-10 15:52:38.333422] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:32.894 [2024-06-10 15:52:38.333462] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:32.894 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:13:32.894 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:13:32.894 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:13:32.894 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:13:32.894 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:13:32.894 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:13:32.894 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:32.894 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:13:32.894 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:32.894 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:32.894 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:13:32.894 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:32.894 15:52:38 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:32.894 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:32.894 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:32.894 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:32.894 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:33.154 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:33.154 "name": "Existed_Raid", 00:13:33.154 "uuid": "c6203387-25ee-4fc6-9690-cdf893a8718f", 00:13:33.154 "strip_size_kb": 64, 00:13:33.154 "state": "offline", 00:13:33.154 "raid_level": "concat", 00:13:33.154 "superblock": false, 00:13:33.154 "num_base_bdevs": 3, 00:13:33.154 "num_base_bdevs_discovered": 2, 00:13:33.154 "num_base_bdevs_operational": 2, 00:13:33.154 "base_bdevs_list": [ 00:13:33.154 { 00:13:33.154 "name": null, 00:13:33.154 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:33.154 "is_configured": false, 00:13:33.154 "data_offset": 0, 00:13:33.154 "data_size": 65536 00:13:33.154 }, 00:13:33.154 { 00:13:33.154 "name": "BaseBdev2", 00:13:33.154 "uuid": "42a917d2-c417-47cf-84b8-46b08721046f", 00:13:33.154 "is_configured": true, 00:13:33.154 "data_offset": 0, 00:13:33.154 "data_size": 65536 00:13:33.154 }, 00:13:33.154 { 00:13:33.154 "name": "BaseBdev3", 00:13:33.154 "uuid": "055963c2-4cf0-4dab-87b1-413ee00e3d44", 00:13:33.154 "is_configured": true, 00:13:33.154 "data_offset": 0, 00:13:33.154 "data_size": 65536 00:13:33.154 } 00:13:33.154 ] 00:13:33.154 }' 00:13:33.154 15:52:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:33.154 15:52:38 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 
00:13:34.092 15:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:13:34.092 15:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:34.092 15:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.092 15:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:34.092 15:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:34.092 15:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:34.092 15:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:13:34.351 [2024-06-10 15:52:39.642074] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:34.351 15:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:34.351 15:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:34.351 15:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.351 15:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:13:34.610 15:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:13:34.610 15:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:13:34.610 15:52:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev3 00:13:34.870 [2024-06-10 15:52:40.165998] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:34.870 [2024-06-10 15:52:40.166040] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8888c0 name Existed_Raid, state offline 00:13:34.870 15:52:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:13:34.870 15:52:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:13:34.870 15:52:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:34.870 15:52:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:13:35.129 15:52:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:13:35.129 15:52:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:13:35.129 15:52:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:13:35.129 15:52:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:13:35.129 15:52:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:35.129 15:52:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:35.390 BaseBdev2 00:13:35.390 15:52:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:13:35.390 15:52:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:13:35.390 15:52:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:35.390 15:52:40 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@900 -- # local i 00:13:35.390 15:52:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:35.390 15:52:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:35.390 15:52:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:35.650 15:52:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:35.909 [ 00:13:35.909 { 00:13:35.909 "name": "BaseBdev2", 00:13:35.909 "aliases": [ 00:13:35.909 "e216ad72-123e-4698-b97d-a7b404d8ec89" 00:13:35.909 ], 00:13:35.909 "product_name": "Malloc disk", 00:13:35.909 "block_size": 512, 00:13:35.909 "num_blocks": 65536, 00:13:35.909 "uuid": "e216ad72-123e-4698-b97d-a7b404d8ec89", 00:13:35.909 "assigned_rate_limits": { 00:13:35.909 "rw_ios_per_sec": 0, 00:13:35.909 "rw_mbytes_per_sec": 0, 00:13:35.909 "r_mbytes_per_sec": 0, 00:13:35.909 "w_mbytes_per_sec": 0 00:13:35.909 }, 00:13:35.909 "claimed": false, 00:13:35.909 "zoned": false, 00:13:35.909 "supported_io_types": { 00:13:35.909 "read": true, 00:13:35.909 "write": true, 00:13:35.909 "unmap": true, 00:13:35.909 "write_zeroes": true, 00:13:35.909 "flush": true, 00:13:35.909 "reset": true, 00:13:35.909 "compare": false, 00:13:35.909 "compare_and_write": false, 00:13:35.909 "abort": true, 00:13:35.909 "nvme_admin": false, 00:13:35.909 "nvme_io": false 00:13:35.909 }, 00:13:35.909 "memory_domains": [ 00:13:35.909 { 00:13:35.909 "dma_device_id": "system", 00:13:35.909 "dma_device_type": 1 00:13:35.909 }, 00:13:35.909 { 00:13:35.909 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:35.909 "dma_device_type": 2 00:13:35.909 } 00:13:35.909 ], 00:13:35.909 "driver_specific": {} 00:13:35.909 } 
00:13:35.909 ] 00:13:35.909 15:52:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:35.909 15:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:35.909 15:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:35.909 15:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:36.168 BaseBdev3 00:13:36.168 15:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:13:36.168 15:52:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:13:36.168 15:52:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:36.168 15:52:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:36.168 15:52:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:36.168 15:52:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:36.168 15:52:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:36.428 15:52:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:36.687 [ 00:13:36.687 { 00:13:36.687 "name": "BaseBdev3", 00:13:36.687 "aliases": [ 00:13:36.687 "7e652c46-8c12-4e07-afa3-3b016bc3b62f" 00:13:36.687 ], 00:13:36.687 "product_name": "Malloc disk", 00:13:36.687 "block_size": 512, 00:13:36.687 "num_blocks": 65536, 00:13:36.687 "uuid": "7e652c46-8c12-4e07-afa3-3b016bc3b62f", 
00:13:36.687 "assigned_rate_limits": { 00:13:36.687 "rw_ios_per_sec": 0, 00:13:36.687 "rw_mbytes_per_sec": 0, 00:13:36.687 "r_mbytes_per_sec": 0, 00:13:36.687 "w_mbytes_per_sec": 0 00:13:36.687 }, 00:13:36.687 "claimed": false, 00:13:36.687 "zoned": false, 00:13:36.687 "supported_io_types": { 00:13:36.687 "read": true, 00:13:36.687 "write": true, 00:13:36.687 "unmap": true, 00:13:36.687 "write_zeroes": true, 00:13:36.687 "flush": true, 00:13:36.687 "reset": true, 00:13:36.687 "compare": false, 00:13:36.687 "compare_and_write": false, 00:13:36.687 "abort": true, 00:13:36.687 "nvme_admin": false, 00:13:36.687 "nvme_io": false 00:13:36.687 }, 00:13:36.687 "memory_domains": [ 00:13:36.687 { 00:13:36.687 "dma_device_id": "system", 00:13:36.687 "dma_device_type": 1 00:13:36.687 }, 00:13:36.687 { 00:13:36.687 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:36.687 "dma_device_type": 2 00:13:36.687 } 00:13:36.687 ], 00:13:36.687 "driver_specific": {} 00:13:36.687 } 00:13:36.687 ] 00:13:36.687 15:52:41 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:36.687 15:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:13:36.687 15:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:13:36.687 15:52:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:36.687 [2024-06-10 15:52:42.190355] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:36.687 [2024-06-10 15:52:42.190389] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:36.687 [2024-06-10 15:52:42.190405] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:36.687 [2024-06-10 15:52:42.191784] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:36.948 15:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:36.948 15:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:36.948 15:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:36.948 15:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:36.948 15:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:36.948 15:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:36.948 15:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:36.948 15:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:36.948 15:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:36.948 15:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:36.948 15:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:36.948 15:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:37.208 15:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:37.208 "name": "Existed_Raid", 00:13:37.208 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:37.208 "strip_size_kb": 64, 00:13:37.208 "state": "configuring", 00:13:37.208 "raid_level": "concat", 00:13:37.208 "superblock": false, 00:13:37.208 "num_base_bdevs": 3, 00:13:37.208 "num_base_bdevs_discovered": 
2, 00:13:37.208 "num_base_bdevs_operational": 3, 00:13:37.208 "base_bdevs_list": [ 00:13:37.208 { 00:13:37.208 "name": "BaseBdev1", 00:13:37.208 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:37.208 "is_configured": false, 00:13:37.208 "data_offset": 0, 00:13:37.208 "data_size": 0 00:13:37.208 }, 00:13:37.208 { 00:13:37.208 "name": "BaseBdev2", 00:13:37.208 "uuid": "e216ad72-123e-4698-b97d-a7b404d8ec89", 00:13:37.208 "is_configured": true, 00:13:37.208 "data_offset": 0, 00:13:37.208 "data_size": 65536 00:13:37.208 }, 00:13:37.208 { 00:13:37.208 "name": "BaseBdev3", 00:13:37.208 "uuid": "7e652c46-8c12-4e07-afa3-3b016bc3b62f", 00:13:37.208 "is_configured": true, 00:13:37.208 "data_offset": 0, 00:13:37.208 "data_size": 65536 00:13:37.208 } 00:13:37.208 ] 00:13:37.208 }' 00:13:37.208 15:52:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:37.208 15:52:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:37.778 15:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:13:38.038 [2024-06-10 15:52:43.325378] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:13:38.038 15:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:38.038 15:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:38.038 15:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:38.038 15:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:38.038 15:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:38.038 15:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- 
# local num_base_bdevs_operational=3 00:13:38.038 15:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:38.038 15:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:38.038 15:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:38.038 15:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:38.038 15:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:38.038 15:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:38.299 15:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:38.299 "name": "Existed_Raid", 00:13:38.299 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:38.299 "strip_size_kb": 64, 00:13:38.299 "state": "configuring", 00:13:38.299 "raid_level": "concat", 00:13:38.299 "superblock": false, 00:13:38.299 "num_base_bdevs": 3, 00:13:38.299 "num_base_bdevs_discovered": 1, 00:13:38.299 "num_base_bdevs_operational": 3, 00:13:38.299 "base_bdevs_list": [ 00:13:38.299 { 00:13:38.299 "name": "BaseBdev1", 00:13:38.299 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:38.299 "is_configured": false, 00:13:38.299 "data_offset": 0, 00:13:38.299 "data_size": 0 00:13:38.299 }, 00:13:38.299 { 00:13:38.299 "name": null, 00:13:38.299 "uuid": "e216ad72-123e-4698-b97d-a7b404d8ec89", 00:13:38.299 "is_configured": false, 00:13:38.299 "data_offset": 0, 00:13:38.299 "data_size": 65536 00:13:38.299 }, 00:13:38.299 { 00:13:38.299 "name": "BaseBdev3", 00:13:38.299 "uuid": "7e652c46-8c12-4e07-afa3-3b016bc3b62f", 00:13:38.299 "is_configured": true, 00:13:38.299 "data_offset": 0, 00:13:38.299 "data_size": 65536 00:13:38.299 } 00:13:38.299 ] 00:13:38.299 
}' 00:13:38.299 15:52:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:38.299 15:52:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:38.869 15:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:38.869 15:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:38.869 15:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:13:38.869 15:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:39.128 [2024-06-10 15:52:44.599987] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:39.128 BaseBdev1 00:13:39.129 15:52:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:13:39.129 15:52:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:13:39.129 15:52:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:39.129 15:52:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:39.129 15:52:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:39.129 15:52:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:39.129 15:52:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:39.388 15:52:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:39.646 [ 00:13:39.646 { 00:13:39.646 "name": "BaseBdev1", 00:13:39.646 "aliases": [ 00:13:39.646 "3df2ffe2-bd5a-4e9f-9d68-69b84e9d57ca" 00:13:39.646 ], 00:13:39.646 "product_name": "Malloc disk", 00:13:39.646 "block_size": 512, 00:13:39.646 "num_blocks": 65536, 00:13:39.646 "uuid": "3df2ffe2-bd5a-4e9f-9d68-69b84e9d57ca", 00:13:39.646 "assigned_rate_limits": { 00:13:39.646 "rw_ios_per_sec": 0, 00:13:39.646 "rw_mbytes_per_sec": 0, 00:13:39.646 "r_mbytes_per_sec": 0, 00:13:39.646 "w_mbytes_per_sec": 0 00:13:39.646 }, 00:13:39.646 "claimed": true, 00:13:39.646 "claim_type": "exclusive_write", 00:13:39.646 "zoned": false, 00:13:39.646 "supported_io_types": { 00:13:39.646 "read": true, 00:13:39.646 "write": true, 00:13:39.646 "unmap": true, 00:13:39.646 "write_zeroes": true, 00:13:39.646 "flush": true, 00:13:39.646 "reset": true, 00:13:39.646 "compare": false, 00:13:39.646 "compare_and_write": false, 00:13:39.646 "abort": true, 00:13:39.646 "nvme_admin": false, 00:13:39.646 "nvme_io": false 00:13:39.646 }, 00:13:39.646 "memory_domains": [ 00:13:39.646 { 00:13:39.646 "dma_device_id": "system", 00:13:39.646 "dma_device_type": 1 00:13:39.646 }, 00:13:39.646 { 00:13:39.646 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:39.646 "dma_device_type": 2 00:13:39.646 } 00:13:39.646 ], 00:13:39.646 "driver_specific": {} 00:13:39.646 } 00:13:39.646 ] 00:13:39.646 15:52:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:39.646 15:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:39.646 15:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:39.646 15:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:39.646 15:52:45 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:39.646 15:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:39.646 15:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:39.646 15:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:39.646 15:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:39.646 15:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:39.646 15:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:39.646 15:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:39.646 15:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:39.905 15:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:39.905 "name": "Existed_Raid", 00:13:39.905 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:39.905 "strip_size_kb": 64, 00:13:39.905 "state": "configuring", 00:13:39.905 "raid_level": "concat", 00:13:39.905 "superblock": false, 00:13:39.905 "num_base_bdevs": 3, 00:13:39.905 "num_base_bdevs_discovered": 2, 00:13:39.905 "num_base_bdevs_operational": 3, 00:13:39.905 "base_bdevs_list": [ 00:13:39.905 { 00:13:39.905 "name": "BaseBdev1", 00:13:39.905 "uuid": "3df2ffe2-bd5a-4e9f-9d68-69b84e9d57ca", 00:13:39.905 "is_configured": true, 00:13:39.905 "data_offset": 0, 00:13:39.905 "data_size": 65536 00:13:39.905 }, 00:13:39.905 { 00:13:39.905 "name": null, 00:13:39.905 "uuid": "e216ad72-123e-4698-b97d-a7b404d8ec89", 00:13:39.905 "is_configured": false, 00:13:39.905 "data_offset": 0, 00:13:39.905 
"data_size": 65536 00:13:39.905 }, 00:13:39.905 { 00:13:39.905 "name": "BaseBdev3", 00:13:39.905 "uuid": "7e652c46-8c12-4e07-afa3-3b016bc3b62f", 00:13:39.905 "is_configured": true, 00:13:39.905 "data_offset": 0, 00:13:39.905 "data_size": 65536 00:13:39.905 } 00:13:39.905 ] 00:13:39.905 }' 00:13:39.905 15:52:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:39.905 15:52:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:40.839 15:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:40.839 15:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:40.839 15:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:13:40.839 15:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:13:41.098 [2024-06-10 15:52:46.497095] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:13:41.098 15:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:41.098 15:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:41.098 15:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:41.098 15:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:41.098 15:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:41.098 15:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:41.098 
15:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:41.098 15:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:41.098 15:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:41.098 15:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:41.098 15:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.098 15:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:41.356 15:52:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:41.356 "name": "Existed_Raid", 00:13:41.356 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:41.356 "strip_size_kb": 64, 00:13:41.356 "state": "configuring", 00:13:41.356 "raid_level": "concat", 00:13:41.356 "superblock": false, 00:13:41.356 "num_base_bdevs": 3, 00:13:41.356 "num_base_bdevs_discovered": 1, 00:13:41.356 "num_base_bdevs_operational": 3, 00:13:41.356 "base_bdevs_list": [ 00:13:41.356 { 00:13:41.356 "name": "BaseBdev1", 00:13:41.356 "uuid": "3df2ffe2-bd5a-4e9f-9d68-69b84e9d57ca", 00:13:41.356 "is_configured": true, 00:13:41.356 "data_offset": 0, 00:13:41.356 "data_size": 65536 00:13:41.356 }, 00:13:41.356 { 00:13:41.356 "name": null, 00:13:41.356 "uuid": "e216ad72-123e-4698-b97d-a7b404d8ec89", 00:13:41.356 "is_configured": false, 00:13:41.356 "data_offset": 0, 00:13:41.356 "data_size": 65536 00:13:41.356 }, 00:13:41.356 { 00:13:41.356 "name": null, 00:13:41.356 "uuid": "7e652c46-8c12-4e07-afa3-3b016bc3b62f", 00:13:41.356 "is_configured": false, 00:13:41.356 "data_offset": 0, 00:13:41.356 "data_size": 65536 00:13:41.356 } 00:13:41.356 ] 00:13:41.356 }' 00:13:41.356 15:52:46 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:41.356 15:52:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:41.923 15:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:41.923 15:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:42.244 15:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:13:42.244 15:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:13:42.503 [2024-06-10 15:52:47.852793] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:13:42.503 15:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:42.503 15:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:42.503 15:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:42.503 15:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:42.503 15:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:42.503 15:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:42.503 15:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:42.503 15:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:42.503 15:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 
-- # local num_base_bdevs_discovered 00:13:42.503 15:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:42.503 15:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:42.503 15:52:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:42.762 15:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:42.762 "name": "Existed_Raid", 00:13:42.762 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:42.762 "strip_size_kb": 64, 00:13:42.762 "state": "configuring", 00:13:42.762 "raid_level": "concat", 00:13:42.762 "superblock": false, 00:13:42.762 "num_base_bdevs": 3, 00:13:42.762 "num_base_bdevs_discovered": 2, 00:13:42.762 "num_base_bdevs_operational": 3, 00:13:42.762 "base_bdevs_list": [ 00:13:42.762 { 00:13:42.762 "name": "BaseBdev1", 00:13:42.762 "uuid": "3df2ffe2-bd5a-4e9f-9d68-69b84e9d57ca", 00:13:42.762 "is_configured": true, 00:13:42.762 "data_offset": 0, 00:13:42.762 "data_size": 65536 00:13:42.762 }, 00:13:42.762 { 00:13:42.762 "name": null, 00:13:42.762 "uuid": "e216ad72-123e-4698-b97d-a7b404d8ec89", 00:13:42.762 "is_configured": false, 00:13:42.762 "data_offset": 0, 00:13:42.762 "data_size": 65536 00:13:42.762 }, 00:13:42.762 { 00:13:42.762 "name": "BaseBdev3", 00:13:42.762 "uuid": "7e652c46-8c12-4e07-afa3-3b016bc3b62f", 00:13:42.762 "is_configured": true, 00:13:42.762 "data_offset": 0, 00:13:42.762 "data_size": 65536 00:13:42.762 } 00:13:42.762 ] 00:13:42.762 }' 00:13:42.762 15:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:42.762 15:52:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:43.329 15:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.329 15:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:13:43.587 15:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:13:43.588 15:52:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:13:43.846 [2024-06-10 15:52:49.188383] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:13:43.846 15:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:43.846 15:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:43.846 15:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:43.846 15:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:43.846 15:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:43.846 15:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:43.846 15:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:43.846 15:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:43.846 15:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:43.846 15:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:43.846 15:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:43.846 15:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:44.105 15:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:44.105 "name": "Existed_Raid", 00:13:44.105 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:44.105 "strip_size_kb": 64, 00:13:44.105 "state": "configuring", 00:13:44.105 "raid_level": "concat", 00:13:44.105 "superblock": false, 00:13:44.105 "num_base_bdevs": 3, 00:13:44.105 "num_base_bdevs_discovered": 1, 00:13:44.105 "num_base_bdevs_operational": 3, 00:13:44.105 "base_bdevs_list": [ 00:13:44.105 { 00:13:44.105 "name": null, 00:13:44.105 "uuid": "3df2ffe2-bd5a-4e9f-9d68-69b84e9d57ca", 00:13:44.105 "is_configured": false, 00:13:44.105 "data_offset": 0, 00:13:44.105 "data_size": 65536 00:13:44.105 }, 00:13:44.105 { 00:13:44.105 "name": null, 00:13:44.105 "uuid": "e216ad72-123e-4698-b97d-a7b404d8ec89", 00:13:44.105 "is_configured": false, 00:13:44.105 "data_offset": 0, 00:13:44.105 "data_size": 65536 00:13:44.105 }, 00:13:44.105 { 00:13:44.105 "name": "BaseBdev3", 00:13:44.105 "uuid": "7e652c46-8c12-4e07-afa3-3b016bc3b62f", 00:13:44.105 "is_configured": true, 00:13:44.105 "data_offset": 0, 00:13:44.105 "data_size": 65536 00:13:44.105 } 00:13:44.105 ] 00:13:44.105 }' 00:13:44.105 15:52:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:44.105 15:52:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:44.671 15:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:44.671 15:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:13:44.930 15:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # 
[[ false == \f\a\l\s\e ]] 00:13:44.930 15:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:13:45.189 [2024-06-10 15:52:50.570728] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:45.189 15:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:45.189 15:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:45.189 15:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:45.189 15:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:45.189 15:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:45.189 15:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:45.189 15:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:45.189 15:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:45.189 15:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:45.189 15:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:45.189 15:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:45.189 15:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:45.448 15:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:45.448 "name": 
"Existed_Raid", 00:13:45.448 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:45.448 "strip_size_kb": 64, 00:13:45.448 "state": "configuring", 00:13:45.448 "raid_level": "concat", 00:13:45.448 "superblock": false, 00:13:45.448 "num_base_bdevs": 3, 00:13:45.448 "num_base_bdevs_discovered": 2, 00:13:45.448 "num_base_bdevs_operational": 3, 00:13:45.448 "base_bdevs_list": [ 00:13:45.448 { 00:13:45.448 "name": null, 00:13:45.448 "uuid": "3df2ffe2-bd5a-4e9f-9d68-69b84e9d57ca", 00:13:45.448 "is_configured": false, 00:13:45.448 "data_offset": 0, 00:13:45.448 "data_size": 65536 00:13:45.448 }, 00:13:45.448 { 00:13:45.448 "name": "BaseBdev2", 00:13:45.448 "uuid": "e216ad72-123e-4698-b97d-a7b404d8ec89", 00:13:45.448 "is_configured": true, 00:13:45.448 "data_offset": 0, 00:13:45.448 "data_size": 65536 00:13:45.448 }, 00:13:45.448 { 00:13:45.448 "name": "BaseBdev3", 00:13:45.448 "uuid": "7e652c46-8c12-4e07-afa3-3b016bc3b62f", 00:13:45.448 "is_configured": true, 00:13:45.448 "data_offset": 0, 00:13:45.448 "data_size": 65536 00:13:45.448 } 00:13:45.448 ] 00:13:45.448 }' 00:13:45.448 15:52:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:45.448 15:52:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:46.015 15:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.015 15:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:13:46.274 15:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:13:46.274 15:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:46.274 15:52:51 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:13:46.532 15:52:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 3df2ffe2-bd5a-4e9f-9d68-69b84e9d57ca 00:13:46.790 [2024-06-10 15:52:52.202412] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:13:46.790 [2024-06-10 15:52:52.202446] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa2c2c0 00:13:46.790 [2024-06-10 15:52:52.202452] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 196608, blocklen 512 00:13:46.790 [2024-06-10 15:52:52.202653] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x89f830 00:13:46.790 [2024-06-10 15:52:52.202773] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa2c2c0 00:13:46.790 [2024-06-10 15:52:52.202781] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xa2c2c0 00:13:46.790 [2024-06-10 15:52:52.202942] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:46.790 NewBaseBdev 00:13:46.790 15:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:13:46.790 15:52:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:13:46.790 15:52:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:46.790 15:52:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:13:46.790 15:52:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:46.790 15:52:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:46.790 15:52:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:47.048 15:52:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:13:47.306 [ 00:13:47.306 { 00:13:47.306 "name": "NewBaseBdev", 00:13:47.306 "aliases": [ 00:13:47.306 "3df2ffe2-bd5a-4e9f-9d68-69b84e9d57ca" 00:13:47.307 ], 00:13:47.307 "product_name": "Malloc disk", 00:13:47.307 "block_size": 512, 00:13:47.307 "num_blocks": 65536, 00:13:47.307 "uuid": "3df2ffe2-bd5a-4e9f-9d68-69b84e9d57ca", 00:13:47.307 "assigned_rate_limits": { 00:13:47.307 "rw_ios_per_sec": 0, 00:13:47.307 "rw_mbytes_per_sec": 0, 00:13:47.307 "r_mbytes_per_sec": 0, 00:13:47.307 "w_mbytes_per_sec": 0 00:13:47.307 }, 00:13:47.307 "claimed": true, 00:13:47.307 "claim_type": "exclusive_write", 00:13:47.307 "zoned": false, 00:13:47.307 "supported_io_types": { 00:13:47.307 "read": true, 00:13:47.307 "write": true, 00:13:47.307 "unmap": true, 00:13:47.307 "write_zeroes": true, 00:13:47.307 "flush": true, 00:13:47.307 "reset": true, 00:13:47.307 "compare": false, 00:13:47.307 "compare_and_write": false, 00:13:47.307 "abort": true, 00:13:47.307 "nvme_admin": false, 00:13:47.307 "nvme_io": false 00:13:47.307 }, 00:13:47.307 "memory_domains": [ 00:13:47.307 { 00:13:47.307 "dma_device_id": "system", 00:13:47.307 "dma_device_type": 1 00:13:47.307 }, 00:13:47.307 { 00:13:47.307 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:47.307 "dma_device_type": 2 00:13:47.307 } 00:13:47.307 ], 00:13:47.307 "driver_specific": {} 00:13:47.307 } 00:13:47.307 ] 00:13:47.307 15:52:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:13:47.307 15:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:13:47.307 15:52:52 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:47.307 15:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:13:47.307 15:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:47.307 15:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:47.307 15:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:47.307 15:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:47.307 15:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:47.307 15:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:47.307 15:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:47.307 15:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:47.307 15:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:47.566 15:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:47.566 "name": "Existed_Raid", 00:13:47.566 "uuid": "efdf6cda-b5d2-4256-a4a5-ba345fc8eecc", 00:13:47.566 "strip_size_kb": 64, 00:13:47.566 "state": "online", 00:13:47.566 "raid_level": "concat", 00:13:47.566 "superblock": false, 00:13:47.566 "num_base_bdevs": 3, 00:13:47.566 "num_base_bdevs_discovered": 3, 00:13:47.566 "num_base_bdevs_operational": 3, 00:13:47.566 "base_bdevs_list": [ 00:13:47.566 { 00:13:47.566 "name": "NewBaseBdev", 00:13:47.566 "uuid": "3df2ffe2-bd5a-4e9f-9d68-69b84e9d57ca", 00:13:47.566 "is_configured": true, 00:13:47.566 "data_offset": 0, 00:13:47.566 "data_size": 65536 
00:13:47.566 }, 00:13:47.566 { 00:13:47.566 "name": "BaseBdev2", 00:13:47.566 "uuid": "e216ad72-123e-4698-b97d-a7b404d8ec89", 00:13:47.566 "is_configured": true, 00:13:47.566 "data_offset": 0, 00:13:47.566 "data_size": 65536 00:13:47.566 }, 00:13:47.566 { 00:13:47.566 "name": "BaseBdev3", 00:13:47.566 "uuid": "7e652c46-8c12-4e07-afa3-3b016bc3b62f", 00:13:47.566 "is_configured": true, 00:13:47.566 "data_offset": 0, 00:13:47.566 "data_size": 65536 00:13:47.566 } 00:13:47.566 ] 00:13:47.566 }' 00:13:47.566 15:52:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:47.566 15:52:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:48.133 15:52:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:13:48.133 15:52:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:48.133 15:52:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:48.133 15:52:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:48.133 15:52:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:48.133 15:52:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:13:48.133 15:52:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:48.133 15:52:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:48.391 [2024-06-10 15:52:53.698737] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:48.392 15:52:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:48.392 "name": "Existed_Raid", 00:13:48.392 "aliases": [ 00:13:48.392 
"efdf6cda-b5d2-4256-a4a5-ba345fc8eecc" 00:13:48.392 ], 00:13:48.392 "product_name": "Raid Volume", 00:13:48.392 "block_size": 512, 00:13:48.392 "num_blocks": 196608, 00:13:48.392 "uuid": "efdf6cda-b5d2-4256-a4a5-ba345fc8eecc", 00:13:48.392 "assigned_rate_limits": { 00:13:48.392 "rw_ios_per_sec": 0, 00:13:48.392 "rw_mbytes_per_sec": 0, 00:13:48.392 "r_mbytes_per_sec": 0, 00:13:48.392 "w_mbytes_per_sec": 0 00:13:48.392 }, 00:13:48.392 "claimed": false, 00:13:48.392 "zoned": false, 00:13:48.392 "supported_io_types": { 00:13:48.392 "read": true, 00:13:48.392 "write": true, 00:13:48.392 "unmap": true, 00:13:48.392 "write_zeroes": true, 00:13:48.392 "flush": true, 00:13:48.392 "reset": true, 00:13:48.392 "compare": false, 00:13:48.392 "compare_and_write": false, 00:13:48.392 "abort": false, 00:13:48.392 "nvme_admin": false, 00:13:48.392 "nvme_io": false 00:13:48.392 }, 00:13:48.392 "memory_domains": [ 00:13:48.392 { 00:13:48.392 "dma_device_id": "system", 00:13:48.392 "dma_device_type": 1 00:13:48.392 }, 00:13:48.392 { 00:13:48.392 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.392 "dma_device_type": 2 00:13:48.392 }, 00:13:48.392 { 00:13:48.392 "dma_device_id": "system", 00:13:48.392 "dma_device_type": 1 00:13:48.392 }, 00:13:48.392 { 00:13:48.392 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.392 "dma_device_type": 2 00:13:48.392 }, 00:13:48.392 { 00:13:48.392 "dma_device_id": "system", 00:13:48.392 "dma_device_type": 1 00:13:48.392 }, 00:13:48.392 { 00:13:48.392 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.392 "dma_device_type": 2 00:13:48.392 } 00:13:48.392 ], 00:13:48.392 "driver_specific": { 00:13:48.392 "raid": { 00:13:48.392 "uuid": "efdf6cda-b5d2-4256-a4a5-ba345fc8eecc", 00:13:48.392 "strip_size_kb": 64, 00:13:48.392 "state": "online", 00:13:48.392 "raid_level": "concat", 00:13:48.392 "superblock": false, 00:13:48.392 "num_base_bdevs": 3, 00:13:48.392 "num_base_bdevs_discovered": 3, 00:13:48.392 "num_base_bdevs_operational": 3, 00:13:48.392 
"base_bdevs_list": [ 00:13:48.392 { 00:13:48.392 "name": "NewBaseBdev", 00:13:48.392 "uuid": "3df2ffe2-bd5a-4e9f-9d68-69b84e9d57ca", 00:13:48.392 "is_configured": true, 00:13:48.392 "data_offset": 0, 00:13:48.392 "data_size": 65536 00:13:48.392 }, 00:13:48.392 { 00:13:48.392 "name": "BaseBdev2", 00:13:48.392 "uuid": "e216ad72-123e-4698-b97d-a7b404d8ec89", 00:13:48.392 "is_configured": true, 00:13:48.392 "data_offset": 0, 00:13:48.392 "data_size": 65536 00:13:48.392 }, 00:13:48.392 { 00:13:48.392 "name": "BaseBdev3", 00:13:48.392 "uuid": "7e652c46-8c12-4e07-afa3-3b016bc3b62f", 00:13:48.392 "is_configured": true, 00:13:48.392 "data_offset": 0, 00:13:48.392 "data_size": 65536 00:13:48.392 } 00:13:48.392 ] 00:13:48.392 } 00:13:48.392 } 00:13:48.392 }' 00:13:48.392 15:52:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:48.392 15:52:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:13:48.392 BaseBdev2 00:13:48.392 BaseBdev3' 00:13:48.392 15:52:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:48.392 15:52:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:13:48.392 15:52:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:48.650 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:48.650 "name": "NewBaseBdev", 00:13:48.650 "aliases": [ 00:13:48.650 "3df2ffe2-bd5a-4e9f-9d68-69b84e9d57ca" 00:13:48.650 ], 00:13:48.650 "product_name": "Malloc disk", 00:13:48.650 "block_size": 512, 00:13:48.650 "num_blocks": 65536, 00:13:48.650 "uuid": "3df2ffe2-bd5a-4e9f-9d68-69b84e9d57ca", 00:13:48.650 "assigned_rate_limits": { 00:13:48.650 "rw_ios_per_sec": 0, 
00:13:48.650 "rw_mbytes_per_sec": 0, 00:13:48.650 "r_mbytes_per_sec": 0, 00:13:48.650 "w_mbytes_per_sec": 0 00:13:48.650 }, 00:13:48.650 "claimed": true, 00:13:48.650 "claim_type": "exclusive_write", 00:13:48.650 "zoned": false, 00:13:48.650 "supported_io_types": { 00:13:48.650 "read": true, 00:13:48.650 "write": true, 00:13:48.650 "unmap": true, 00:13:48.650 "write_zeroes": true, 00:13:48.650 "flush": true, 00:13:48.650 "reset": true, 00:13:48.650 "compare": false, 00:13:48.650 "compare_and_write": false, 00:13:48.650 "abort": true, 00:13:48.650 "nvme_admin": false, 00:13:48.650 "nvme_io": false 00:13:48.650 }, 00:13:48.650 "memory_domains": [ 00:13:48.650 { 00:13:48.650 "dma_device_id": "system", 00:13:48.650 "dma_device_type": 1 00:13:48.650 }, 00:13:48.650 { 00:13:48.650 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:48.650 "dma_device_type": 2 00:13:48.650 } 00:13:48.650 ], 00:13:48.650 "driver_specific": {} 00:13:48.650 }' 00:13:48.650 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:48.650 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:48.650 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:48.650 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:48.908 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:48.908 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:48.908 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:48.908 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:48.908 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:48.908 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:48.908 15:52:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:48.908 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:48.908 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:48.908 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:13:48.908 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:49.167 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:49.167 "name": "BaseBdev2", 00:13:49.167 "aliases": [ 00:13:49.167 "e216ad72-123e-4698-b97d-a7b404d8ec89" 00:13:49.167 ], 00:13:49.167 "product_name": "Malloc disk", 00:13:49.167 "block_size": 512, 00:13:49.167 "num_blocks": 65536, 00:13:49.167 "uuid": "e216ad72-123e-4698-b97d-a7b404d8ec89", 00:13:49.167 "assigned_rate_limits": { 00:13:49.167 "rw_ios_per_sec": 0, 00:13:49.167 "rw_mbytes_per_sec": 0, 00:13:49.167 "r_mbytes_per_sec": 0, 00:13:49.167 "w_mbytes_per_sec": 0 00:13:49.167 }, 00:13:49.167 "claimed": true, 00:13:49.167 "claim_type": "exclusive_write", 00:13:49.167 "zoned": false, 00:13:49.167 "supported_io_types": { 00:13:49.167 "read": true, 00:13:49.167 "write": true, 00:13:49.167 "unmap": true, 00:13:49.167 "write_zeroes": true, 00:13:49.167 "flush": true, 00:13:49.167 "reset": true, 00:13:49.167 "compare": false, 00:13:49.167 "compare_and_write": false, 00:13:49.167 "abort": true, 00:13:49.167 "nvme_admin": false, 00:13:49.167 "nvme_io": false 00:13:49.167 }, 00:13:49.167 "memory_domains": [ 00:13:49.167 { 00:13:49.167 "dma_device_id": "system", 00:13:49.167 "dma_device_type": 1 00:13:49.167 }, 00:13:49.167 { 00:13:49.167 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.167 "dma_device_type": 2 00:13:49.167 } 00:13:49.167 ], 00:13:49.167 
"driver_specific": {} 00:13:49.167 }' 00:13:49.168 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:49.168 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:49.427 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:49.427 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.427 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.427 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:49.427 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.427 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:49.427 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:49.427 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.687 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:49.687 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:49.687 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:49.687 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:13:49.687 15:52:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:13:49.946 15:52:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:13:49.946 "name": "BaseBdev3", 00:13:49.946 "aliases": [ 00:13:49.946 "7e652c46-8c12-4e07-afa3-3b016bc3b62f" 00:13:49.946 ], 00:13:49.946 "product_name": "Malloc disk", 00:13:49.946 "block_size": 512, 
00:13:49.946 "num_blocks": 65536, 00:13:49.946 "uuid": "7e652c46-8c12-4e07-afa3-3b016bc3b62f", 00:13:49.946 "assigned_rate_limits": { 00:13:49.946 "rw_ios_per_sec": 0, 00:13:49.946 "rw_mbytes_per_sec": 0, 00:13:49.946 "r_mbytes_per_sec": 0, 00:13:49.946 "w_mbytes_per_sec": 0 00:13:49.946 }, 00:13:49.946 "claimed": true, 00:13:49.946 "claim_type": "exclusive_write", 00:13:49.946 "zoned": false, 00:13:49.946 "supported_io_types": { 00:13:49.946 "read": true, 00:13:49.946 "write": true, 00:13:49.946 "unmap": true, 00:13:49.946 "write_zeroes": true, 00:13:49.946 "flush": true, 00:13:49.946 "reset": true, 00:13:49.946 "compare": false, 00:13:49.946 "compare_and_write": false, 00:13:49.946 "abort": true, 00:13:49.946 "nvme_admin": false, 00:13:49.946 "nvme_io": false 00:13:49.946 }, 00:13:49.946 "memory_domains": [ 00:13:49.946 { 00:13:49.946 "dma_device_id": "system", 00:13:49.946 "dma_device_type": 1 00:13:49.947 }, 00:13:49.947 { 00:13:49.947 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:49.947 "dma_device_type": 2 00:13:49.947 } 00:13:49.947 ], 00:13:49.947 "driver_specific": {} 00:13:49.947 }' 00:13:49.947 15:52:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:49.947 15:52:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:13:49.947 15:52:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:13:49.947 15:52:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.947 15:52:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:13:49.947 15:52:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:13:49.947 15:52:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:50.205 15:52:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:13:50.205 15:52:55 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:13:50.205 15:52:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:50.205 15:52:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:13:50.205 15:52:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:13:50.205 15:52:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:50.464 [2024-06-10 15:52:55.852385] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:50.464 [2024-06-10 15:52:55.852409] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:13:50.464 [2024-06-10 15:52:55.852462] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:13:50.464 [2024-06-10 15:52:55.852512] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:13:50.464 [2024-06-10 15:52:55.852520] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa2c2c0 name Existed_Raid, state offline 00:13:50.464 15:52:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2673060 00:13:50.464 15:52:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 2673060 ']' 00:13:50.464 15:52:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 2673060 00:13:50.464 15:52:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:13:50.464 15:52:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:13:50.464 15:52:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2673060 00:13:50.464 15:52:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 
-- # process_name=reactor_0 00:13:50.464 15:52:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:13:50.464 15:52:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2673060' 00:13:50.464 killing process with pid 2673060 00:13:50.464 15:52:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 2673060 00:13:50.464 [2024-06-10 15:52:55.915763] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:13:50.464 15:52:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 2673060 00:13:50.464 [2024-06-10 15:52:55.939804] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:13:50.723 15:52:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:13:50.723 00:13:50.723 real 0m29.046s 00:13:50.723 user 0m54.446s 00:13:50.723 sys 0m4.120s 00:13:50.723 15:52:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:13:50.723 15:52:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:13:50.723 ************************************ 00:13:50.723 END TEST raid_state_function_test 00:13:50.723 ************************************ 00:13:50.723 15:52:56 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 3 true 00:13:50.723 15:52:56 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:13:50.723 15:52:56 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:13:50.723 15:52:56 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:13:50.723 ************************************ 00:13:50.723 START TEST raid_state_function_test_sb 00:13:50.723 ************************************ 00:13:50.723 15:52:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 3 true 00:13:50.723 
15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:13:50.723 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:13:50.723 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:13:50.723 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:13:50.723 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:13:50.723 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:50.723 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:13:50.724 
15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2678437 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2678437' 00:13:50.724 Process raid pid: 2678437 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2678437 /var/tmp/spdk-raid.sock 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 2678437 ']' 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for 
process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:13:50.724 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:13:50.724 15:52:56 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:50.983 [2024-06-10 15:52:56.272640] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:13:50.983 [2024-06-10 15:52:56.272693] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:13:50.983 [2024-06-10 15:52:56.373388] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:13:50.983 [2024-06-10 15:52:56.467950] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:13:51.242 [2024-06-10 15:52:56.526509] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:51.242 [2024-06-10 15:52:56.526542] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:13:51.810 15:52:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:13:51.810 15:52:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:13:51.810 15:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:52.068 [2024-06-10 15:52:57.458593] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:52.068 [2024-06-10 15:52:57.458632] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:52.068 [2024-06-10 
15:52:57.458641] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:52.068 [2024-06-10 15:52:57.458650] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:52.068 [2024-06-10 15:52:57.458657] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:52.068 [2024-06-10 15:52:57.458665] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:52.068 15:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:52.068 15:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:52.068 15:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:52.068 15:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:52.068 15:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:52.068 15:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:52.068 15:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:52.068 15:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:52.068 15:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:52.068 15:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:52.068 15:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:52.068 15:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "Existed_Raid")' 00:13:52.327 15:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:52.327 "name": "Existed_Raid", 00:13:52.327 "uuid": "8bf22677-95ee-4c20-bbed-81b12cf6a613", 00:13:52.327 "strip_size_kb": 64, 00:13:52.327 "state": "configuring", 00:13:52.327 "raid_level": "concat", 00:13:52.327 "superblock": true, 00:13:52.327 "num_base_bdevs": 3, 00:13:52.327 "num_base_bdevs_discovered": 0, 00:13:52.327 "num_base_bdevs_operational": 3, 00:13:52.327 "base_bdevs_list": [ 00:13:52.327 { 00:13:52.327 "name": "BaseBdev1", 00:13:52.327 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.327 "is_configured": false, 00:13:52.327 "data_offset": 0, 00:13:52.327 "data_size": 0 00:13:52.327 }, 00:13:52.327 { 00:13:52.327 "name": "BaseBdev2", 00:13:52.327 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.327 "is_configured": false, 00:13:52.327 "data_offset": 0, 00:13:52.327 "data_size": 0 00:13:52.327 }, 00:13:52.327 { 00:13:52.327 "name": "BaseBdev3", 00:13:52.327 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:52.327 "is_configured": false, 00:13:52.327 "data_offset": 0, 00:13:52.327 "data_size": 0 00:13:52.327 } 00:13:52.327 ] 00:13:52.327 }' 00:13:52.327 15:52:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:52.327 15:52:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:52.895 15:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:53.153 [2024-06-10 15:52:58.577414] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:53.153 [2024-06-10 15:52:58.577444] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x188e120 name Existed_Raid, state configuring 00:13:53.153 15:52:58 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:53.411 [2024-06-10 15:52:58.838124] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:13:53.411 [2024-06-10 15:52:58.838150] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:13:53.411 [2024-06-10 15:52:58.838159] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:53.411 [2024-06-10 15:52:58.838167] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:53.411 [2024-06-10 15:52:58.838174] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:53.411 [2024-06-10 15:52:58.838183] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:53.411 15:52:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:13:53.669 [2024-06-10 15:52:59.100247] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:53.669 BaseBdev1 00:13:53.669 15:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:13:53.669 15:52:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:13:53.669 15:52:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:53.669 15:52:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:13:53.669 15:52:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:53.669 15:52:59 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:53.669 15:52:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:53.927 15:52:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:13:54.186 [ 00:13:54.186 { 00:13:54.186 "name": "BaseBdev1", 00:13:54.186 "aliases": [ 00:13:54.186 "0b325120-bf6d-4481-af84-5cc45fd860e9" 00:13:54.186 ], 00:13:54.186 "product_name": "Malloc disk", 00:13:54.186 "block_size": 512, 00:13:54.186 "num_blocks": 65536, 00:13:54.186 "uuid": "0b325120-bf6d-4481-af84-5cc45fd860e9", 00:13:54.186 "assigned_rate_limits": { 00:13:54.186 "rw_ios_per_sec": 0, 00:13:54.186 "rw_mbytes_per_sec": 0, 00:13:54.186 "r_mbytes_per_sec": 0, 00:13:54.186 "w_mbytes_per_sec": 0 00:13:54.186 }, 00:13:54.186 "claimed": true, 00:13:54.186 "claim_type": "exclusive_write", 00:13:54.186 "zoned": false, 00:13:54.186 "supported_io_types": { 00:13:54.186 "read": true, 00:13:54.186 "write": true, 00:13:54.186 "unmap": true, 00:13:54.186 "write_zeroes": true, 00:13:54.186 "flush": true, 00:13:54.186 "reset": true, 00:13:54.186 "compare": false, 00:13:54.186 "compare_and_write": false, 00:13:54.186 "abort": true, 00:13:54.186 "nvme_admin": false, 00:13:54.186 "nvme_io": false 00:13:54.186 }, 00:13:54.186 "memory_domains": [ 00:13:54.186 { 00:13:54.186 "dma_device_id": "system", 00:13:54.186 "dma_device_type": 1 00:13:54.186 }, 00:13:54.186 { 00:13:54.186 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:54.186 "dma_device_type": 2 00:13:54.186 } 00:13:54.186 ], 00:13:54.186 "driver_specific": {} 00:13:54.186 } 00:13:54.186 ] 00:13:54.186 15:52:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:13:54.186 15:52:59 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:54.186 15:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:54.186 15:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:54.186 15:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:54.186 15:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:54.186 15:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:54.186 15:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:54.186 15:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:54.186 15:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:54.186 15:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:54.186 15:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:54.186 15:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:54.445 15:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:54.445 "name": "Existed_Raid", 00:13:54.445 "uuid": "3264b5c0-e65a-44c7-ab55-1073319e3c8f", 00:13:54.445 "strip_size_kb": 64, 00:13:54.445 "state": "configuring", 00:13:54.445 "raid_level": "concat", 00:13:54.445 "superblock": true, 00:13:54.445 "num_base_bdevs": 3, 00:13:54.445 "num_base_bdevs_discovered": 1, 00:13:54.445 "num_base_bdevs_operational": 3, 00:13:54.445 
"base_bdevs_list": [ 00:13:54.445 { 00:13:54.445 "name": "BaseBdev1", 00:13:54.445 "uuid": "0b325120-bf6d-4481-af84-5cc45fd860e9", 00:13:54.445 "is_configured": true, 00:13:54.445 "data_offset": 2048, 00:13:54.445 "data_size": 63488 00:13:54.445 }, 00:13:54.445 { 00:13:54.445 "name": "BaseBdev2", 00:13:54.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:54.445 "is_configured": false, 00:13:54.445 "data_offset": 0, 00:13:54.445 "data_size": 0 00:13:54.445 }, 00:13:54.445 { 00:13:54.445 "name": "BaseBdev3", 00:13:54.445 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:54.445 "is_configured": false, 00:13:54.445 "data_offset": 0, 00:13:54.445 "data_size": 0 00:13:54.445 } 00:13:54.445 ] 00:13:54.445 }' 00:13:54.445 15:52:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:54.445 15:52:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:55.013 15:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:13:55.273 [2024-06-10 15:53:00.748744] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:13:55.273 [2024-06-10 15:53:00.748783] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x188d9b0 name Existed_Raid, state configuring 00:13:55.273 15:53:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:13:55.532 [2024-06-10 15:53:01.005475] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:13:55.532 [2024-06-10 15:53:01.007011] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:13:55.532 [2024-06-10 15:53:01.007043] bdev_raid_rpc.c: 
311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:13:55.532 [2024-06-10 15:53:01.007052] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:13:55.532 [2024-06-10 15:53:01.007060] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:13:55.532 15:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:13:55.532 15:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:55.532 15:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:55.532 15:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:55.532 15:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:55.532 15:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:55.532 15:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:55.532 15:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:55.532 15:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:55.532 15:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:55.532 15:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:55.532 15:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:55.532 15:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:55.532 15:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:55.792 15:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:55.792 "name": "Existed_Raid", 00:13:55.792 "uuid": "5f4cefd7-4ff6-4025-bcd3-8189f03f40a8", 00:13:55.792 "strip_size_kb": 64, 00:13:55.792 "state": "configuring", 00:13:55.792 "raid_level": "concat", 00:13:55.792 "superblock": true, 00:13:55.792 "num_base_bdevs": 3, 00:13:55.792 "num_base_bdevs_discovered": 1, 00:13:55.792 "num_base_bdevs_operational": 3, 00:13:55.792 "base_bdevs_list": [ 00:13:55.792 { 00:13:55.792 "name": "BaseBdev1", 00:13:55.792 "uuid": "0b325120-bf6d-4481-af84-5cc45fd860e9", 00:13:55.792 "is_configured": true, 00:13:55.792 "data_offset": 2048, 00:13:55.792 "data_size": 63488 00:13:55.792 }, 00:13:55.792 { 00:13:55.792 "name": "BaseBdev2", 00:13:55.792 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.792 "is_configured": false, 00:13:55.792 "data_offset": 0, 00:13:55.792 "data_size": 0 00:13:55.792 }, 00:13:55.792 { 00:13:55.792 "name": "BaseBdev3", 00:13:55.792 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:55.792 "is_configured": false, 00:13:55.792 "data_offset": 0, 00:13:55.792 "data_size": 0 00:13:55.792 } 00:13:55.792 ] 00:13:55.792 }' 00:13:55.792 15:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:55.792 15:53:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:56.394 15:53:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:13:56.653 [2024-06-10 15:53:02.091537] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:13:56.653 BaseBdev2 00:13:56.653 15:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev 
BaseBdev2 00:13:56.653 15:53:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:13:56.653 15:53:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:56.653 15:53:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:13:56.653 15:53:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:56.653 15:53:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:56.653 15:53:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:56.911 15:53:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:13:57.169 [ 00:13:57.169 { 00:13:57.169 "name": "BaseBdev2", 00:13:57.169 "aliases": [ 00:13:57.169 "bc166caa-152d-4e6b-b273-5029f4bb2163" 00:13:57.169 ], 00:13:57.169 "product_name": "Malloc disk", 00:13:57.169 "block_size": 512, 00:13:57.169 "num_blocks": 65536, 00:13:57.169 "uuid": "bc166caa-152d-4e6b-b273-5029f4bb2163", 00:13:57.169 "assigned_rate_limits": { 00:13:57.169 "rw_ios_per_sec": 0, 00:13:57.169 "rw_mbytes_per_sec": 0, 00:13:57.169 "r_mbytes_per_sec": 0, 00:13:57.169 "w_mbytes_per_sec": 0 00:13:57.169 }, 00:13:57.169 "claimed": true, 00:13:57.169 "claim_type": "exclusive_write", 00:13:57.169 "zoned": false, 00:13:57.169 "supported_io_types": { 00:13:57.169 "read": true, 00:13:57.169 "write": true, 00:13:57.169 "unmap": true, 00:13:57.169 "write_zeroes": true, 00:13:57.169 "flush": true, 00:13:57.169 "reset": true, 00:13:57.169 "compare": false, 00:13:57.169 "compare_and_write": false, 00:13:57.169 "abort": true, 00:13:57.169 "nvme_admin": false, 
00:13:57.169 "nvme_io": false 00:13:57.169 }, 00:13:57.169 "memory_domains": [ 00:13:57.169 { 00:13:57.169 "dma_device_id": "system", 00:13:57.169 "dma_device_type": 1 00:13:57.169 }, 00:13:57.169 { 00:13:57.169 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:57.169 "dma_device_type": 2 00:13:57.169 } 00:13:57.169 ], 00:13:57.169 "driver_specific": {} 00:13:57.169 } 00:13:57.169 ] 00:13:57.169 15:53:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:13:57.169 15:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:57.169 15:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:57.169 15:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:13:57.169 15:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:57.169 15:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:13:57.169 15:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:57.169 15:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:57.169 15:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:57.169 15:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:57.169 15:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:57.169 15:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:57.169 15:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:57.169 15:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:57.169 15:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:57.428 15:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:57.428 "name": "Existed_Raid", 00:13:57.428 "uuid": "5f4cefd7-4ff6-4025-bcd3-8189f03f40a8", 00:13:57.428 "strip_size_kb": 64, 00:13:57.428 "state": "configuring", 00:13:57.428 "raid_level": "concat", 00:13:57.428 "superblock": true, 00:13:57.428 "num_base_bdevs": 3, 00:13:57.428 "num_base_bdevs_discovered": 2, 00:13:57.428 "num_base_bdevs_operational": 3, 00:13:57.428 "base_bdevs_list": [ 00:13:57.428 { 00:13:57.428 "name": "BaseBdev1", 00:13:57.428 "uuid": "0b325120-bf6d-4481-af84-5cc45fd860e9", 00:13:57.428 "is_configured": true, 00:13:57.429 "data_offset": 2048, 00:13:57.429 "data_size": 63488 00:13:57.429 }, 00:13:57.429 { 00:13:57.429 "name": "BaseBdev2", 00:13:57.429 "uuid": "bc166caa-152d-4e6b-b273-5029f4bb2163", 00:13:57.429 "is_configured": true, 00:13:57.429 "data_offset": 2048, 00:13:57.429 "data_size": 63488 00:13:57.429 }, 00:13:57.429 { 00:13:57.429 "name": "BaseBdev3", 00:13:57.429 "uuid": "00000000-0000-0000-0000-000000000000", 00:13:57.429 "is_configured": false, 00:13:57.429 "data_offset": 0, 00:13:57.429 "data_size": 0 00:13:57.429 } 00:13:57.429 ] 00:13:57.429 }' 00:13:57.429 15:53:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:57.429 15:53:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:57.994 15:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:13:58.252 [2024-06-10 15:53:03.743187] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
BaseBdev3 is claimed 00:13:58.252 [2024-06-10 15:53:03.743338] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x188e8c0 00:13:58.252 [2024-06-10 15:53:03.743350] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:13:58.252 [2024-06-10 15:53:03.743535] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18a5830 00:13:58.252 [2024-06-10 15:53:03.743657] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x188e8c0 00:13:58.252 [2024-06-10 15:53:03.743665] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x188e8c0 00:13:58.252 [2024-06-10 15:53:03.743759] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:13:58.252 BaseBdev3 00:13:58.511 15:53:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:13:58.511 15:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:13:58.511 15:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:13:58.511 15:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:13:58.511 15:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:13:58.511 15:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:13:58.511 15:53:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:13:58.511 15:53:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:13:58.770 [ 00:13:58.770 { 00:13:58.770 "name": "BaseBdev3", 00:13:58.770 
"aliases": [ 00:13:58.770 "79c1200d-68bb-431f-a754-44519ddc833d" 00:13:58.770 ], 00:13:58.770 "product_name": "Malloc disk", 00:13:58.770 "block_size": 512, 00:13:58.770 "num_blocks": 65536, 00:13:58.770 "uuid": "79c1200d-68bb-431f-a754-44519ddc833d", 00:13:58.770 "assigned_rate_limits": { 00:13:58.770 "rw_ios_per_sec": 0, 00:13:58.770 "rw_mbytes_per_sec": 0, 00:13:58.770 "r_mbytes_per_sec": 0, 00:13:58.770 "w_mbytes_per_sec": 0 00:13:58.770 }, 00:13:58.770 "claimed": true, 00:13:58.770 "claim_type": "exclusive_write", 00:13:58.770 "zoned": false, 00:13:58.770 "supported_io_types": { 00:13:58.770 "read": true, 00:13:58.770 "write": true, 00:13:58.770 "unmap": true, 00:13:58.770 "write_zeroes": true, 00:13:58.770 "flush": true, 00:13:58.770 "reset": true, 00:13:58.770 "compare": false, 00:13:58.770 "compare_and_write": false, 00:13:58.770 "abort": true, 00:13:58.770 "nvme_admin": false, 00:13:58.770 "nvme_io": false 00:13:58.770 }, 00:13:58.770 "memory_domains": [ 00:13:58.770 { 00:13:58.770 "dma_device_id": "system", 00:13:58.770 "dma_device_type": 1 00:13:58.770 }, 00:13:58.770 { 00:13:58.770 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:58.770 "dma_device_type": 2 00:13:58.770 } 00:13:58.770 ], 00:13:58.770 "driver_specific": {} 00:13:58.770 } 00:13:58.770 ] 00:13:58.770 15:53:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:13:58.770 15:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:13:58.770 15:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:13:58.770 15:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:13:58.770 15:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:13:58.770 15:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 
00:13:58.770 15:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:13:58.770 15:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:13:58.770 15:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:13:58.770 15:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:13:58.770 15:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:13:58.770 15:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:13:58.770 15:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:13:58.770 15:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:13:58.770 15:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:13:59.029 15:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:13:59.029 "name": "Existed_Raid", 00:13:59.029 "uuid": "5f4cefd7-4ff6-4025-bcd3-8189f03f40a8", 00:13:59.029 "strip_size_kb": 64, 00:13:59.029 "state": "online", 00:13:59.029 "raid_level": "concat", 00:13:59.029 "superblock": true, 00:13:59.029 "num_base_bdevs": 3, 00:13:59.029 "num_base_bdevs_discovered": 3, 00:13:59.029 "num_base_bdevs_operational": 3, 00:13:59.029 "base_bdevs_list": [ 00:13:59.029 { 00:13:59.029 "name": "BaseBdev1", 00:13:59.029 "uuid": "0b325120-bf6d-4481-af84-5cc45fd860e9", 00:13:59.029 "is_configured": true, 00:13:59.029 "data_offset": 2048, 00:13:59.029 "data_size": 63488 00:13:59.029 }, 00:13:59.029 { 00:13:59.029 "name": "BaseBdev2", 00:13:59.029 "uuid": "bc166caa-152d-4e6b-b273-5029f4bb2163", 00:13:59.029 
"is_configured": true, 00:13:59.029 "data_offset": 2048, 00:13:59.029 "data_size": 63488 00:13:59.029 }, 00:13:59.029 { 00:13:59.029 "name": "BaseBdev3", 00:13:59.029 "uuid": "79c1200d-68bb-431f-a754-44519ddc833d", 00:13:59.029 "is_configured": true, 00:13:59.029 "data_offset": 2048, 00:13:59.029 "data_size": 63488 00:13:59.029 } 00:13:59.029 ] 00:13:59.029 }' 00:13:59.029 15:53:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:13:59.029 15:53:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:13:59.599 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:13:59.599 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:13:59.599 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:13:59.599 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:13:59.599 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:13:59.599 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:13:59.599 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:13:59.599 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:13:59.858 [2024-06-10 15:53:05.267562] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:13:59.858 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:13:59.858 "name": "Existed_Raid", 00:13:59.858 "aliases": [ 00:13:59.858 "5f4cefd7-4ff6-4025-bcd3-8189f03f40a8" 00:13:59.858 ], 00:13:59.858 "product_name": "Raid Volume", 00:13:59.858 
"block_size": 512, 00:13:59.858 "num_blocks": 190464, 00:13:59.858 "uuid": "5f4cefd7-4ff6-4025-bcd3-8189f03f40a8", 00:13:59.858 "assigned_rate_limits": { 00:13:59.858 "rw_ios_per_sec": 0, 00:13:59.858 "rw_mbytes_per_sec": 0, 00:13:59.858 "r_mbytes_per_sec": 0, 00:13:59.858 "w_mbytes_per_sec": 0 00:13:59.858 }, 00:13:59.858 "claimed": false, 00:13:59.858 "zoned": false, 00:13:59.858 "supported_io_types": { 00:13:59.858 "read": true, 00:13:59.858 "write": true, 00:13:59.858 "unmap": true, 00:13:59.858 "write_zeroes": true, 00:13:59.858 "flush": true, 00:13:59.858 "reset": true, 00:13:59.858 "compare": false, 00:13:59.858 "compare_and_write": false, 00:13:59.858 "abort": false, 00:13:59.858 "nvme_admin": false, 00:13:59.858 "nvme_io": false 00:13:59.858 }, 00:13:59.858 "memory_domains": [ 00:13:59.858 { 00:13:59.858 "dma_device_id": "system", 00:13:59.858 "dma_device_type": 1 00:13:59.858 }, 00:13:59.858 { 00:13:59.858 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.858 "dma_device_type": 2 00:13:59.858 }, 00:13:59.858 { 00:13:59.858 "dma_device_id": "system", 00:13:59.858 "dma_device_type": 1 00:13:59.858 }, 00:13:59.858 { 00:13:59.858 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.858 "dma_device_type": 2 00:13:59.858 }, 00:13:59.858 { 00:13:59.858 "dma_device_id": "system", 00:13:59.858 "dma_device_type": 1 00:13:59.858 }, 00:13:59.858 { 00:13:59.858 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:13:59.858 "dma_device_type": 2 00:13:59.858 } 00:13:59.858 ], 00:13:59.858 "driver_specific": { 00:13:59.858 "raid": { 00:13:59.858 "uuid": "5f4cefd7-4ff6-4025-bcd3-8189f03f40a8", 00:13:59.858 "strip_size_kb": 64, 00:13:59.858 "state": "online", 00:13:59.858 "raid_level": "concat", 00:13:59.858 "superblock": true, 00:13:59.858 "num_base_bdevs": 3, 00:13:59.858 "num_base_bdevs_discovered": 3, 00:13:59.858 "num_base_bdevs_operational": 3, 00:13:59.858 "base_bdevs_list": [ 00:13:59.858 { 00:13:59.858 "name": "BaseBdev1", 00:13:59.858 "uuid": 
"0b325120-bf6d-4481-af84-5cc45fd860e9", 00:13:59.858 "is_configured": true, 00:13:59.858 "data_offset": 2048, 00:13:59.858 "data_size": 63488 00:13:59.858 }, 00:13:59.858 { 00:13:59.859 "name": "BaseBdev2", 00:13:59.859 "uuid": "bc166caa-152d-4e6b-b273-5029f4bb2163", 00:13:59.859 "is_configured": true, 00:13:59.859 "data_offset": 2048, 00:13:59.859 "data_size": 63488 00:13:59.859 }, 00:13:59.859 { 00:13:59.859 "name": "BaseBdev3", 00:13:59.859 "uuid": "79c1200d-68bb-431f-a754-44519ddc833d", 00:13:59.859 "is_configured": true, 00:13:59.859 "data_offset": 2048, 00:13:59.859 "data_size": 63488 00:13:59.859 } 00:13:59.859 ] 00:13:59.859 } 00:13:59.859 } 00:13:59.859 }' 00:13:59.859 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:13:59.859 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:13:59.859 BaseBdev2 00:13:59.859 BaseBdev3' 00:13:59.859 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:13:59.859 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:13:59.859 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:00.118 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:00.118 "name": "BaseBdev1", 00:14:00.118 "aliases": [ 00:14:00.118 "0b325120-bf6d-4481-af84-5cc45fd860e9" 00:14:00.118 ], 00:14:00.118 "product_name": "Malloc disk", 00:14:00.118 "block_size": 512, 00:14:00.118 "num_blocks": 65536, 00:14:00.118 "uuid": "0b325120-bf6d-4481-af84-5cc45fd860e9", 00:14:00.118 "assigned_rate_limits": { 00:14:00.118 "rw_ios_per_sec": 0, 00:14:00.118 "rw_mbytes_per_sec": 0, 00:14:00.118 "r_mbytes_per_sec": 0, 
00:14:00.118 "w_mbytes_per_sec": 0 00:14:00.118 }, 00:14:00.118 "claimed": true, 00:14:00.118 "claim_type": "exclusive_write", 00:14:00.118 "zoned": false, 00:14:00.118 "supported_io_types": { 00:14:00.118 "read": true, 00:14:00.118 "write": true, 00:14:00.118 "unmap": true, 00:14:00.118 "write_zeroes": true, 00:14:00.118 "flush": true, 00:14:00.118 "reset": true, 00:14:00.118 "compare": false, 00:14:00.118 "compare_and_write": false, 00:14:00.118 "abort": true, 00:14:00.118 "nvme_admin": false, 00:14:00.118 "nvme_io": false 00:14:00.118 }, 00:14:00.118 "memory_domains": [ 00:14:00.118 { 00:14:00.118 "dma_device_id": "system", 00:14:00.118 "dma_device_type": 1 00:14:00.118 }, 00:14:00.118 { 00:14:00.118 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:00.118 "dma_device_type": 2 00:14:00.118 } 00:14:00.118 ], 00:14:00.118 "driver_specific": {} 00:14:00.118 }' 00:14:00.118 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:00.378 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:00.378 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:00.378 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:00.378 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:00.378 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:00.378 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:00.378 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:00.378 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:00.378 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:00.637 15:53:05 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:00.637 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:00.637 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:00.637 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:00.637 15:53:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:00.897 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:00.897 "name": "BaseBdev2", 00:14:00.897 "aliases": [ 00:14:00.897 "bc166caa-152d-4e6b-b273-5029f4bb2163" 00:14:00.897 ], 00:14:00.897 "product_name": "Malloc disk", 00:14:00.897 "block_size": 512, 00:14:00.897 "num_blocks": 65536, 00:14:00.897 "uuid": "bc166caa-152d-4e6b-b273-5029f4bb2163", 00:14:00.897 "assigned_rate_limits": { 00:14:00.897 "rw_ios_per_sec": 0, 00:14:00.897 "rw_mbytes_per_sec": 0, 00:14:00.897 "r_mbytes_per_sec": 0, 00:14:00.897 "w_mbytes_per_sec": 0 00:14:00.897 }, 00:14:00.897 "claimed": true, 00:14:00.897 "claim_type": "exclusive_write", 00:14:00.897 "zoned": false, 00:14:00.897 "supported_io_types": { 00:14:00.897 "read": true, 00:14:00.897 "write": true, 00:14:00.897 "unmap": true, 00:14:00.897 "write_zeroes": true, 00:14:00.897 "flush": true, 00:14:00.897 "reset": true, 00:14:00.897 "compare": false, 00:14:00.897 "compare_and_write": false, 00:14:00.897 "abort": true, 00:14:00.897 "nvme_admin": false, 00:14:00.897 "nvme_io": false 00:14:00.897 }, 00:14:00.897 "memory_domains": [ 00:14:00.897 { 00:14:00.897 "dma_device_id": "system", 00:14:00.897 "dma_device_type": 1 00:14:00.897 }, 00:14:00.897 { 00:14:00.897 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:00.897 "dma_device_type": 2 00:14:00.897 } 00:14:00.897 ], 00:14:00.897 "driver_specific": {} 00:14:00.897 }' 
00:14:00.897 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:00.897 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:00.897 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:00.897 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:00.897 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:01.156 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:01.156 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:01.156 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:01.156 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:01.156 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:01.156 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:01.156 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:01.156 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:01.156 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:01.156 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:01.415 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:01.415 "name": "BaseBdev3", 00:14:01.415 "aliases": [ 00:14:01.415 "79c1200d-68bb-431f-a754-44519ddc833d" 00:14:01.415 ], 00:14:01.415 "product_name": "Malloc disk", 00:14:01.415 
"block_size": 512, 00:14:01.415 "num_blocks": 65536, 00:14:01.415 "uuid": "79c1200d-68bb-431f-a754-44519ddc833d", 00:14:01.415 "assigned_rate_limits": { 00:14:01.415 "rw_ios_per_sec": 0, 00:14:01.415 "rw_mbytes_per_sec": 0, 00:14:01.415 "r_mbytes_per_sec": 0, 00:14:01.415 "w_mbytes_per_sec": 0 00:14:01.415 }, 00:14:01.415 "claimed": true, 00:14:01.415 "claim_type": "exclusive_write", 00:14:01.415 "zoned": false, 00:14:01.415 "supported_io_types": { 00:14:01.415 "read": true, 00:14:01.415 "write": true, 00:14:01.415 "unmap": true, 00:14:01.415 "write_zeroes": true, 00:14:01.415 "flush": true, 00:14:01.415 "reset": true, 00:14:01.415 "compare": false, 00:14:01.415 "compare_and_write": false, 00:14:01.415 "abort": true, 00:14:01.415 "nvme_admin": false, 00:14:01.415 "nvme_io": false 00:14:01.415 }, 00:14:01.415 "memory_domains": [ 00:14:01.415 { 00:14:01.415 "dma_device_id": "system", 00:14:01.415 "dma_device_type": 1 00:14:01.415 }, 00:14:01.415 { 00:14:01.415 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:01.415 "dma_device_type": 2 00:14:01.415 } 00:14:01.415 ], 00:14:01.415 "driver_specific": {} 00:14:01.415 }' 00:14:01.415 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:01.415 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:01.673 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:01.673 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:01.673 15:53:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:01.674 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:01.674 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:01.674 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:01.674 15:53:07 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:01.674 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:01.932 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:01.932 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:01.932 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:02.191 [2024-06-10 15:53:07.465417] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:02.191 [2024-06-10 15:53:07.465442] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:02.191 [2024-06-10 15:53:07.465481] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:02.191 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:02.191 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:14:02.191 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:02.191 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:14:02.191 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:14:02.191 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 2 00:14:02.191 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:02.191 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:14:02.191 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local 
raid_level=concat 00:14:02.191 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:02.191 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:02.191 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:02.191 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:02.191 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:02.191 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:02.191 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:02.191 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:02.450 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:02.450 "name": "Existed_Raid", 00:14:02.450 "uuid": "5f4cefd7-4ff6-4025-bcd3-8189f03f40a8", 00:14:02.450 "strip_size_kb": 64, 00:14:02.450 "state": "offline", 00:14:02.450 "raid_level": "concat", 00:14:02.450 "superblock": true, 00:14:02.450 "num_base_bdevs": 3, 00:14:02.450 "num_base_bdevs_discovered": 2, 00:14:02.450 "num_base_bdevs_operational": 2, 00:14:02.450 "base_bdevs_list": [ 00:14:02.450 { 00:14:02.450 "name": null, 00:14:02.450 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:02.450 "is_configured": false, 00:14:02.450 "data_offset": 2048, 00:14:02.450 "data_size": 63488 00:14:02.450 }, 00:14:02.450 { 00:14:02.450 "name": "BaseBdev2", 00:14:02.450 "uuid": "bc166caa-152d-4e6b-b273-5029f4bb2163", 00:14:02.450 "is_configured": true, 00:14:02.450 "data_offset": 2048, 00:14:02.450 "data_size": 63488 00:14:02.450 }, 
00:14:02.450 { 00:14:02.450 "name": "BaseBdev3", 00:14:02.450 "uuid": "79c1200d-68bb-431f-a754-44519ddc833d", 00:14:02.450 "is_configured": true, 00:14:02.450 "data_offset": 2048, 00:14:02.450 "data_size": 63488 00:14:02.450 } 00:14:02.450 ] 00:14:02.450 }' 00:14:02.450 15:53:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:02.450 15:53:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:03.018 15:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:14:03.018 15:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:03.018 15:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.018 15:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:03.277 15:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:03.277 15:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:03.277 15:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:14:03.536 [2024-06-10 15:53:08.874369] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:03.536 15:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:03.536 15:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:03.536 15:53:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:03.536 15:53:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:14:03.795 15:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:14:03.795 15:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:14:03.795 15:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:14:04.054 [2024-06-10 15:53:09.398300] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:04.054 [2024-06-10 15:53:09.398341] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x188e8c0 name Existed_Raid, state offline 00:14:04.054 15:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:14:04.054 15:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:14:04.054 15:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:04.054 15:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:14:04.313 15:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:14:04.313 15:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:14:04.313 15:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:14:04.313 15:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:14:04.313 15:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:04.313 15:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:04.572 BaseBdev2 00:14:04.572 15:53:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:14:04.572 15:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:14:04.572 15:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:04.572 15:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:14:04.573 15:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:04.573 15:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:04.573 15:53:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:04.832 15:53:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:05.091 [ 00:14:05.091 { 00:14:05.091 "name": "BaseBdev2", 00:14:05.091 "aliases": [ 00:14:05.091 "5aadaaba-a659-45ee-8ddf-5fe2e1df5997" 00:14:05.091 ], 00:14:05.091 "product_name": "Malloc disk", 00:14:05.091 "block_size": 512, 00:14:05.091 "num_blocks": 65536, 00:14:05.091 "uuid": "5aadaaba-a659-45ee-8ddf-5fe2e1df5997", 00:14:05.091 "assigned_rate_limits": { 00:14:05.091 "rw_ios_per_sec": 0, 00:14:05.091 "rw_mbytes_per_sec": 0, 00:14:05.091 "r_mbytes_per_sec": 0, 00:14:05.091 "w_mbytes_per_sec": 0 00:14:05.091 }, 00:14:05.091 "claimed": false, 00:14:05.091 "zoned": false, 00:14:05.091 "supported_io_types": { 00:14:05.091 "read": true, 00:14:05.091 "write": true, 00:14:05.091 "unmap": true, 00:14:05.091 
"write_zeroes": true, 00:14:05.091 "flush": true, 00:14:05.091 "reset": true, 00:14:05.091 "compare": false, 00:14:05.091 "compare_and_write": false, 00:14:05.091 "abort": true, 00:14:05.091 "nvme_admin": false, 00:14:05.091 "nvme_io": false 00:14:05.091 }, 00:14:05.091 "memory_domains": [ 00:14:05.091 { 00:14:05.091 "dma_device_id": "system", 00:14:05.091 "dma_device_type": 1 00:14:05.091 }, 00:14:05.091 { 00:14:05.091 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.091 "dma_device_type": 2 00:14:05.091 } 00:14:05.091 ], 00:14:05.091 "driver_specific": {} 00:14:05.091 } 00:14:05.091 ] 00:14:05.091 15:53:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:14:05.091 15:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:05.091 15:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:05.091 15:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:05.351 BaseBdev3 00:14:05.351 15:53:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:14:05.351 15:53:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:14:05.351 15:53:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:05.351 15:53:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:14:05.351 15:53:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:05.351 15:53:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:05.351 15:53:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:05.610 15:53:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:05.868 [ 00:14:05.868 { 00:14:05.868 "name": "BaseBdev3", 00:14:05.868 "aliases": [ 00:14:05.868 "2475a993-c3be-45b1-9adc-99c4533f5fb4" 00:14:05.868 ], 00:14:05.868 "product_name": "Malloc disk", 00:14:05.868 "block_size": 512, 00:14:05.868 "num_blocks": 65536, 00:14:05.868 "uuid": "2475a993-c3be-45b1-9adc-99c4533f5fb4", 00:14:05.868 "assigned_rate_limits": { 00:14:05.868 "rw_ios_per_sec": 0, 00:14:05.868 "rw_mbytes_per_sec": 0, 00:14:05.868 "r_mbytes_per_sec": 0, 00:14:05.868 "w_mbytes_per_sec": 0 00:14:05.868 }, 00:14:05.868 "claimed": false, 00:14:05.868 "zoned": false, 00:14:05.868 "supported_io_types": { 00:14:05.868 "read": true, 00:14:05.868 "write": true, 00:14:05.868 "unmap": true, 00:14:05.868 "write_zeroes": true, 00:14:05.868 "flush": true, 00:14:05.868 "reset": true, 00:14:05.868 "compare": false, 00:14:05.868 "compare_and_write": false, 00:14:05.868 "abort": true, 00:14:05.868 "nvme_admin": false, 00:14:05.868 "nvme_io": false 00:14:05.868 }, 00:14:05.868 "memory_domains": [ 00:14:05.868 { 00:14:05.868 "dma_device_id": "system", 00:14:05.868 "dma_device_type": 1 00:14:05.868 }, 00:14:05.868 { 00:14:05.868 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:05.868 "dma_device_type": 2 00:14:05.868 } 00:14:05.868 ], 00:14:05.868 "driver_specific": {} 00:14:05.868 } 00:14:05.868 ] 00:14:05.868 15:53:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:14:05.869 15:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:14:05.869 15:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:14:05.869 15:53:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:06.127 [2024-06-10 15:53:11.450898] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:06.127 [2024-06-10 15:53:11.450933] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:06.127 [2024-06-10 15:53:11.450951] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:06.127 [2024-06-10 15:53:11.452342] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:06.127 15:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:06.127 15:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:06.127 15:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:06.127 15:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:06.127 15:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:06.127 15:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:06.127 15:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:06.127 15:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:06.127 15:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:06.127 15:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:06.128 15:53:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:06.128 15:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:06.386 15:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:06.387 "name": "Existed_Raid", 00:14:06.387 "uuid": "8ac5e413-7f31-44f0-8a7b-662021f6af9d", 00:14:06.387 "strip_size_kb": 64, 00:14:06.387 "state": "configuring", 00:14:06.387 "raid_level": "concat", 00:14:06.387 "superblock": true, 00:14:06.387 "num_base_bdevs": 3, 00:14:06.387 "num_base_bdevs_discovered": 2, 00:14:06.387 "num_base_bdevs_operational": 3, 00:14:06.387 "base_bdevs_list": [ 00:14:06.387 { 00:14:06.387 "name": "BaseBdev1", 00:14:06.387 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:06.387 "is_configured": false, 00:14:06.387 "data_offset": 0, 00:14:06.387 "data_size": 0 00:14:06.387 }, 00:14:06.387 { 00:14:06.387 "name": "BaseBdev2", 00:14:06.387 "uuid": "5aadaaba-a659-45ee-8ddf-5fe2e1df5997", 00:14:06.387 "is_configured": true, 00:14:06.387 "data_offset": 2048, 00:14:06.387 "data_size": 63488 00:14:06.387 }, 00:14:06.387 { 00:14:06.387 "name": "BaseBdev3", 00:14:06.387 "uuid": "2475a993-c3be-45b1-9adc-99c4533f5fb4", 00:14:06.387 "is_configured": true, 00:14:06.387 "data_offset": 2048, 00:14:06.387 "data_size": 63488 00:14:06.387 } 00:14:06.387 ] 00:14:06.387 }' 00:14:06.387 15:53:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:06.387 15:53:11 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:06.956 15:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:14:07.216 [2024-06-10 15:53:12.589923] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:14:07.216 15:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:07.216 15:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:07.216 15:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:07.216 15:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:07.216 15:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:07.216 15:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:07.216 15:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:07.216 15:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:07.216 15:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:07.216 15:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:07.216 15:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:07.216 15:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:07.475 15:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:07.475 "name": "Existed_Raid", 00:14:07.475 "uuid": "8ac5e413-7f31-44f0-8a7b-662021f6af9d", 00:14:07.475 "strip_size_kb": 64, 00:14:07.475 "state": "configuring", 00:14:07.475 "raid_level": "concat", 00:14:07.475 "superblock": true, 00:14:07.475 "num_base_bdevs": 3, 00:14:07.475 
"num_base_bdevs_discovered": 1, 00:14:07.475 "num_base_bdevs_operational": 3, 00:14:07.475 "base_bdevs_list": [ 00:14:07.475 { 00:14:07.475 "name": "BaseBdev1", 00:14:07.475 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:07.475 "is_configured": false, 00:14:07.475 "data_offset": 0, 00:14:07.475 "data_size": 0 00:14:07.475 }, 00:14:07.475 { 00:14:07.475 "name": null, 00:14:07.475 "uuid": "5aadaaba-a659-45ee-8ddf-5fe2e1df5997", 00:14:07.475 "is_configured": false, 00:14:07.475 "data_offset": 2048, 00:14:07.475 "data_size": 63488 00:14:07.475 }, 00:14:07.475 { 00:14:07.475 "name": "BaseBdev3", 00:14:07.475 "uuid": "2475a993-c3be-45b1-9adc-99c4533f5fb4", 00:14:07.475 "is_configured": true, 00:14:07.475 "data_offset": 2048, 00:14:07.475 "data_size": 63488 00:14:07.475 } 00:14:07.475 ] 00:14:07.475 }' 00:14:07.475 15:53:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:07.475 15:53:12 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:08.043 15:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:08.043 15:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:08.301 15:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:14:08.301 15:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:08.561 [2024-06-10 15:53:13.972916] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:08.561 BaseBdev1 00:14:08.561 15:53:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:14:08.561 15:53:13 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:14:08.561 15:53:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:08.561 15:53:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:14:08.561 15:53:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:08.561 15:53:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:08.561 15:53:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:08.821 15:53:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:09.080 [ 00:14:09.080 { 00:14:09.080 "name": "BaseBdev1", 00:14:09.080 "aliases": [ 00:14:09.080 "dfa0a382-8df5-4ca6-adc0-7080330d65a8" 00:14:09.080 ], 00:14:09.080 "product_name": "Malloc disk", 00:14:09.080 "block_size": 512, 00:14:09.080 "num_blocks": 65536, 00:14:09.080 "uuid": "dfa0a382-8df5-4ca6-adc0-7080330d65a8", 00:14:09.080 "assigned_rate_limits": { 00:14:09.080 "rw_ios_per_sec": 0, 00:14:09.080 "rw_mbytes_per_sec": 0, 00:14:09.080 "r_mbytes_per_sec": 0, 00:14:09.080 "w_mbytes_per_sec": 0 00:14:09.080 }, 00:14:09.080 "claimed": true, 00:14:09.080 "claim_type": "exclusive_write", 00:14:09.080 "zoned": false, 00:14:09.080 "supported_io_types": { 00:14:09.080 "read": true, 00:14:09.080 "write": true, 00:14:09.080 "unmap": true, 00:14:09.080 "write_zeroes": true, 00:14:09.080 "flush": true, 00:14:09.080 "reset": true, 00:14:09.080 "compare": false, 00:14:09.080 "compare_and_write": false, 00:14:09.080 "abort": true, 00:14:09.080 "nvme_admin": false, 00:14:09.080 "nvme_io": false 
00:14:09.080 }, 00:14:09.080 "memory_domains": [ 00:14:09.080 { 00:14:09.080 "dma_device_id": "system", 00:14:09.080 "dma_device_type": 1 00:14:09.080 }, 00:14:09.080 { 00:14:09.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:09.080 "dma_device_type": 2 00:14:09.080 } 00:14:09.080 ], 00:14:09.080 "driver_specific": {} 00:14:09.080 } 00:14:09.080 ] 00:14:09.080 15:53:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:14:09.080 15:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:09.080 15:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:09.080 15:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:09.080 15:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:09.080 15:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:09.080 15:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:09.080 15:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:09.080 15:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:09.080 15:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:09.080 15:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:09.080 15:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.080 15:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 
00:14:09.339 15:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:09.339 "name": "Existed_Raid", 00:14:09.339 "uuid": "8ac5e413-7f31-44f0-8a7b-662021f6af9d", 00:14:09.339 "strip_size_kb": 64, 00:14:09.339 "state": "configuring", 00:14:09.339 "raid_level": "concat", 00:14:09.339 "superblock": true, 00:14:09.339 "num_base_bdevs": 3, 00:14:09.339 "num_base_bdevs_discovered": 2, 00:14:09.339 "num_base_bdevs_operational": 3, 00:14:09.339 "base_bdevs_list": [ 00:14:09.339 { 00:14:09.339 "name": "BaseBdev1", 00:14:09.339 "uuid": "dfa0a382-8df5-4ca6-adc0-7080330d65a8", 00:14:09.339 "is_configured": true, 00:14:09.339 "data_offset": 2048, 00:14:09.339 "data_size": 63488 00:14:09.339 }, 00:14:09.339 { 00:14:09.339 "name": null, 00:14:09.339 "uuid": "5aadaaba-a659-45ee-8ddf-5fe2e1df5997", 00:14:09.339 "is_configured": false, 00:14:09.339 "data_offset": 2048, 00:14:09.339 "data_size": 63488 00:14:09.339 }, 00:14:09.339 { 00:14:09.339 "name": "BaseBdev3", 00:14:09.339 "uuid": "2475a993-c3be-45b1-9adc-99c4533f5fb4", 00:14:09.339 "is_configured": true, 00:14:09.339 "data_offset": 2048, 00:14:09.339 "data_size": 63488 00:14:09.339 } 00:14:09.339 ] 00:14:09.339 }' 00:14:09.339 15:53:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:09.339 15:53:14 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:09.907 15:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:09.907 15:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:10.166 15:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:14:10.166 15:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:14:10.464 [2024-06-10 15:53:15.845936] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:14:10.464 15:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:10.464 15:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:10.464 15:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:10.464 15:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:10.464 15:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:10.464 15:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:10.464 15:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:10.464 15:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:10.464 15:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:10.464 15:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:10.464 15:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:10.464 15:53:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:10.724 15:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:10.724 "name": "Existed_Raid", 00:14:10.724 "uuid": "8ac5e413-7f31-44f0-8a7b-662021f6af9d", 00:14:10.724 "strip_size_kb": 
64, 00:14:10.724 "state": "configuring", 00:14:10.724 "raid_level": "concat", 00:14:10.724 "superblock": true, 00:14:10.724 "num_base_bdevs": 3, 00:14:10.724 "num_base_bdevs_discovered": 1, 00:14:10.724 "num_base_bdevs_operational": 3, 00:14:10.724 "base_bdevs_list": [ 00:14:10.724 { 00:14:10.724 "name": "BaseBdev1", 00:14:10.724 "uuid": "dfa0a382-8df5-4ca6-adc0-7080330d65a8", 00:14:10.724 "is_configured": true, 00:14:10.724 "data_offset": 2048, 00:14:10.724 "data_size": 63488 00:14:10.724 }, 00:14:10.724 { 00:14:10.724 "name": null, 00:14:10.724 "uuid": "5aadaaba-a659-45ee-8ddf-5fe2e1df5997", 00:14:10.724 "is_configured": false, 00:14:10.724 "data_offset": 2048, 00:14:10.724 "data_size": 63488 00:14:10.724 }, 00:14:10.724 { 00:14:10.724 "name": null, 00:14:10.724 "uuid": "2475a993-c3be-45b1-9adc-99c4533f5fb4", 00:14:10.724 "is_configured": false, 00:14:10.724 "data_offset": 2048, 00:14:10.724 "data_size": 63488 00:14:10.724 } 00:14:10.724 ] 00:14:10.724 }' 00:14:10.724 15:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:10.724 15:53:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:11.292 15:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.292 15:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:11.551 15:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:14:11.551 15:53:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:14:11.810 [2024-06-10 15:53:17.233694] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:11.810 
15:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:11.810 15:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:11.810 15:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:11.810 15:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:11.810 15:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:11.810 15:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:11.810 15:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:11.810 15:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:11.810 15:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:11.810 15:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:11.810 15:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:11.810 15:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:12.069 15:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:12.069 "name": "Existed_Raid", 00:14:12.069 "uuid": "8ac5e413-7f31-44f0-8a7b-662021f6af9d", 00:14:12.069 "strip_size_kb": 64, 00:14:12.069 "state": "configuring", 00:14:12.069 "raid_level": "concat", 00:14:12.069 "superblock": true, 00:14:12.069 "num_base_bdevs": 3, 00:14:12.069 "num_base_bdevs_discovered": 2, 00:14:12.069 "num_base_bdevs_operational": 3, 
00:14:12.069 "base_bdevs_list": [ 00:14:12.069 { 00:14:12.069 "name": "BaseBdev1", 00:14:12.069 "uuid": "dfa0a382-8df5-4ca6-adc0-7080330d65a8", 00:14:12.069 "is_configured": true, 00:14:12.069 "data_offset": 2048, 00:14:12.069 "data_size": 63488 00:14:12.069 }, 00:14:12.069 { 00:14:12.070 "name": null, 00:14:12.070 "uuid": "5aadaaba-a659-45ee-8ddf-5fe2e1df5997", 00:14:12.070 "is_configured": false, 00:14:12.070 "data_offset": 2048, 00:14:12.070 "data_size": 63488 00:14:12.070 }, 00:14:12.070 { 00:14:12.070 "name": "BaseBdev3", 00:14:12.070 "uuid": "2475a993-c3be-45b1-9adc-99c4533f5fb4", 00:14:12.070 "is_configured": true, 00:14:12.070 "data_offset": 2048, 00:14:12.070 "data_size": 63488 00:14:12.070 } 00:14:12.070 ] 00:14:12.070 }' 00:14:12.070 15:53:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:12.070 15:53:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:12.637 15:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:12.637 15:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:14:12.896 15:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:14:12.896 15:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:13.155 [2024-06-10 15:53:18.537227] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:13.155 15:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:13.155 15:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:14:13.155 15:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:13.155 15:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:13.155 15:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:13.155 15:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:13.155 15:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:13.155 15:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:13.155 15:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:13.155 15:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:13.155 15:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:13.155 15:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:13.413 15:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:13.413 "name": "Existed_Raid", 00:14:13.413 "uuid": "8ac5e413-7f31-44f0-8a7b-662021f6af9d", 00:14:13.413 "strip_size_kb": 64, 00:14:13.413 "state": "configuring", 00:14:13.413 "raid_level": "concat", 00:14:13.413 "superblock": true, 00:14:13.413 "num_base_bdevs": 3, 00:14:13.413 "num_base_bdevs_discovered": 1, 00:14:13.413 "num_base_bdevs_operational": 3, 00:14:13.413 "base_bdevs_list": [ 00:14:13.413 { 00:14:13.413 "name": null, 00:14:13.413 "uuid": "dfa0a382-8df5-4ca6-adc0-7080330d65a8", 00:14:13.413 "is_configured": false, 00:14:13.413 "data_offset": 2048, 00:14:13.413 "data_size": 63488 00:14:13.413 }, 00:14:13.413 { 
00:14:13.413 "name": null, 00:14:13.413 "uuid": "5aadaaba-a659-45ee-8ddf-5fe2e1df5997", 00:14:13.413 "is_configured": false, 00:14:13.413 "data_offset": 2048, 00:14:13.413 "data_size": 63488 00:14:13.413 }, 00:14:13.413 { 00:14:13.413 "name": "BaseBdev3", 00:14:13.413 "uuid": "2475a993-c3be-45b1-9adc-99c4533f5fb4", 00:14:13.413 "is_configured": true, 00:14:13.413 "data_offset": 2048, 00:14:13.413 "data_size": 63488 00:14:13.413 } 00:14:13.413 ] 00:14:13.413 }' 00:14:13.413 15:53:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:13.413 15:53:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:13.980 15:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:14:13.980 15:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.238 15:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:14:14.238 15:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:14:14.496 [2024-06-10 15:53:19.839066] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:14.496 15:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 3 00:14:14.496 15:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:14.496 15:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:14.496 15:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 
00:14:14.496 15:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:14.496 15:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:14.496 15:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:14.496 15:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:14.496 15:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:14.496 15:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:14.496 15:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:14.496 15:53:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:14.754 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:14.754 "name": "Existed_Raid", 00:14:14.754 "uuid": "8ac5e413-7f31-44f0-8a7b-662021f6af9d", 00:14:14.754 "strip_size_kb": 64, 00:14:14.754 "state": "configuring", 00:14:14.754 "raid_level": "concat", 00:14:14.754 "superblock": true, 00:14:14.754 "num_base_bdevs": 3, 00:14:14.754 "num_base_bdevs_discovered": 2, 00:14:14.754 "num_base_bdevs_operational": 3, 00:14:14.754 "base_bdevs_list": [ 00:14:14.754 { 00:14:14.754 "name": null, 00:14:14.754 "uuid": "dfa0a382-8df5-4ca6-adc0-7080330d65a8", 00:14:14.754 "is_configured": false, 00:14:14.754 "data_offset": 2048, 00:14:14.754 "data_size": 63488 00:14:14.754 }, 00:14:14.754 { 00:14:14.754 "name": "BaseBdev2", 00:14:14.754 "uuid": "5aadaaba-a659-45ee-8ddf-5fe2e1df5997", 00:14:14.754 "is_configured": true, 00:14:14.754 "data_offset": 2048, 00:14:14.754 "data_size": 63488 00:14:14.754 }, 00:14:14.754 { 
00:14:14.754 "name": "BaseBdev3", 00:14:14.754 "uuid": "2475a993-c3be-45b1-9adc-99c4533f5fb4", 00:14:14.754 "is_configured": true, 00:14:14.754 "data_offset": 2048, 00:14:14.754 "data_size": 63488 00:14:14.754 } 00:14:14.754 ] 00:14:14.754 }' 00:14:14.754 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:14.754 15:53:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:15.320 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.320 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:14:15.578 15:53:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:14:15.578 15:53:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:14:15.578 15:53:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:15.837 15:53:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u dfa0a382-8df5-4ca6-adc0-7080330d65a8 00:14:16.095 [2024-06-10 15:53:21.502821] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:14:16.095 [2024-06-10 15:53:21.502976] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1a3d550 00:14:16.095 [2024-06-10 15:53:21.502989] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:16.095 [2024-06-10 15:53:21.503176] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1a3e560 
00:14:16.095 [2024-06-10 15:53:21.503292] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1a3d550 00:14:16.095 [2024-06-10 15:53:21.503300] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1a3d550 00:14:16.095 [2024-06-10 15:53:21.503397] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:16.095 NewBaseBdev 00:14:16.095 15:53:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:14:16.095 15:53:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:14:16.095 15:53:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:16.095 15:53:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:14:16.095 15:53:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:16.095 15:53:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:16.095 15:53:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:16.353 15:53:21 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:14:16.612 [ 00:14:16.612 { 00:14:16.612 "name": "NewBaseBdev", 00:14:16.612 "aliases": [ 00:14:16.612 "dfa0a382-8df5-4ca6-adc0-7080330d65a8" 00:14:16.612 ], 00:14:16.612 "product_name": "Malloc disk", 00:14:16.612 "block_size": 512, 00:14:16.612 "num_blocks": 65536, 00:14:16.612 "uuid": "dfa0a382-8df5-4ca6-adc0-7080330d65a8", 00:14:16.612 "assigned_rate_limits": { 00:14:16.612 "rw_ios_per_sec": 0, 00:14:16.612 "rw_mbytes_per_sec": 0, 00:14:16.612 
"r_mbytes_per_sec": 0, 00:14:16.612 "w_mbytes_per_sec": 0 00:14:16.612 }, 00:14:16.612 "claimed": true, 00:14:16.612 "claim_type": "exclusive_write", 00:14:16.612 "zoned": false, 00:14:16.612 "supported_io_types": { 00:14:16.612 "read": true, 00:14:16.612 "write": true, 00:14:16.612 "unmap": true, 00:14:16.612 "write_zeroes": true, 00:14:16.612 "flush": true, 00:14:16.612 "reset": true, 00:14:16.612 "compare": false, 00:14:16.612 "compare_and_write": false, 00:14:16.612 "abort": true, 00:14:16.612 "nvme_admin": false, 00:14:16.612 "nvme_io": false 00:14:16.612 }, 00:14:16.612 "memory_domains": [ 00:14:16.612 { 00:14:16.612 "dma_device_id": "system", 00:14:16.612 "dma_device_type": 1 00:14:16.612 }, 00:14:16.612 { 00:14:16.612 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:16.612 "dma_device_type": 2 00:14:16.612 } 00:14:16.612 ], 00:14:16.612 "driver_specific": {} 00:14:16.612 } 00:14:16.612 ] 00:14:16.612 15:53:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:14:16.613 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 3 00:14:16.613 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:16.613 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:16.613 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:16.613 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:16.613 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:16.613 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:16.613 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:16.613 15:53:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:16.613 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:16.613 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:16.613 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:16.871 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:16.871 "name": "Existed_Raid", 00:14:16.871 "uuid": "8ac5e413-7f31-44f0-8a7b-662021f6af9d", 00:14:16.871 "strip_size_kb": 64, 00:14:16.871 "state": "online", 00:14:16.871 "raid_level": "concat", 00:14:16.871 "superblock": true, 00:14:16.871 "num_base_bdevs": 3, 00:14:16.871 "num_base_bdevs_discovered": 3, 00:14:16.871 "num_base_bdevs_operational": 3, 00:14:16.871 "base_bdevs_list": [ 00:14:16.871 { 00:14:16.871 "name": "NewBaseBdev", 00:14:16.871 "uuid": "dfa0a382-8df5-4ca6-adc0-7080330d65a8", 00:14:16.871 "is_configured": true, 00:14:16.871 "data_offset": 2048, 00:14:16.871 "data_size": 63488 00:14:16.871 }, 00:14:16.871 { 00:14:16.871 "name": "BaseBdev2", 00:14:16.871 "uuid": "5aadaaba-a659-45ee-8ddf-5fe2e1df5997", 00:14:16.871 "is_configured": true, 00:14:16.871 "data_offset": 2048, 00:14:16.871 "data_size": 63488 00:14:16.871 }, 00:14:16.871 { 00:14:16.871 "name": "BaseBdev3", 00:14:16.871 "uuid": "2475a993-c3be-45b1-9adc-99c4533f5fb4", 00:14:16.871 "is_configured": true, 00:14:16.871 "data_offset": 2048, 00:14:16.871 "data_size": 63488 00:14:16.871 } 00:14:16.871 ] 00:14:16.871 }' 00:14:16.871 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:16.871 15:53:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:17.437 15:53:22 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:14:17.437 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:17.437 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:17.437 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:17.437 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:17.437 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:14:17.437 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:17.437 15:53:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:17.695 [2024-06-10 15:53:23.003147] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:17.695 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:17.695 "name": "Existed_Raid", 00:14:17.695 "aliases": [ 00:14:17.695 "8ac5e413-7f31-44f0-8a7b-662021f6af9d" 00:14:17.695 ], 00:14:17.695 "product_name": "Raid Volume", 00:14:17.695 "block_size": 512, 00:14:17.695 "num_blocks": 190464, 00:14:17.695 "uuid": "8ac5e413-7f31-44f0-8a7b-662021f6af9d", 00:14:17.695 "assigned_rate_limits": { 00:14:17.695 "rw_ios_per_sec": 0, 00:14:17.695 "rw_mbytes_per_sec": 0, 00:14:17.695 "r_mbytes_per_sec": 0, 00:14:17.695 "w_mbytes_per_sec": 0 00:14:17.695 }, 00:14:17.695 "claimed": false, 00:14:17.695 "zoned": false, 00:14:17.695 "supported_io_types": { 00:14:17.695 "read": true, 00:14:17.695 "write": true, 00:14:17.695 "unmap": true, 00:14:17.695 "write_zeroes": true, 00:14:17.695 "flush": true, 00:14:17.695 "reset": true, 00:14:17.695 
"compare": false, 00:14:17.695 "compare_and_write": false, 00:14:17.695 "abort": false, 00:14:17.695 "nvme_admin": false, 00:14:17.695 "nvme_io": false 00:14:17.695 }, 00:14:17.695 "memory_domains": [ 00:14:17.695 { 00:14:17.695 "dma_device_id": "system", 00:14:17.695 "dma_device_type": 1 00:14:17.695 }, 00:14:17.695 { 00:14:17.695 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.695 "dma_device_type": 2 00:14:17.695 }, 00:14:17.695 { 00:14:17.695 "dma_device_id": "system", 00:14:17.695 "dma_device_type": 1 00:14:17.695 }, 00:14:17.695 { 00:14:17.695 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.695 "dma_device_type": 2 00:14:17.695 }, 00:14:17.695 { 00:14:17.695 "dma_device_id": "system", 00:14:17.695 "dma_device_type": 1 00:14:17.695 }, 00:14:17.695 { 00:14:17.695 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.695 "dma_device_type": 2 00:14:17.695 } 00:14:17.695 ], 00:14:17.695 "driver_specific": { 00:14:17.695 "raid": { 00:14:17.695 "uuid": "8ac5e413-7f31-44f0-8a7b-662021f6af9d", 00:14:17.696 "strip_size_kb": 64, 00:14:17.696 "state": "online", 00:14:17.696 "raid_level": "concat", 00:14:17.696 "superblock": true, 00:14:17.696 "num_base_bdevs": 3, 00:14:17.696 "num_base_bdevs_discovered": 3, 00:14:17.696 "num_base_bdevs_operational": 3, 00:14:17.696 "base_bdevs_list": [ 00:14:17.696 { 00:14:17.696 "name": "NewBaseBdev", 00:14:17.696 "uuid": "dfa0a382-8df5-4ca6-adc0-7080330d65a8", 00:14:17.696 "is_configured": true, 00:14:17.696 "data_offset": 2048, 00:14:17.696 "data_size": 63488 00:14:17.696 }, 00:14:17.696 { 00:14:17.696 "name": "BaseBdev2", 00:14:17.696 "uuid": "5aadaaba-a659-45ee-8ddf-5fe2e1df5997", 00:14:17.696 "is_configured": true, 00:14:17.696 "data_offset": 2048, 00:14:17.696 "data_size": 63488 00:14:17.696 }, 00:14:17.696 { 00:14:17.696 "name": "BaseBdev3", 00:14:17.696 "uuid": "2475a993-c3be-45b1-9adc-99c4533f5fb4", 00:14:17.696 "is_configured": true, 00:14:17.696 "data_offset": 2048, 00:14:17.696 "data_size": 63488 00:14:17.696 } 
00:14:17.696 ] 00:14:17.696 } 00:14:17.696 } 00:14:17.696 }' 00:14:17.696 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:17.696 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:14:17.696 BaseBdev2 00:14:17.696 BaseBdev3' 00:14:17.696 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:17.696 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:14:17.696 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:17.954 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:17.954 "name": "NewBaseBdev", 00:14:17.954 "aliases": [ 00:14:17.954 "dfa0a382-8df5-4ca6-adc0-7080330d65a8" 00:14:17.954 ], 00:14:17.954 "product_name": "Malloc disk", 00:14:17.954 "block_size": 512, 00:14:17.954 "num_blocks": 65536, 00:14:17.954 "uuid": "dfa0a382-8df5-4ca6-adc0-7080330d65a8", 00:14:17.954 "assigned_rate_limits": { 00:14:17.954 "rw_ios_per_sec": 0, 00:14:17.954 "rw_mbytes_per_sec": 0, 00:14:17.954 "r_mbytes_per_sec": 0, 00:14:17.954 "w_mbytes_per_sec": 0 00:14:17.954 }, 00:14:17.954 "claimed": true, 00:14:17.954 "claim_type": "exclusive_write", 00:14:17.954 "zoned": false, 00:14:17.954 "supported_io_types": { 00:14:17.954 "read": true, 00:14:17.954 "write": true, 00:14:17.954 "unmap": true, 00:14:17.954 "write_zeroes": true, 00:14:17.954 "flush": true, 00:14:17.954 "reset": true, 00:14:17.954 "compare": false, 00:14:17.954 "compare_and_write": false, 00:14:17.954 "abort": true, 00:14:17.954 "nvme_admin": false, 00:14:17.954 "nvme_io": false 00:14:17.954 }, 00:14:17.954 "memory_domains": [ 00:14:17.954 { 00:14:17.954 
"dma_device_id": "system", 00:14:17.954 "dma_device_type": 1 00:14:17.954 }, 00:14:17.954 { 00:14:17.954 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:17.954 "dma_device_type": 2 00:14:17.954 } 00:14:17.954 ], 00:14:17.954 "driver_specific": {} 00:14:17.954 }' 00:14:17.954 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:17.954 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:17.954 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:17.954 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:17.954 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.213 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:18.213 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.213 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.213 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:18.213 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.213 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.213 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:18.213 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:18.213 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:18.213 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:18.471 15:53:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:18.471 "name": "BaseBdev2", 00:14:18.471 "aliases": [ 00:14:18.471 "5aadaaba-a659-45ee-8ddf-5fe2e1df5997" 00:14:18.472 ], 00:14:18.472 "product_name": "Malloc disk", 00:14:18.472 "block_size": 512, 00:14:18.472 "num_blocks": 65536, 00:14:18.472 "uuid": "5aadaaba-a659-45ee-8ddf-5fe2e1df5997", 00:14:18.472 "assigned_rate_limits": { 00:14:18.472 "rw_ios_per_sec": 0, 00:14:18.472 "rw_mbytes_per_sec": 0, 00:14:18.472 "r_mbytes_per_sec": 0, 00:14:18.472 "w_mbytes_per_sec": 0 00:14:18.472 }, 00:14:18.472 "claimed": true, 00:14:18.472 "claim_type": "exclusive_write", 00:14:18.472 "zoned": false, 00:14:18.472 "supported_io_types": { 00:14:18.472 "read": true, 00:14:18.472 "write": true, 00:14:18.472 "unmap": true, 00:14:18.472 "write_zeroes": true, 00:14:18.472 "flush": true, 00:14:18.472 "reset": true, 00:14:18.472 "compare": false, 00:14:18.472 "compare_and_write": false, 00:14:18.472 "abort": true, 00:14:18.472 "nvme_admin": false, 00:14:18.472 "nvme_io": false 00:14:18.472 }, 00:14:18.472 "memory_domains": [ 00:14:18.472 { 00:14:18.472 "dma_device_id": "system", 00:14:18.472 "dma_device_type": 1 00:14:18.472 }, 00:14:18.472 { 00:14:18.472 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.472 "dma_device_type": 2 00:14:18.472 } 00:14:18.472 ], 00:14:18.472 "driver_specific": {} 00:14:18.472 }' 00:14:18.472 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.472 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.472 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:18.472 15:53:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.730 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:18.730 15:53:24 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:18.730 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.730 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:18.730 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:18.730 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.730 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:18.730 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:18.730 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:18.730 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:18.730 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:18.988 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:18.988 "name": "BaseBdev3", 00:14:18.988 "aliases": [ 00:14:18.988 "2475a993-c3be-45b1-9adc-99c4533f5fb4" 00:14:18.988 ], 00:14:18.988 "product_name": "Malloc disk", 00:14:18.988 "block_size": 512, 00:14:18.988 "num_blocks": 65536, 00:14:18.988 "uuid": "2475a993-c3be-45b1-9adc-99c4533f5fb4", 00:14:18.988 "assigned_rate_limits": { 00:14:18.988 "rw_ios_per_sec": 0, 00:14:18.988 "rw_mbytes_per_sec": 0, 00:14:18.988 "r_mbytes_per_sec": 0, 00:14:18.988 "w_mbytes_per_sec": 0 00:14:18.988 }, 00:14:18.988 "claimed": true, 00:14:18.988 "claim_type": "exclusive_write", 00:14:18.988 "zoned": false, 00:14:18.988 "supported_io_types": { 00:14:18.988 "read": true, 00:14:18.988 "write": true, 00:14:18.988 "unmap": true, 00:14:18.988 "write_zeroes": true, 00:14:18.989 "flush": 
true, 00:14:18.989 "reset": true, 00:14:18.989 "compare": false, 00:14:18.989 "compare_and_write": false, 00:14:18.989 "abort": true, 00:14:18.989 "nvme_admin": false, 00:14:18.989 "nvme_io": false 00:14:18.989 }, 00:14:18.989 "memory_domains": [ 00:14:18.989 { 00:14:18.989 "dma_device_id": "system", 00:14:18.989 "dma_device_type": 1 00:14:18.989 }, 00:14:18.989 { 00:14:18.989 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:18.989 "dma_device_type": 2 00:14:18.989 } 00:14:18.989 ], 00:14:18.989 "driver_specific": {} 00:14:18.989 }' 00:14:18.989 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:18.989 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:19.247 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:19.247 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:19.247 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:19.247 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:19.247 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:19.247 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:19.247 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:19.247 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:19.248 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:19.506 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:19.506 15:53:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_delete Existed_Raid 00:14:19.506 [2024-06-10 15:53:24.996225] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:19.506 [2024-06-10 15:53:24.996250] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:19.506 [2024-06-10 15:53:24.996300] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:19.506 [2024-06-10 15:53:24.996351] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:19.506 [2024-06-10 15:53:24.996360] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1a3d550 name Existed_Raid, state offline 00:14:19.506 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2678437 00:14:19.765 15:53:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 2678437 ']' 00:14:19.765 15:53:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 2678437 00:14:19.765 15:53:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:14:19.765 15:53:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:14:19.765 15:53:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2678437 00:14:19.765 15:53:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:14:19.765 15:53:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:14:19.765 15:53:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2678437' 00:14:19.765 killing process with pid 2678437 00:14:19.765 15:53:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 2678437 00:14:19.765 [2024-06-10 15:53:25.044099] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:19.765 15:53:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 2678437 00:14:19.765 [2024-06-10 15:53:25.069716] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:19.765 15:53:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:14:19.765 00:14:19.765 real 0m29.059s 00:14:19.765 user 0m54.390s 00:14:19.765 sys 0m4.136s 00:14:19.765 15:53:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:19.765 15:53:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:14:19.765 ************************************ 00:14:19.765 END TEST raid_state_function_test_sb 00:14:19.765 ************************************ 00:14:20.024 15:53:25 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 3 00:14:20.024 15:53:25 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:14:20.024 15:53:25 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:20.024 15:53:25 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:20.024 ************************************ 00:14:20.024 START TEST raid_superblock_test 00:14:20.024 ************************************ 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test concat 3 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # 
base_bdevs_pt=() 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2683811 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2683811 /var/tmp/spdk-raid.sock 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 2683811 ']' 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:14:20.024 15:53:25 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:20.024 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:14:20.024 15:53:25 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:20.024 [2024-06-10 15:53:25.393905] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:14:20.024 [2024-06-10 15:53:25.393962] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2683811 ] 00:14:20.024 [2024-06-10 15:53:25.491711] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:20.283 [2024-06-10 15:53:25.587800] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:14:20.283 [2024-06-10 15:53:25.649027] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:20.283 [2024-06-10 15:53:25.649057] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:20.850 15:53:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:14:20.850 15:53:26 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:14:20.850 15:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:14:20.850 15:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:20.850 15:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:14:20.850 15:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:14:20.850 15:53:26 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:14:20.850 15:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:20.850 15:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:20.850 15:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:20.850 15:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc1 00:14:21.108 malloc1 00:14:21.108 15:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:21.407 [2024-06-10 15:53:26.846483] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:21.407 [2024-06-10 15:53:26.846527] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:21.407 [2024-06-10 15:53:26.846546] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x23750f0 00:14:21.407 [2024-06-10 15:53:26.846556] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:21.407 [2024-06-10 15:53:26.848262] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:21.407 [2024-06-10 15:53:26.848291] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:21.407 pt1 00:14:21.407 15:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:21.408 15:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:21.408 15:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:14:21.408 15:53:26 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:14:21.408 15:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:14:21.408 15:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:21.408 15:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:21.408 15:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:21.408 15:53:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:14:21.666 malloc2 00:14:21.666 15:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:21.925 [2024-06-10 15:53:27.360684] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:21.925 [2024-06-10 15:53:27.360727] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:21.925 [2024-06-10 15:53:27.360742] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2376400 00:14:21.925 [2024-06-10 15:53:27.360752] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:21.925 [2024-06-10 15:53:27.362303] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:21.925 [2024-06-10 15:53:27.362329] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:21.925 pt2 00:14:21.925 15:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:14:21.925 15:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:21.925 
15:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:14:21.925 15:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:14:21.925 15:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:14:21.925 15:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:14:21.925 15:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:14:21.925 15:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:14:21.925 15:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:14:22.183 malloc3 00:14:22.183 15:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:22.442 [2024-06-10 15:53:27.878523] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:22.442 [2024-06-10 15:53:27.878568] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:22.442 [2024-06-10 15:53:27.878583] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2522200 00:14:22.442 [2024-06-10 15:53:27.878593] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:22.442 [2024-06-10 15:53:27.880213] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:22.442 [2024-06-10 15:53:27.880242] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:22.442 pt3 00:14:22.442 15:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 
00:14:22.442 15:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:14:22.443 15:53:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:14:22.702 [2024-06-10 15:53:28.131223] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:22.702 [2024-06-10 15:53:28.132557] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:22.702 [2024-06-10 15:53:28.132613] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:22.702 [2024-06-10 15:53:28.132774] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x25209f0 00:14:22.702 [2024-06-10 15:53:28.132786] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:22.702 [2024-06-10 15:53:28.133009] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2521830 00:14:22.702 [2024-06-10 15:53:28.133157] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x25209f0 00:14:22.702 [2024-06-10 15:53:28.133166] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x25209f0 00:14:22.702 [2024-06-10 15:53:28.133264] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:22.702 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:22.702 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:22.702 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:22.702 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:22.702 15:53:28 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:22.702 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:22.702 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:22.702 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:22.702 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:22.702 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:22.702 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:22.702 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:22.962 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:22.962 "name": "raid_bdev1", 00:14:22.962 "uuid": "8dae06cb-7ace-4b95-9a8e-c7bbf97888b7", 00:14:22.962 "strip_size_kb": 64, 00:14:22.962 "state": "online", 00:14:22.962 "raid_level": "concat", 00:14:22.962 "superblock": true, 00:14:22.962 "num_base_bdevs": 3, 00:14:22.962 "num_base_bdevs_discovered": 3, 00:14:22.962 "num_base_bdevs_operational": 3, 00:14:22.962 "base_bdevs_list": [ 00:14:22.962 { 00:14:22.962 "name": "pt1", 00:14:22.962 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:22.962 "is_configured": true, 00:14:22.962 "data_offset": 2048, 00:14:22.962 "data_size": 63488 00:14:22.962 }, 00:14:22.962 { 00:14:22.962 "name": "pt2", 00:14:22.962 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:22.962 "is_configured": true, 00:14:22.962 "data_offset": 2048, 00:14:22.962 "data_size": 63488 00:14:22.962 }, 00:14:22.962 { 00:14:22.962 "name": "pt3", 00:14:22.962 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:22.962 "is_configured": true, 00:14:22.962 
"data_offset": 2048, 00:14:22.962 "data_size": 63488 00:14:22.962 } 00:14:22.962 ] 00:14:22.962 }' 00:14:22.962 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:22.962 15:53:28 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:23.530 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:14:23.530 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:23.530 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:23.530 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:23.530 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:23.530 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:23.530 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:23.530 15:53:28 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:23.789 [2024-06-10 15:53:29.049906] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:23.789 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:23.789 "name": "raid_bdev1", 00:14:23.789 "aliases": [ 00:14:23.789 "8dae06cb-7ace-4b95-9a8e-c7bbf97888b7" 00:14:23.789 ], 00:14:23.789 "product_name": "Raid Volume", 00:14:23.789 "block_size": 512, 00:14:23.789 "num_blocks": 190464, 00:14:23.789 "uuid": "8dae06cb-7ace-4b95-9a8e-c7bbf97888b7", 00:14:23.789 "assigned_rate_limits": { 00:14:23.789 "rw_ios_per_sec": 0, 00:14:23.789 "rw_mbytes_per_sec": 0, 00:14:23.789 "r_mbytes_per_sec": 0, 00:14:23.789 "w_mbytes_per_sec": 0 00:14:23.789 }, 00:14:23.789 "claimed": false, 00:14:23.789 
"zoned": false, 00:14:23.790 "supported_io_types": { 00:14:23.790 "read": true, 00:14:23.790 "write": true, 00:14:23.790 "unmap": true, 00:14:23.790 "write_zeroes": true, 00:14:23.790 "flush": true, 00:14:23.790 "reset": true, 00:14:23.790 "compare": false, 00:14:23.790 "compare_and_write": false, 00:14:23.790 "abort": false, 00:14:23.790 "nvme_admin": false, 00:14:23.790 "nvme_io": false 00:14:23.790 }, 00:14:23.790 "memory_domains": [ 00:14:23.790 { 00:14:23.790 "dma_device_id": "system", 00:14:23.790 "dma_device_type": 1 00:14:23.790 }, 00:14:23.790 { 00:14:23.790 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:23.790 "dma_device_type": 2 00:14:23.790 }, 00:14:23.790 { 00:14:23.790 "dma_device_id": "system", 00:14:23.790 "dma_device_type": 1 00:14:23.790 }, 00:14:23.790 { 00:14:23.790 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:23.790 "dma_device_type": 2 00:14:23.790 }, 00:14:23.790 { 00:14:23.790 "dma_device_id": "system", 00:14:23.790 "dma_device_type": 1 00:14:23.790 }, 00:14:23.790 { 00:14:23.790 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:23.790 "dma_device_type": 2 00:14:23.790 } 00:14:23.790 ], 00:14:23.790 "driver_specific": { 00:14:23.790 "raid": { 00:14:23.790 "uuid": "8dae06cb-7ace-4b95-9a8e-c7bbf97888b7", 00:14:23.790 "strip_size_kb": 64, 00:14:23.790 "state": "online", 00:14:23.790 "raid_level": "concat", 00:14:23.790 "superblock": true, 00:14:23.790 "num_base_bdevs": 3, 00:14:23.790 "num_base_bdevs_discovered": 3, 00:14:23.790 "num_base_bdevs_operational": 3, 00:14:23.790 "base_bdevs_list": [ 00:14:23.790 { 00:14:23.790 "name": "pt1", 00:14:23.790 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:23.790 "is_configured": true, 00:14:23.790 "data_offset": 2048, 00:14:23.790 "data_size": 63488 00:14:23.790 }, 00:14:23.790 { 00:14:23.790 "name": "pt2", 00:14:23.790 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:23.790 "is_configured": true, 00:14:23.790 "data_offset": 2048, 00:14:23.790 "data_size": 63488 00:14:23.790 }, 
00:14:23.790 { 00:14:23.790 "name": "pt3", 00:14:23.790 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:23.790 "is_configured": true, 00:14:23.790 "data_offset": 2048, 00:14:23.790 "data_size": 63488 00:14:23.790 } 00:14:23.790 ] 00:14:23.790 } 00:14:23.790 } 00:14:23.790 }' 00:14:23.790 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:23.790 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:23.790 pt2 00:14:23.790 pt3' 00:14:23.790 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:23.790 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:23.790 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:24.114 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:24.114 "name": "pt1", 00:14:24.114 "aliases": [ 00:14:24.114 "00000000-0000-0000-0000-000000000001" 00:14:24.114 ], 00:14:24.114 "product_name": "passthru", 00:14:24.114 "block_size": 512, 00:14:24.114 "num_blocks": 65536, 00:14:24.114 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:24.114 "assigned_rate_limits": { 00:14:24.114 "rw_ios_per_sec": 0, 00:14:24.114 "rw_mbytes_per_sec": 0, 00:14:24.114 "r_mbytes_per_sec": 0, 00:14:24.114 "w_mbytes_per_sec": 0 00:14:24.114 }, 00:14:24.114 "claimed": true, 00:14:24.114 "claim_type": "exclusive_write", 00:14:24.114 "zoned": false, 00:14:24.114 "supported_io_types": { 00:14:24.114 "read": true, 00:14:24.114 "write": true, 00:14:24.114 "unmap": true, 00:14:24.114 "write_zeroes": true, 00:14:24.114 "flush": true, 00:14:24.114 "reset": true, 00:14:24.114 "compare": false, 00:14:24.114 "compare_and_write": false, 00:14:24.114 "abort": true, 
00:14:24.114 "nvme_admin": false, 00:14:24.114 "nvme_io": false 00:14:24.114 }, 00:14:24.114 "memory_domains": [ 00:14:24.114 { 00:14:24.114 "dma_device_id": "system", 00:14:24.114 "dma_device_type": 1 00:14:24.114 }, 00:14:24.115 { 00:14:24.115 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:24.115 "dma_device_type": 2 00:14:24.115 } 00:14:24.115 ], 00:14:24.115 "driver_specific": { 00:14:24.115 "passthru": { 00:14:24.115 "name": "pt1", 00:14:24.115 "base_bdev_name": "malloc1" 00:14:24.115 } 00:14:24.115 } 00:14:24.115 }' 00:14:24.115 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:24.115 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:24.115 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:24.115 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:24.115 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:24.115 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:24.115 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:24.115 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:24.115 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:24.115 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:24.374 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:24.374 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:24.374 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:24.374 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt2 00:14:24.374 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:24.374 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:24.374 "name": "pt2", 00:14:24.374 "aliases": [ 00:14:24.374 "00000000-0000-0000-0000-000000000002" 00:14:24.374 ], 00:14:24.374 "product_name": "passthru", 00:14:24.374 "block_size": 512, 00:14:24.374 "num_blocks": 65536, 00:14:24.374 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:24.374 "assigned_rate_limits": { 00:14:24.374 "rw_ios_per_sec": 0, 00:14:24.374 "rw_mbytes_per_sec": 0, 00:14:24.374 "r_mbytes_per_sec": 0, 00:14:24.374 "w_mbytes_per_sec": 0 00:14:24.374 }, 00:14:24.374 "claimed": true, 00:14:24.374 "claim_type": "exclusive_write", 00:14:24.374 "zoned": false, 00:14:24.374 "supported_io_types": { 00:14:24.374 "read": true, 00:14:24.374 "write": true, 00:14:24.374 "unmap": true, 00:14:24.374 "write_zeroes": true, 00:14:24.374 "flush": true, 00:14:24.374 "reset": true, 00:14:24.374 "compare": false, 00:14:24.374 "compare_and_write": false, 00:14:24.374 "abort": true, 00:14:24.374 "nvme_admin": false, 00:14:24.374 "nvme_io": false 00:14:24.374 }, 00:14:24.374 "memory_domains": [ 00:14:24.374 { 00:14:24.374 "dma_device_id": "system", 00:14:24.374 "dma_device_type": 1 00:14:24.374 }, 00:14:24.374 { 00:14:24.374 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:24.374 "dma_device_type": 2 00:14:24.374 } 00:14:24.374 ], 00:14:24.374 "driver_specific": { 00:14:24.374 "passthru": { 00:14:24.374 "name": "pt2", 00:14:24.374 "base_bdev_name": "malloc2" 00:14:24.374 } 00:14:24.374 } 00:14:24.374 }' 00:14:24.374 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:24.633 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:24.633 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:24.633 15:53:29 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:24.633 15:53:29 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:24.633 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:24.633 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:24.633 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:24.633 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:24.633 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:24.893 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:24.893 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:24.893 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:24.893 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:24.893 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:25.152 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:25.152 "name": "pt3", 00:14:25.152 "aliases": [ 00:14:25.152 "00000000-0000-0000-0000-000000000003" 00:14:25.152 ], 00:14:25.152 "product_name": "passthru", 00:14:25.152 "block_size": 512, 00:14:25.152 "num_blocks": 65536, 00:14:25.152 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:25.152 "assigned_rate_limits": { 00:14:25.152 "rw_ios_per_sec": 0, 00:14:25.152 "rw_mbytes_per_sec": 0, 00:14:25.152 "r_mbytes_per_sec": 0, 00:14:25.152 "w_mbytes_per_sec": 0 00:14:25.152 }, 00:14:25.152 "claimed": true, 00:14:25.152 "claim_type": "exclusive_write", 00:14:25.152 "zoned": false, 00:14:25.152 "supported_io_types": { 00:14:25.152 "read": true, 
00:14:25.152 "write": true, 00:14:25.152 "unmap": true, 00:14:25.152 "write_zeroes": true, 00:14:25.152 "flush": true, 00:14:25.152 "reset": true, 00:14:25.152 "compare": false, 00:14:25.152 "compare_and_write": false, 00:14:25.152 "abort": true, 00:14:25.152 "nvme_admin": false, 00:14:25.152 "nvme_io": false 00:14:25.152 }, 00:14:25.152 "memory_domains": [ 00:14:25.152 { 00:14:25.152 "dma_device_id": "system", 00:14:25.152 "dma_device_type": 1 00:14:25.152 }, 00:14:25.152 { 00:14:25.152 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:25.152 "dma_device_type": 2 00:14:25.152 } 00:14:25.152 ], 00:14:25.152 "driver_specific": { 00:14:25.152 "passthru": { 00:14:25.152 "name": "pt3", 00:14:25.152 "base_bdev_name": "malloc3" 00:14:25.152 } 00:14:25.152 } 00:14:25.152 }' 00:14:25.152 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:25.152 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:25.152 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:25.152 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:25.152 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:25.411 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:25.411 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:25.411 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:25.411 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:25.411 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:25.411 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:25.411 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:25.411 15:53:30 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:14:25.411 15:53:30 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:25.670 [2024-06-10 15:53:31.067303] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:25.670 15:53:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=8dae06cb-7ace-4b95-9a8e-c7bbf97888b7 00:14:25.670 15:53:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 8dae06cb-7ace-4b95-9a8e-c7bbf97888b7 ']' 00:14:25.670 15:53:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:25.929 [2024-06-10 15:53:31.323725] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:25.929 [2024-06-10 15:53:31.323741] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:25.929 [2024-06-10 15:53:31.323789] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:25.929 [2024-06-10 15:53:31.323840] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:25.929 [2024-06-10 15:53:31.323849] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x25209f0 name raid_bdev1, state offline 00:14:25.929 15:53:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:25.929 15:53:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:14:26.189 15:53:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:14:26.189 15:53:31 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:14:26.189 15:53:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:26.189 15:53:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:14:26.448 15:53:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:26.448 15:53:31 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:26.718 15:53:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:14:26.718 15:53:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:14:26.980 15:53:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:14:26.980 15:53:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 
'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:14:27.239 [2024-06-10 15:53:32.719385] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:14:27.239 [2024-06-10 15:53:32.720810] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:14:27.239 [2024-06-10 15:53:32.720853] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:14:27.239 [2024-06-10 15:53:32.720898] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev 
malloc1 00:14:27.239 [2024-06-10 15:53:32.720933] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:14:27.239 [2024-06-10 15:53:32.720954] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:14:27.239 [2024-06-10 15:53:32.720983] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:27.239 [2024-06-10 15:53:32.720992] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x251e3a0 name raid_bdev1, state configuring 00:14:27.239 request: 00:14:27.239 { 00:14:27.239 "name": "raid_bdev1", 00:14:27.239 "raid_level": "concat", 00:14:27.239 "base_bdevs": [ 00:14:27.239 "malloc1", 00:14:27.239 "malloc2", 00:14:27.239 "malloc3" 00:14:27.239 ], 00:14:27.239 "superblock": false, 00:14:27.239 "strip_size_kb": 64, 00:14:27.239 "method": "bdev_raid_create", 00:14:27.239 "req_id": 1 00:14:27.239 } 00:14:27.239 Got JSON-RPC error response 00:14:27.239 response: 00:14:27.239 { 00:14:27.239 "code": -17, 00:14:27.239 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:14:27.239 } 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:27.239 15:53:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:14:27.498 15:53:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:14:27.498 
15:53:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:14:27.498 15:53:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:14:27.757 [2024-06-10 15:53:33.220655] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:14:27.757 [2024-06-10 15:53:33.220699] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:27.757 [2024-06-10 15:53:33.220714] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2524320 00:14:27.757 [2024-06-10 15:53:33.220724] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:27.757 [2024-06-10 15:53:33.222385] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:27.757 [2024-06-10 15:53:33.222416] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:14:27.757 [2024-06-10 15:53:33.222479] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:14:27.757 [2024-06-10 15:53:33.222504] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:14:27.757 pt1 00:14:27.757 15:53:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:14:27.757 15:53:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:27.757 15:53:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:27.757 15:53:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:27.757 15:53:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:27.757 15:53:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:14:27.757 15:53:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:27.757 15:53:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:27.757 15:53:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:27.757 15:53:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:27.757 15:53:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:27.757 15:53:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:28.016 15:53:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:28.016 "name": "raid_bdev1", 00:14:28.016 "uuid": "8dae06cb-7ace-4b95-9a8e-c7bbf97888b7", 00:14:28.016 "strip_size_kb": 64, 00:14:28.016 "state": "configuring", 00:14:28.016 "raid_level": "concat", 00:14:28.016 "superblock": true, 00:14:28.016 "num_base_bdevs": 3, 00:14:28.016 "num_base_bdevs_discovered": 1, 00:14:28.016 "num_base_bdevs_operational": 3, 00:14:28.016 "base_bdevs_list": [ 00:14:28.016 { 00:14:28.016 "name": "pt1", 00:14:28.016 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:28.016 "is_configured": true, 00:14:28.016 "data_offset": 2048, 00:14:28.016 "data_size": 63488 00:14:28.016 }, 00:14:28.016 { 00:14:28.016 "name": null, 00:14:28.016 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:28.016 "is_configured": false, 00:14:28.016 "data_offset": 2048, 00:14:28.016 "data_size": 63488 00:14:28.016 }, 00:14:28.016 { 00:14:28.016 "name": null, 00:14:28.016 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:28.016 "is_configured": false, 00:14:28.016 "data_offset": 2048, 00:14:28.016 "data_size": 63488 00:14:28.016 } 00:14:28.016 ] 00:14:28.016 }' 00:14:28.016 15:53:33 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:28.016 15:53:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:28.584 15:53:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:14:28.584 15:53:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:28.843 [2024-06-10 15:53:34.179241] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:28.843 [2024-06-10 15:53:34.179287] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:28.843 [2024-06-10 15:53:34.179303] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2523240 00:14:28.843 [2024-06-10 15:53:34.179313] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:28.843 [2024-06-10 15:53:34.179654] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:28.843 [2024-06-10 15:53:34.179670] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:28.843 [2024-06-10 15:53:34.179729] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:28.843 [2024-06-10 15:53:34.179747] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:28.843 pt2 00:14:28.843 15:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:14:29.102 [2024-06-10 15:53:34.431928] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:14:29.102 15:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 3 00:14:29.102 15:53:34 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:29.102 15:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:29.102 15:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:29.102 15:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:29.102 15:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:29.102 15:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:29.102 15:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:29.102 15:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:29.102 15:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:29.102 15:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:29.102 15:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:29.362 15:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:29.362 "name": "raid_bdev1", 00:14:29.362 "uuid": "8dae06cb-7ace-4b95-9a8e-c7bbf97888b7", 00:14:29.362 "strip_size_kb": 64, 00:14:29.362 "state": "configuring", 00:14:29.362 "raid_level": "concat", 00:14:29.362 "superblock": true, 00:14:29.362 "num_base_bdevs": 3, 00:14:29.362 "num_base_bdevs_discovered": 1, 00:14:29.362 "num_base_bdevs_operational": 3, 00:14:29.362 "base_bdevs_list": [ 00:14:29.362 { 00:14:29.362 "name": "pt1", 00:14:29.362 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:29.362 "is_configured": true, 00:14:29.362 "data_offset": 2048, 00:14:29.362 "data_size": 63488 00:14:29.362 }, 00:14:29.362 { 00:14:29.362 
"name": null, 00:14:29.362 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:29.362 "is_configured": false, 00:14:29.362 "data_offset": 2048, 00:14:29.362 "data_size": 63488 00:14:29.362 }, 00:14:29.362 { 00:14:29.362 "name": null, 00:14:29.362 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:29.362 "is_configured": false, 00:14:29.362 "data_offset": 2048, 00:14:29.362 "data_size": 63488 00:14:29.362 } 00:14:29.362 ] 00:14:29.362 }' 00:14:29.362 15:53:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:29.362 15:53:34 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:29.930 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:14:29.930 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:29.930 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:14:30.189 [2024-06-10 15:53:35.570972] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:14:30.189 [2024-06-10 15:53:35.571021] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:30.189 [2024-06-10 15:53:35.571037] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x25207c0 00:14:30.189 [2024-06-10 15:53:35.571047] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:30.189 [2024-06-10 15:53:35.571380] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:30.189 [2024-06-10 15:53:35.571395] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:14:30.189 [2024-06-10 15:53:35.571455] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:14:30.189 [2024-06-10 15:53:35.571473] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:14:30.189 pt2 00:14:30.189 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:30.189 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:30.189 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:14:30.447 [2024-06-10 15:53:35.747438] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:14:30.448 [2024-06-10 15:53:35.747471] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:30.448 [2024-06-10 15:53:35.747486] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2526510 00:14:30.448 [2024-06-10 15:53:35.747495] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:30.448 [2024-06-10 15:53:35.747787] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:30.448 [2024-06-10 15:53:35.747802] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:14:30.448 [2024-06-10 15:53:35.747851] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:14:30.448 [2024-06-10 15:53:35.747868] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:14:30.448 [2024-06-10 15:53:35.747986] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2523560 00:14:30.448 [2024-06-10 15:53:35.747996] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:30.448 [2024-06-10 15:53:35.748172] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2526240 00:14:30.448 [2024-06-10 15:53:35.748300] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2523560 
00:14:30.448 [2024-06-10 15:53:35.748308] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x2523560 00:14:30.448 [2024-06-10 15:53:35.748404] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:30.448 pt3 00:14:30.448 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:14:30.448 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:14:30.448 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:30.448 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:30.448 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:30.448 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:30.448 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:30.448 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:30.448 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:30.448 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:30.448 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:30.448 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:30.448 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:30.448 15:53:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:30.706 15:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:14:30.707 "name": "raid_bdev1", 00:14:30.707 "uuid": "8dae06cb-7ace-4b95-9a8e-c7bbf97888b7", 00:14:30.707 "strip_size_kb": 64, 00:14:30.707 "state": "online", 00:14:30.707 "raid_level": "concat", 00:14:30.707 "superblock": true, 00:14:30.707 "num_base_bdevs": 3, 00:14:30.707 "num_base_bdevs_discovered": 3, 00:14:30.707 "num_base_bdevs_operational": 3, 00:14:30.707 "base_bdevs_list": [ 00:14:30.707 { 00:14:30.707 "name": "pt1", 00:14:30.707 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:30.707 "is_configured": true, 00:14:30.707 "data_offset": 2048, 00:14:30.707 "data_size": 63488 00:14:30.707 }, 00:14:30.707 { 00:14:30.707 "name": "pt2", 00:14:30.707 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:30.707 "is_configured": true, 00:14:30.707 "data_offset": 2048, 00:14:30.707 "data_size": 63488 00:14:30.707 }, 00:14:30.707 { 00:14:30.707 "name": "pt3", 00:14:30.707 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:30.707 "is_configured": true, 00:14:30.707 "data_offset": 2048, 00:14:30.707 "data_size": 63488 00:14:30.707 } 00:14:30.707 ] 00:14:30.707 }' 00:14:30.707 15:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:30.707 15:53:36 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:31.274 15:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:14:31.274 15:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:14:31.274 15:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:31.274 15:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:31.274 15:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:31.274 15:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:31.274 15:53:36 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:31.274 15:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:31.533 [2024-06-10 15:53:36.898799] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:31.533 15:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:31.533 "name": "raid_bdev1", 00:14:31.533 "aliases": [ 00:14:31.533 "8dae06cb-7ace-4b95-9a8e-c7bbf97888b7" 00:14:31.533 ], 00:14:31.533 "product_name": "Raid Volume", 00:14:31.533 "block_size": 512, 00:14:31.533 "num_blocks": 190464, 00:14:31.533 "uuid": "8dae06cb-7ace-4b95-9a8e-c7bbf97888b7", 00:14:31.533 "assigned_rate_limits": { 00:14:31.533 "rw_ios_per_sec": 0, 00:14:31.533 "rw_mbytes_per_sec": 0, 00:14:31.533 "r_mbytes_per_sec": 0, 00:14:31.533 "w_mbytes_per_sec": 0 00:14:31.533 }, 00:14:31.533 "claimed": false, 00:14:31.533 "zoned": false, 00:14:31.533 "supported_io_types": { 00:14:31.533 "read": true, 00:14:31.533 "write": true, 00:14:31.533 "unmap": true, 00:14:31.533 "write_zeroes": true, 00:14:31.533 "flush": true, 00:14:31.533 "reset": true, 00:14:31.533 "compare": false, 00:14:31.533 "compare_and_write": false, 00:14:31.533 "abort": false, 00:14:31.533 "nvme_admin": false, 00:14:31.533 "nvme_io": false 00:14:31.533 }, 00:14:31.533 "memory_domains": [ 00:14:31.533 { 00:14:31.533 "dma_device_id": "system", 00:14:31.533 "dma_device_type": 1 00:14:31.533 }, 00:14:31.533 { 00:14:31.533 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:31.533 "dma_device_type": 2 00:14:31.533 }, 00:14:31.533 { 00:14:31.533 "dma_device_id": "system", 00:14:31.533 "dma_device_type": 1 00:14:31.533 }, 00:14:31.533 { 00:14:31.533 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:31.533 "dma_device_type": 2 00:14:31.533 }, 00:14:31.533 { 00:14:31.533 "dma_device_id": "system", 00:14:31.533 "dma_device_type": 1 00:14:31.533 }, 
00:14:31.533 { 00:14:31.533 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:31.533 "dma_device_type": 2 00:14:31.533 } 00:14:31.533 ], 00:14:31.533 "driver_specific": { 00:14:31.533 "raid": { 00:14:31.533 "uuid": "8dae06cb-7ace-4b95-9a8e-c7bbf97888b7", 00:14:31.533 "strip_size_kb": 64, 00:14:31.533 "state": "online", 00:14:31.533 "raid_level": "concat", 00:14:31.533 "superblock": true, 00:14:31.533 "num_base_bdevs": 3, 00:14:31.533 "num_base_bdevs_discovered": 3, 00:14:31.533 "num_base_bdevs_operational": 3, 00:14:31.533 "base_bdevs_list": [ 00:14:31.533 { 00:14:31.533 "name": "pt1", 00:14:31.533 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:31.533 "is_configured": true, 00:14:31.533 "data_offset": 2048, 00:14:31.533 "data_size": 63488 00:14:31.533 }, 00:14:31.533 { 00:14:31.533 "name": "pt2", 00:14:31.533 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:31.533 "is_configured": true, 00:14:31.533 "data_offset": 2048, 00:14:31.533 "data_size": 63488 00:14:31.533 }, 00:14:31.533 { 00:14:31.533 "name": "pt3", 00:14:31.533 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:31.533 "is_configured": true, 00:14:31.533 "data_offset": 2048, 00:14:31.533 "data_size": 63488 00:14:31.533 } 00:14:31.533 ] 00:14:31.533 } 00:14:31.533 } 00:14:31.533 }' 00:14:31.533 15:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:31.533 15:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:14:31.533 pt2 00:14:31.533 pt3' 00:14:31.533 15:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:31.533 15:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:14:31.533 15:53:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:31.793 
15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:31.793 "name": "pt1", 00:14:31.793 "aliases": [ 00:14:31.793 "00000000-0000-0000-0000-000000000001" 00:14:31.793 ], 00:14:31.793 "product_name": "passthru", 00:14:31.793 "block_size": 512, 00:14:31.793 "num_blocks": 65536, 00:14:31.793 "uuid": "00000000-0000-0000-0000-000000000001", 00:14:31.793 "assigned_rate_limits": { 00:14:31.793 "rw_ios_per_sec": 0, 00:14:31.793 "rw_mbytes_per_sec": 0, 00:14:31.793 "r_mbytes_per_sec": 0, 00:14:31.793 "w_mbytes_per_sec": 0 00:14:31.793 }, 00:14:31.793 "claimed": true, 00:14:31.793 "claim_type": "exclusive_write", 00:14:31.793 "zoned": false, 00:14:31.793 "supported_io_types": { 00:14:31.793 "read": true, 00:14:31.793 "write": true, 00:14:31.793 "unmap": true, 00:14:31.793 "write_zeroes": true, 00:14:31.793 "flush": true, 00:14:31.793 "reset": true, 00:14:31.793 "compare": false, 00:14:31.793 "compare_and_write": false, 00:14:31.793 "abort": true, 00:14:31.793 "nvme_admin": false, 00:14:31.793 "nvme_io": false 00:14:31.793 }, 00:14:31.793 "memory_domains": [ 00:14:31.793 { 00:14:31.793 "dma_device_id": "system", 00:14:31.793 "dma_device_type": 1 00:14:31.793 }, 00:14:31.793 { 00:14:31.793 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:31.793 "dma_device_type": 2 00:14:31.793 } 00:14:31.793 ], 00:14:31.793 "driver_specific": { 00:14:31.793 "passthru": { 00:14:31.793 "name": "pt1", 00:14:31.793 "base_bdev_name": "malloc1" 00:14:31.793 } 00:14:31.793 } 00:14:31.793 }' 00:14:31.793 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:31.793 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:32.052 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:32.052 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:32.052 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq 
.md_size 00:14:32.052 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:32.052 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:32.052 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:32.052 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:32.052 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:32.052 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:32.311 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:32.311 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:32.311 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:14:32.311 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:32.570 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:32.570 "name": "pt2", 00:14:32.570 "aliases": [ 00:14:32.570 "00000000-0000-0000-0000-000000000002" 00:14:32.570 ], 00:14:32.570 "product_name": "passthru", 00:14:32.570 "block_size": 512, 00:14:32.570 "num_blocks": 65536, 00:14:32.570 "uuid": "00000000-0000-0000-0000-000000000002", 00:14:32.570 "assigned_rate_limits": { 00:14:32.570 "rw_ios_per_sec": 0, 00:14:32.570 "rw_mbytes_per_sec": 0, 00:14:32.570 "r_mbytes_per_sec": 0, 00:14:32.570 "w_mbytes_per_sec": 0 00:14:32.570 }, 00:14:32.570 "claimed": true, 00:14:32.570 "claim_type": "exclusive_write", 00:14:32.570 "zoned": false, 00:14:32.570 "supported_io_types": { 00:14:32.570 "read": true, 00:14:32.570 "write": true, 00:14:32.570 "unmap": true, 00:14:32.570 "write_zeroes": true, 00:14:32.570 "flush": true, 00:14:32.570 
"reset": true, 00:14:32.570 "compare": false, 00:14:32.570 "compare_and_write": false, 00:14:32.570 "abort": true, 00:14:32.570 "nvme_admin": false, 00:14:32.570 "nvme_io": false 00:14:32.570 }, 00:14:32.570 "memory_domains": [ 00:14:32.570 { 00:14:32.570 "dma_device_id": "system", 00:14:32.570 "dma_device_type": 1 00:14:32.570 }, 00:14:32.570 { 00:14:32.570 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:32.570 "dma_device_type": 2 00:14:32.570 } 00:14:32.570 ], 00:14:32.570 "driver_specific": { 00:14:32.570 "passthru": { 00:14:32.570 "name": "pt2", 00:14:32.570 "base_bdev_name": "malloc2" 00:14:32.570 } 00:14:32.570 } 00:14:32.570 }' 00:14:32.570 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:32.570 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:32.570 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:32.570 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:32.570 15:53:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:32.570 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:32.570 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:32.570 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:32.829 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:32.829 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:32.829 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:32.829 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:32.829 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:32.829 15:53:38 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:14:32.829 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:33.088 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:33.088 "name": "pt3", 00:14:33.088 "aliases": [ 00:14:33.088 "00000000-0000-0000-0000-000000000003" 00:14:33.088 ], 00:14:33.088 "product_name": "passthru", 00:14:33.088 "block_size": 512, 00:14:33.088 "num_blocks": 65536, 00:14:33.088 "uuid": "00000000-0000-0000-0000-000000000003", 00:14:33.088 "assigned_rate_limits": { 00:14:33.088 "rw_ios_per_sec": 0, 00:14:33.088 "rw_mbytes_per_sec": 0, 00:14:33.088 "r_mbytes_per_sec": 0, 00:14:33.088 "w_mbytes_per_sec": 0 00:14:33.088 }, 00:14:33.088 "claimed": true, 00:14:33.088 "claim_type": "exclusive_write", 00:14:33.088 "zoned": false, 00:14:33.088 "supported_io_types": { 00:14:33.088 "read": true, 00:14:33.088 "write": true, 00:14:33.088 "unmap": true, 00:14:33.088 "write_zeroes": true, 00:14:33.088 "flush": true, 00:14:33.088 "reset": true, 00:14:33.088 "compare": false, 00:14:33.088 "compare_and_write": false, 00:14:33.088 "abort": true, 00:14:33.088 "nvme_admin": false, 00:14:33.088 "nvme_io": false 00:14:33.088 }, 00:14:33.088 "memory_domains": [ 00:14:33.088 { 00:14:33.088 "dma_device_id": "system", 00:14:33.088 "dma_device_type": 1 00:14:33.088 }, 00:14:33.088 { 00:14:33.088 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:33.088 "dma_device_type": 2 00:14:33.088 } 00:14:33.088 ], 00:14:33.088 "driver_specific": { 00:14:33.088 "passthru": { 00:14:33.088 "name": "pt3", 00:14:33.088 "base_bdev_name": "malloc3" 00:14:33.088 } 00:14:33.088 } 00:14:33.088 }' 00:14:33.088 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:33.088 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:33.088 15:53:38 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:33.088 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:33.088 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:33.088 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:33.088 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:33.088 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:33.347 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:33.347 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:33.347 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:33.347 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:33.347 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:14:33.347 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:14:33.606 [2024-06-10 15:53:38.952321] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:33.606 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 8dae06cb-7ace-4b95-9a8e-c7bbf97888b7 '!=' 8dae06cb-7ace-4b95-9a8e-c7bbf97888b7 ']' 00:14:33.606 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:14:33.606 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:33.606 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:33.606 15:53:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2683811 00:14:33.606 15:53:38 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 2683811 ']' 00:14:33.606 15:53:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 2683811 00:14:33.606 15:53:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:14:33.606 15:53:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:14:33.606 15:53:38 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2683811 00:14:33.606 15:53:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:14:33.606 15:53:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:14:33.606 15:53:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2683811' 00:14:33.606 killing process with pid 2683811 00:14:33.606 15:53:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 2683811 00:14:33.606 [2024-06-10 15:53:39.015100] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:33.606 [2024-06-10 15:53:39.015156] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:33.606 [2024-06-10 15:53:39.015210] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:33.606 [2024-06-10 15:53:39.015218] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2523560 name raid_bdev1, state offline 00:14:33.606 15:53:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 2683811 00:14:33.606 [2024-06-10 15:53:39.040500] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:33.866 15:53:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:14:33.866 00:14:33.866 real 0m13.901s 00:14:33.866 user 0m25.542s 00:14:33.866 sys 0m2.011s 00:14:33.866 15:53:39 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:33.866 15:53:39 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:14:33.866 ************************************ 00:14:33.866 END TEST raid_superblock_test 00:14:33.867 ************************************ 00:14:33.867 15:53:39 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 3 read 00:14:33.867 15:53:39 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:14:33.867 15:53:39 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:33.867 15:53:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:33.867 ************************************ 00:14:33.867 START TEST raid_read_error_test 00:14:33.867 ************************************ 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 3 read 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:33.867 
15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.NPG358LumH 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2686380 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2686380 /var/tmp/spdk-raid.sock 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 2686380 ']' 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:33.867 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:14:33.867 15:53:39 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:33.867 [2024-06-10 15:53:39.370161] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:14:33.867 [2024-06-10 15:53:39.370213] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2686380 ] 00:14:34.126 [2024-06-10 15:53:39.469456] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:34.126 [2024-06-10 15:53:39.564239] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:14:34.126 [2024-06-10 15:53:39.621625] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:34.126 [2024-06-10 15:53:39.621654] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:35.063 15:53:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:14:35.063 15:53:40 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:14:35.063 15:53:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:35.063 15:53:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:35.063 BaseBdev1_malloc 00:14:35.322 15:53:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:35.322 true 00:14:35.581 15:53:40 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:35.581 [2024-06-10 15:53:41.067227] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:35.581 [2024-06-10 15:53:41.067266] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:35.581 
[2024-06-10 15:53:41.067286] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1253150 00:14:35.581 [2024-06-10 15:53:41.067296] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:35.581 [2024-06-10 15:53:41.069132] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:35.581 [2024-06-10 15:53:41.069162] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:35.581 BaseBdev1 00:14:35.581 15:53:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:35.581 15:53:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:35.840 BaseBdev2_malloc 00:14:35.840 15:53:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:36.098 true 00:14:36.098 15:53:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:36.358 [2024-06-10 15:53:41.825856] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:36.358 [2024-06-10 15:53:41.825896] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:36.358 [2024-06-10 15:53:41.825915] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1257b50 00:14:36.358 [2024-06-10 15:53:41.825925] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:36.358 [2024-06-10 15:53:41.827514] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:36.358 [2024-06-10 15:53:41.827542] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:36.358 BaseBdev2 00:14:36.358 15:53:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:36.358 15:53:41 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:36.617 BaseBdev3_malloc 00:14:36.617 15:53:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:36.876 true 00:14:36.876 15:53:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:37.134 [2024-06-10 15:53:42.508149] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:37.134 [2024-06-10 15:53:42.508185] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:37.134 [2024-06-10 15:53:42.508204] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1258780 00:14:37.134 [2024-06-10 15:53:42.508214] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:37.134 [2024-06-10 15:53:42.509740] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:37.134 [2024-06-10 15:53:42.509766] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:37.134 BaseBdev3 00:14:37.134 15:53:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:37.393 [2024-06-10 15:53:42.748814] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:37.393 [2024-06-10 15:53:42.750128] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:37.393 [2024-06-10 15:53:42.750199] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:37.393 [2024-06-10 15:53:42.750404] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x125b5a0 00:14:37.393 [2024-06-10 15:53:42.750415] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:37.393 [2024-06-10 15:53:42.750606] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1254b60 00:14:37.393 [2024-06-10 15:53:42.750760] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x125b5a0 00:14:37.393 [2024-06-10 15:53:42.750769] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x125b5a0 00:14:37.393 [2024-06-10 15:53:42.750871] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:37.393 15:53:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:37.393 15:53:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:37.393 15:53:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:37.393 15:53:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:37.393 15:53:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:37.393 15:53:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:37.393 15:53:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:37.393 15:53:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:14:37.393 15:53:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:37.393 15:53:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:37.393 15:53:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:37.393 15:53:42 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:37.652 15:53:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:37.652 "name": "raid_bdev1", 00:14:37.652 "uuid": "49c15b12-e1bd-4164-a0c3-9a1f338528a5", 00:14:37.652 "strip_size_kb": 64, 00:14:37.652 "state": "online", 00:14:37.652 "raid_level": "concat", 00:14:37.652 "superblock": true, 00:14:37.652 "num_base_bdevs": 3, 00:14:37.652 "num_base_bdevs_discovered": 3, 00:14:37.652 "num_base_bdevs_operational": 3, 00:14:37.652 "base_bdevs_list": [ 00:14:37.652 { 00:14:37.652 "name": "BaseBdev1", 00:14:37.652 "uuid": "09f5993b-0d6c-52b5-a246-302ff63693d2", 00:14:37.652 "is_configured": true, 00:14:37.652 "data_offset": 2048, 00:14:37.652 "data_size": 63488 00:14:37.652 }, 00:14:37.652 { 00:14:37.652 "name": "BaseBdev2", 00:14:37.652 "uuid": "d0f186e7-7ed8-5e43-b952-385acc2fa83e", 00:14:37.652 "is_configured": true, 00:14:37.652 "data_offset": 2048, 00:14:37.652 "data_size": 63488 00:14:37.652 }, 00:14:37.652 { 00:14:37.652 "name": "BaseBdev3", 00:14:37.652 "uuid": "f45b431c-1516-526c-a2b0-a3f305c0a49e", 00:14:37.652 "is_configured": true, 00:14:37.652 "data_offset": 2048, 00:14:37.652 "data_size": 63488 00:14:37.652 } 00:14:37.652 ] 00:14:37.652 }' 00:14:37.652 15:53:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:37.652 15:53:43 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:38.252 15:53:43 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@824 -- # sleep 1 00:14:38.252 15:53:43 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:38.252 [2024-06-10 15:53:43.747712] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdb0d50 00:14:39.188 15:53:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:14:39.447 15:53:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:39.447 15:53:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:14:39.447 15:53:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:39.447 15:53:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:39.447 15:53:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:39.447 15:53:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:39.447 15:53:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:39.447 15:53:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:39.447 15:53:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:39.447 15:53:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:39.447 15:53:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:39.447 15:53:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:39.447 15:53:44 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:14:39.447 15:53:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:39.447 15:53:44 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:39.706 15:53:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:39.706 "name": "raid_bdev1", 00:14:39.706 "uuid": "49c15b12-e1bd-4164-a0c3-9a1f338528a5", 00:14:39.706 "strip_size_kb": 64, 00:14:39.706 "state": "online", 00:14:39.706 "raid_level": "concat", 00:14:39.706 "superblock": true, 00:14:39.706 "num_base_bdevs": 3, 00:14:39.706 "num_base_bdevs_discovered": 3, 00:14:39.706 "num_base_bdevs_operational": 3, 00:14:39.706 "base_bdevs_list": [ 00:14:39.706 { 00:14:39.706 "name": "BaseBdev1", 00:14:39.706 "uuid": "09f5993b-0d6c-52b5-a246-302ff63693d2", 00:14:39.706 "is_configured": true, 00:14:39.706 "data_offset": 2048, 00:14:39.706 "data_size": 63488 00:14:39.706 }, 00:14:39.706 { 00:14:39.706 "name": "BaseBdev2", 00:14:39.706 "uuid": "d0f186e7-7ed8-5e43-b952-385acc2fa83e", 00:14:39.706 "is_configured": true, 00:14:39.706 "data_offset": 2048, 00:14:39.706 "data_size": 63488 00:14:39.706 }, 00:14:39.706 { 00:14:39.706 "name": "BaseBdev3", 00:14:39.706 "uuid": "f45b431c-1516-526c-a2b0-a3f305c0a49e", 00:14:39.706 "is_configured": true, 00:14:39.706 "data_offset": 2048, 00:14:39.706 "data_size": 63488 00:14:39.706 } 00:14:39.706 ] 00:14:39.706 }' 00:14:39.706 15:53:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:39.706 15:53:45 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:40.641 15:53:45 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:40.641 [2024-06-10 
15:53:46.022668] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:40.641 [2024-06-10 15:53:46.022694] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:40.641 [2024-06-10 15:53:46.026108] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:40.641 [2024-06-10 15:53:46.026143] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:40.641 [2024-06-10 15:53:46.026178] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:40.641 [2024-06-10 15:53:46.026186] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x125b5a0 name raid_bdev1, state offline 00:14:40.641 0 00:14:40.641 15:53:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2686380 00:14:40.641 15:53:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 2686380 ']' 00:14:40.641 15:53:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 2686380 00:14:40.641 15:53:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:14:40.641 15:53:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:14:40.641 15:53:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2686380 00:14:40.641 15:53:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:14:40.641 15:53:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:14:40.641 15:53:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2686380' 00:14:40.641 killing process with pid 2686380 00:14:40.641 15:53:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 2686380 00:14:40.641 [2024-06-10 15:53:46.085456] 
bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:40.641 15:53:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 2686380 00:14:40.641 [2024-06-10 15:53:46.104962] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:40.900 15:53:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.NPG358LumH 00:14:40.900 15:53:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:40.900 15:53:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:40.900 15:53:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.44 00:14:40.900 15:53:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:14:40.900 15:53:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:40.900 15:53:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:40.900 15:53:46 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.44 != \0\.\0\0 ]] 00:14:40.900 00:14:40.900 real 0m7.015s 00:14:40.900 user 0m11.434s 00:14:40.900 sys 0m0.961s 00:14:40.900 15:53:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:40.900 15:53:46 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:40.900 ************************************ 00:14:40.900 END TEST raid_read_error_test 00:14:40.900 ************************************ 00:14:40.900 15:53:46 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 3 write 00:14:40.900 15:53:46 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:14:40.900 15:53:46 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:40.900 15:53:46 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:40.900 ************************************ 00:14:40.900 START TEST raid_write_error_test 
00:14:40.900 ************************************ 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 3 write 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:14:40.900 15:53:46 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.6pzVqIlLhj 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2687617 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2687617 /var/tmp/spdk-raid.sock 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 2687617 ']' 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:14:40.900 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:14:40.900 15:53:46 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:41.158 [2024-06-10 15:53:46.449497] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:14:41.158 [2024-06-10 15:53:46.449551] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2687617 ] 00:14:41.158 [2024-06-10 15:53:46.545882] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:41.158 [2024-06-10 15:53:46.639886] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:14:41.417 [2024-06-10 15:53:46.694465] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:41.417 [2024-06-10 15:53:46.694491] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:41.984 15:53:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:14:41.984 15:53:47 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:14:41.984 15:53:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:41.984 15:53:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:14:42.242 BaseBdev1_malloc 00:14:42.242 15:53:47 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:14:42.500 true 00:14:42.500 15:53:47 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:14:42.759 [2024-06-10 15:53:48.147513] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:14:42.759 [2024-06-10 15:53:48.147551] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:42.759 [2024-06-10 15:53:48.147572] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16c0150 00:14:42.759 [2024-06-10 15:53:48.147581] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:42.759 [2024-06-10 15:53:48.149399] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:42.759 [2024-06-10 15:53:48.149427] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:14:42.759 BaseBdev1 00:14:42.759 15:53:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:42.759 15:53:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:14:43.017 BaseBdev2_malloc 00:14:43.017 15:53:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:14:43.275 true 00:14:43.276 15:53:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:14:43.534 [2024-06-10 15:53:48.906037] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:14:43.534 [2024-06-10 15:53:48.906079] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:14:43.534 [2024-06-10 15:53:48.906100] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16c4b50 00:14:43.534 [2024-06-10 15:53:48.906109] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:43.534 [2024-06-10 15:53:48.907705] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:43.534 [2024-06-10 15:53:48.907732] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:14:43.534 BaseBdev2 00:14:43.534 15:53:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:14:43.534 15:53:48 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:14:43.793 BaseBdev3_malloc 00:14:43.793 15:53:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:14:44.052 true 00:14:44.052 15:53:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:14:44.311 [2024-06-10 15:53:49.648567] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:14:44.311 [2024-06-10 15:53:49.648608] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:14:44.311 [2024-06-10 15:53:49.648627] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x16c5780 00:14:44.311 [2024-06-10 15:53:49.648637] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:14:44.311 [2024-06-10 15:53:49.650180] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:14:44.311 [2024-06-10 15:53:49.650208] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:14:44.311 BaseBdev3 00:14:44.311 15:53:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:14:44.569 [2024-06-10 15:53:49.893242] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:44.569 [2024-06-10 15:53:49.894551] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:44.569 [2024-06-10 15:53:49.894621] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:44.569 [2024-06-10 15:53:49.894835] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x16c85a0 00:14:44.569 [2024-06-10 15:53:49.894847] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 190464, blocklen 512 00:14:44.569 [2024-06-10 15:53:49.895046] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16c1b60 00:14:44.569 [2024-06-10 15:53:49.895203] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x16c85a0 00:14:44.569 [2024-06-10 15:53:49.895212] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x16c85a0 00:14:44.569 [2024-06-10 15:53:49.895316] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:44.569 15:53:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:44.569 15:53:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:44.569 15:53:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:44.569 15:53:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:44.569 15:53:49 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:44.569 15:53:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:44.569 15:53:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:44.569 15:53:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:44.569 15:53:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:44.569 15:53:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:44.569 15:53:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:44.569 15:53:49 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:44.828 15:53:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:44.828 "name": "raid_bdev1", 00:14:44.828 "uuid": "53924ca1-74b5-4dfc-a28b-5f330a1fad0e", 00:14:44.828 "strip_size_kb": 64, 00:14:44.828 "state": "online", 00:14:44.828 "raid_level": "concat", 00:14:44.828 "superblock": true, 00:14:44.828 "num_base_bdevs": 3, 00:14:44.828 "num_base_bdevs_discovered": 3, 00:14:44.828 "num_base_bdevs_operational": 3, 00:14:44.828 "base_bdevs_list": [ 00:14:44.828 { 00:14:44.828 "name": "BaseBdev1", 00:14:44.828 "uuid": "90f74977-17a0-51dd-9fc4-42c6e6e973e2", 00:14:44.828 "is_configured": true, 00:14:44.828 "data_offset": 2048, 00:14:44.828 "data_size": 63488 00:14:44.828 }, 00:14:44.828 { 00:14:44.828 "name": "BaseBdev2", 00:14:44.828 "uuid": "cd83486c-d181-5738-9369-87d6fbcf2d4f", 00:14:44.828 "is_configured": true, 00:14:44.828 "data_offset": 2048, 00:14:44.828 "data_size": 63488 00:14:44.828 }, 00:14:44.828 { 00:14:44.828 "name": "BaseBdev3", 00:14:44.828 "uuid": 
"9795bad5-f969-51eb-8bfe-42fe0462886d", 00:14:44.828 "is_configured": true, 00:14:44.828 "data_offset": 2048, 00:14:44.828 "data_size": 63488 00:14:44.828 } 00:14:44.828 ] 00:14:44.828 }' 00:14:44.828 15:53:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:44.828 15:53:50 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:45.395 15:53:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:14:45.395 15:53:50 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:14:45.395 [2024-06-10 15:53:50.896167] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x121dd50 00:14:46.331 15:53:51 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:14:46.590 15:53:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:14:46.590 15:53:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:14:46.590 15:53:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:14:46.590 15:53:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 3 00:14:46.590 15:53:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:14:46.590 15:53:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:46.590 15:53:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:14:46.590 15:53:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:14:46.590 15:53:52 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:46.590 15:53:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:46.590 15:53:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:46.590 15:53:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:46.590 15:53:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:46.590 15:53:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:46.590 15:53:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:14:46.848 15:53:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:46.848 "name": "raid_bdev1", 00:14:46.848 "uuid": "53924ca1-74b5-4dfc-a28b-5f330a1fad0e", 00:14:46.848 "strip_size_kb": 64, 00:14:46.848 "state": "online", 00:14:46.848 "raid_level": "concat", 00:14:46.848 "superblock": true, 00:14:46.848 "num_base_bdevs": 3, 00:14:46.848 "num_base_bdevs_discovered": 3, 00:14:46.848 "num_base_bdevs_operational": 3, 00:14:46.848 "base_bdevs_list": [ 00:14:46.848 { 00:14:46.848 "name": "BaseBdev1", 00:14:46.848 "uuid": "90f74977-17a0-51dd-9fc4-42c6e6e973e2", 00:14:46.848 "is_configured": true, 00:14:46.848 "data_offset": 2048, 00:14:46.848 "data_size": 63488 00:14:46.848 }, 00:14:46.848 { 00:14:46.848 "name": "BaseBdev2", 00:14:46.848 "uuid": "cd83486c-d181-5738-9369-87d6fbcf2d4f", 00:14:46.848 "is_configured": true, 00:14:46.848 "data_offset": 2048, 00:14:46.848 "data_size": 63488 00:14:46.848 }, 00:14:46.848 { 00:14:46.848 "name": "BaseBdev3", 00:14:46.848 "uuid": "9795bad5-f969-51eb-8bfe-42fe0462886d", 00:14:46.848 "is_configured": true, 00:14:46.848 "data_offset": 2048, 00:14:46.848 "data_size": 
63488 00:14:46.848 } 00:14:46.848 ] 00:14:46.848 }' 00:14:46.848 15:53:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:46.848 15:53:52 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:47.782 15:53:52 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:14:47.782 [2024-06-10 15:53:53.170757] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:14:47.782 [2024-06-10 15:53:53.170786] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:14:47.782 [2024-06-10 15:53:53.174218] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:14:47.782 [2024-06-10 15:53:53.174254] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:47.782 [2024-06-10 15:53:53.174289] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:14:47.782 [2024-06-10 15:53:53.174297] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x16c85a0 name raid_bdev1, state offline 00:14:47.782 0 00:14:47.782 15:53:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2687617 00:14:47.782 15:53:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 2687617 ']' 00:14:47.782 15:53:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 2687617 00:14:47.782 15:53:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:14:47.782 15:53:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:14:47.782 15:53:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2687617 00:14:47.782 15:53:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # 
process_name=reactor_0 00:14:47.782 15:53:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:14:47.782 15:53:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2687617' 00:14:47.782 killing process with pid 2687617 00:14:47.782 15:53:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 2687617 00:14:47.782 [2024-06-10 15:53:53.234127] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:14:47.782 15:53:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 2687617 00:14:47.783 [2024-06-10 15:53:53.253431] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:14:48.042 15:53:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.6pzVqIlLhj 00:14:48.042 15:53:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:14:48.042 15:53:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:14:48.042 15:53:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.44 00:14:48.042 15:53:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:14:48.042 15:53:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:48.042 15:53:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:14:48.042 15:53:53 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.44 != \0\.\0\0 ]] 00:14:48.042 00:14:48.042 real 0m7.087s 00:14:48.042 user 0m11.596s 00:14:48.042 sys 0m0.938s 00:14:48.042 15:53:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:14:48.042 15:53:53 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:14:48.042 ************************************ 00:14:48.042 END TEST raid_write_error_test 00:14:48.042 
************************************ 00:14:48.042 15:53:53 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:14:48.042 15:53:53 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 3 false 00:14:48.042 15:53:53 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:14:48.042 15:53:53 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:14:48.042 15:53:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:14:48.042 ************************************ 00:14:48.042 START TEST raid_state_function_test 00:14:48.042 ************************************ 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 3 false 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:48.042 15:53:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2688845 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2688845' 00:14:48.042 Process raid pid: 2688845 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 
00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2688845 /var/tmp/spdk-raid.sock 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 2688845 ']' 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:14:48.042 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:14:48.042 15:53:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:48.301 [2024-06-10 15:53:53.597934] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:14:48.301 [2024-06-10 15:53:53.598001] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:14:48.301 [2024-06-10 15:53:53.696155] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:14:48.301 [2024-06-10 15:53:53.791105] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:14:48.559 [2024-06-10 15:53:53.849231] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:48.559 [2024-06-10 15:53:53.849255] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:14:49.125 15:53:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:14:49.125 15:53:54 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:14:49.125 15:53:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:49.384 [2024-06-10 15:53:54.792229] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:49.384 [2024-06-10 15:53:54.792268] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:49.384 [2024-06-10 15:53:54.792277] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:49.384 [2024-06-10 15:53:54.792286] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:49.384 [2024-06-10 15:53:54.792293] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:49.384 [2024-06-10 15:53:54.792301] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:49.384 15:53:54 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:49.384 15:53:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:49.384 15:53:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:49.384 15:53:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:49.384 15:53:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:49.384 15:53:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:49.384 15:53:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:49.384 15:53:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:49.384 15:53:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:49.384 15:53:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:49.384 15:53:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:49.384 15:53:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:49.641 15:53:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:49.641 "name": "Existed_Raid", 00:14:49.641 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:49.641 "strip_size_kb": 0, 00:14:49.641 "state": "configuring", 00:14:49.641 "raid_level": "raid1", 00:14:49.641 "superblock": false, 00:14:49.641 "num_base_bdevs": 3, 00:14:49.641 "num_base_bdevs_discovered": 0, 00:14:49.641 "num_base_bdevs_operational": 3, 00:14:49.641 "base_bdevs_list": [ 00:14:49.641 { 00:14:49.641 
"name": "BaseBdev1", 00:14:49.641 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:49.641 "is_configured": false, 00:14:49.641 "data_offset": 0, 00:14:49.641 "data_size": 0 00:14:49.641 }, 00:14:49.641 { 00:14:49.641 "name": "BaseBdev2", 00:14:49.641 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:49.641 "is_configured": false, 00:14:49.641 "data_offset": 0, 00:14:49.641 "data_size": 0 00:14:49.641 }, 00:14:49.641 { 00:14:49.641 "name": "BaseBdev3", 00:14:49.641 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:49.641 "is_configured": false, 00:14:49.641 "data_offset": 0, 00:14:49.641 "data_size": 0 00:14:49.641 } 00:14:49.641 ] 00:14:49.641 }' 00:14:49.641 15:53:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:49.641 15:53:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:50.208 15:53:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:50.467 [2024-06-10 15:53:55.935145] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:50.467 [2024-06-10 15:53:55.935173] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12c5120 name Existed_Raid, state configuring 00:14:50.467 15:53:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:50.725 [2024-06-10 15:53:56.191831] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:14:50.725 [2024-06-10 15:53:56.191855] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:14:50.725 [2024-06-10 15:53:56.191863] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 
00:14:50.725 [2024-06-10 15:53:56.191871] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:50.725 [2024-06-10 15:53:56.191878] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:50.725 [2024-06-10 15:53:56.191886] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:50.725 15:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:14:50.983 [2024-06-10 15:53:56.453993] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:50.983 BaseBdev1 00:14:50.983 15:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:14:50.983 15:53:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:14:50.983 15:53:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:50.983 15:53:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:14:50.983 15:53:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:50.983 15:53:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:50.983 15:53:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:51.242 15:53:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:14:51.500 [ 00:14:51.500 { 00:14:51.500 "name": "BaseBdev1", 00:14:51.500 "aliases": [ 00:14:51.500 "754c8f7f-2197-4e99-ac8a-4a1f4153318a" 
00:14:51.500 ], 00:14:51.501 "product_name": "Malloc disk", 00:14:51.501 "block_size": 512, 00:14:51.501 "num_blocks": 65536, 00:14:51.501 "uuid": "754c8f7f-2197-4e99-ac8a-4a1f4153318a", 00:14:51.501 "assigned_rate_limits": { 00:14:51.501 "rw_ios_per_sec": 0, 00:14:51.501 "rw_mbytes_per_sec": 0, 00:14:51.501 "r_mbytes_per_sec": 0, 00:14:51.501 "w_mbytes_per_sec": 0 00:14:51.501 }, 00:14:51.501 "claimed": true, 00:14:51.501 "claim_type": "exclusive_write", 00:14:51.501 "zoned": false, 00:14:51.501 "supported_io_types": { 00:14:51.501 "read": true, 00:14:51.501 "write": true, 00:14:51.501 "unmap": true, 00:14:51.501 "write_zeroes": true, 00:14:51.501 "flush": true, 00:14:51.501 "reset": true, 00:14:51.501 "compare": false, 00:14:51.501 "compare_and_write": false, 00:14:51.501 "abort": true, 00:14:51.501 "nvme_admin": false, 00:14:51.501 "nvme_io": false 00:14:51.501 }, 00:14:51.501 "memory_domains": [ 00:14:51.501 { 00:14:51.501 "dma_device_id": "system", 00:14:51.501 "dma_device_type": 1 00:14:51.501 }, 00:14:51.501 { 00:14:51.501 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:51.501 "dma_device_type": 2 00:14:51.501 } 00:14:51.501 ], 00:14:51.501 "driver_specific": {} 00:14:51.501 } 00:14:51.501 ] 00:14:51.501 15:53:56 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:14:51.501 15:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:51.501 15:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:51.501 15:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:51.501 15:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:51.501 15:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:51.501 15:53:56 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:51.501 15:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:51.501 15:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:51.501 15:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:51.501 15:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:51.501 15:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:51.501 15:53:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:51.759 15:53:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:51.759 "name": "Existed_Raid", 00:14:51.759 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:51.759 "strip_size_kb": 0, 00:14:51.759 "state": "configuring", 00:14:51.759 "raid_level": "raid1", 00:14:51.759 "superblock": false, 00:14:51.759 "num_base_bdevs": 3, 00:14:51.759 "num_base_bdevs_discovered": 1, 00:14:51.759 "num_base_bdevs_operational": 3, 00:14:51.759 "base_bdevs_list": [ 00:14:51.759 { 00:14:51.759 "name": "BaseBdev1", 00:14:51.759 "uuid": "754c8f7f-2197-4e99-ac8a-4a1f4153318a", 00:14:51.759 "is_configured": true, 00:14:51.759 "data_offset": 0, 00:14:51.759 "data_size": 65536 00:14:51.759 }, 00:14:51.759 { 00:14:51.759 "name": "BaseBdev2", 00:14:51.759 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:51.759 "is_configured": false, 00:14:51.759 "data_offset": 0, 00:14:51.759 "data_size": 0 00:14:51.759 }, 00:14:51.759 { 00:14:51.759 "name": "BaseBdev3", 00:14:51.759 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:51.759 "is_configured": false, 00:14:51.759 "data_offset": 0, 00:14:51.759 "data_size": 0 00:14:51.759 } 
00:14:51.759 ] 00:14:51.759 }' 00:14:51.759 15:53:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:51.759 15:53:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:52.694 15:53:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:14:52.694 [2024-06-10 15:53:58.110433] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:14:52.694 [2024-06-10 15:53:58.110471] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12c49b0 name Existed_Raid, state configuring 00:14:52.694 15:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:14:53.014 [2024-06-10 15:53:58.367144] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:14:53.014 [2024-06-10 15:53:58.368699] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:14:53.014 [2024-06-10 15:53:58.368737] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:14:53.014 [2024-06-10 15:53:58.368746] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:14:53.014 [2024-06-10 15:53:58.368754] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:14:53.014 15:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:14:53.014 15:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:53.014 15:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:14:53.014 15:53:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:53.014 15:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:53.014 15:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:53.014 15:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:53.014 15:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:53.014 15:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:53.014 15:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:53.014 15:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:53.014 15:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:53.014 15:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:53.014 15:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:53.272 15:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:53.272 "name": "Existed_Raid", 00:14:53.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:53.272 "strip_size_kb": 0, 00:14:53.272 "state": "configuring", 00:14:53.272 "raid_level": "raid1", 00:14:53.272 "superblock": false, 00:14:53.272 "num_base_bdevs": 3, 00:14:53.272 "num_base_bdevs_discovered": 1, 00:14:53.272 "num_base_bdevs_operational": 3, 00:14:53.272 "base_bdevs_list": [ 00:14:53.272 { 00:14:53.272 "name": "BaseBdev1", 00:14:53.272 "uuid": "754c8f7f-2197-4e99-ac8a-4a1f4153318a", 00:14:53.272 "is_configured": true, 00:14:53.272 "data_offset": 
0, 00:14:53.272 "data_size": 65536 00:14:53.272 }, 00:14:53.272 { 00:14:53.272 "name": "BaseBdev2", 00:14:53.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:53.272 "is_configured": false, 00:14:53.272 "data_offset": 0, 00:14:53.272 "data_size": 0 00:14:53.272 }, 00:14:53.272 { 00:14:53.272 "name": "BaseBdev3", 00:14:53.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:53.272 "is_configured": false, 00:14:53.272 "data_offset": 0, 00:14:53.272 "data_size": 0 00:14:53.272 } 00:14:53.272 ] 00:14:53.272 }' 00:14:53.272 15:53:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:53.272 15:53:58 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:53.838 15:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:14:54.095 [2024-06-10 15:53:59.405170] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:14:54.095 BaseBdev2 00:14:54.095 15:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:14:54.095 15:53:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:14:54.095 15:53:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:54.095 15:53:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:14:54.095 15:53:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:54.095 15:53:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:54.095 15:53:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:54.354 15:53:59 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:14:54.354 [ 00:14:54.354 { 00:14:54.354 "name": "BaseBdev2", 00:14:54.354 "aliases": [ 00:14:54.354 "83492614-1d2c-4b61-bf4e-3a19f3172c42" 00:14:54.354 ], 00:14:54.354 "product_name": "Malloc disk", 00:14:54.354 "block_size": 512, 00:14:54.354 "num_blocks": 65536, 00:14:54.354 "uuid": "83492614-1d2c-4b61-bf4e-3a19f3172c42", 00:14:54.354 "assigned_rate_limits": { 00:14:54.354 "rw_ios_per_sec": 0, 00:14:54.354 "rw_mbytes_per_sec": 0, 00:14:54.354 "r_mbytes_per_sec": 0, 00:14:54.354 "w_mbytes_per_sec": 0 00:14:54.354 }, 00:14:54.354 "claimed": true, 00:14:54.354 "claim_type": "exclusive_write", 00:14:54.354 "zoned": false, 00:14:54.354 "supported_io_types": { 00:14:54.354 "read": true, 00:14:54.354 "write": true, 00:14:54.354 "unmap": true, 00:14:54.354 "write_zeroes": true, 00:14:54.354 "flush": true, 00:14:54.354 "reset": true, 00:14:54.354 "compare": false, 00:14:54.354 "compare_and_write": false, 00:14:54.354 "abort": true, 00:14:54.354 "nvme_admin": false, 00:14:54.354 "nvme_io": false 00:14:54.354 }, 00:14:54.354 "memory_domains": [ 00:14:54.354 { 00:14:54.354 "dma_device_id": "system", 00:14:54.354 "dma_device_type": 1 00:14:54.354 }, 00:14:54.354 { 00:14:54.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:54.354 "dma_device_type": 2 00:14:54.354 } 00:14:54.354 ], 00:14:54.354 "driver_specific": {} 00:14:54.354 } 00:14:54.354 ] 00:14:54.354 15:53:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:14:54.354 15:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:54.354 15:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:54.354 15:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid 
configuring raid1 0 3 00:14:54.354 15:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:54.354 15:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:14:54.354 15:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:54.354 15:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:54.354 15:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:54.354 15:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:54.354 15:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:54.354 15:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:54.354 15:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:54.354 15:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:54.354 15:53:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:54.612 15:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:54.612 "name": "Existed_Raid", 00:14:54.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:54.612 "strip_size_kb": 0, 00:14:54.612 "state": "configuring", 00:14:54.612 "raid_level": "raid1", 00:14:54.612 "superblock": false, 00:14:54.612 "num_base_bdevs": 3, 00:14:54.612 "num_base_bdevs_discovered": 2, 00:14:54.612 "num_base_bdevs_operational": 3, 00:14:54.612 "base_bdevs_list": [ 00:14:54.612 { 00:14:54.612 "name": "BaseBdev1", 00:14:54.612 "uuid": "754c8f7f-2197-4e99-ac8a-4a1f4153318a", 00:14:54.612 
"is_configured": true, 00:14:54.612 "data_offset": 0, 00:14:54.612 "data_size": 65536 00:14:54.612 }, 00:14:54.612 { 00:14:54.612 "name": "BaseBdev2", 00:14:54.612 "uuid": "83492614-1d2c-4b61-bf4e-3a19f3172c42", 00:14:54.612 "is_configured": true, 00:14:54.612 "data_offset": 0, 00:14:54.612 "data_size": 65536 00:14:54.612 }, 00:14:54.612 { 00:14:54.612 "name": "BaseBdev3", 00:14:54.612 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:54.612 "is_configured": false, 00:14:54.612 "data_offset": 0, 00:14:54.612 "data_size": 0 00:14:54.612 } 00:14:54.612 ] 00:14:54.612 }' 00:14:54.612 15:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:54.612 15:54:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:55.548 15:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:14:55.548 [2024-06-10 15:54:00.960705] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:14:55.548 [2024-06-10 15:54:00.960744] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12c58c0 00:14:55.548 [2024-06-10 15:54:00.960751] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:14:55.548 [2024-06-10 15:54:00.960947] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf91d30 00:14:55.548 [2024-06-10 15:54:00.961086] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12c58c0 00:14:55.548 [2024-06-10 15:54:00.961095] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12c58c0 00:14:55.548 [2024-06-10 15:54:00.961261] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:14:55.548 BaseBdev3 00:14:55.548 15:54:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # 
waitforbdev BaseBdev3 00:14:55.548 15:54:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:14:55.548 15:54:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:14:55.548 15:54:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:14:55.548 15:54:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:14:55.548 15:54:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:14:55.548 15:54:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:14:55.807 15:54:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:14:56.064 [ 00:14:56.064 { 00:14:56.064 "name": "BaseBdev3", 00:14:56.065 "aliases": [ 00:14:56.065 "7cf6d91e-87de-43ea-abe1-d97d0e6813a0" 00:14:56.065 ], 00:14:56.065 "product_name": "Malloc disk", 00:14:56.065 "block_size": 512, 00:14:56.065 "num_blocks": 65536, 00:14:56.065 "uuid": "7cf6d91e-87de-43ea-abe1-d97d0e6813a0", 00:14:56.065 "assigned_rate_limits": { 00:14:56.065 "rw_ios_per_sec": 0, 00:14:56.065 "rw_mbytes_per_sec": 0, 00:14:56.065 "r_mbytes_per_sec": 0, 00:14:56.065 "w_mbytes_per_sec": 0 00:14:56.065 }, 00:14:56.065 "claimed": true, 00:14:56.065 "claim_type": "exclusive_write", 00:14:56.065 "zoned": false, 00:14:56.065 "supported_io_types": { 00:14:56.065 "read": true, 00:14:56.065 "write": true, 00:14:56.065 "unmap": true, 00:14:56.065 "write_zeroes": true, 00:14:56.065 "flush": true, 00:14:56.065 "reset": true, 00:14:56.065 "compare": false, 00:14:56.065 "compare_and_write": false, 00:14:56.065 "abort": true, 00:14:56.065 "nvme_admin": false, 00:14:56.065 
"nvme_io": false 00:14:56.065 }, 00:14:56.065 "memory_domains": [ 00:14:56.065 { 00:14:56.065 "dma_device_id": "system", 00:14:56.065 "dma_device_type": 1 00:14:56.065 }, 00:14:56.065 { 00:14:56.065 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:56.065 "dma_device_type": 2 00:14:56.065 } 00:14:56.065 ], 00:14:56.065 "driver_specific": {} 00:14:56.065 } 00:14:56.065 ] 00:14:56.065 15:54:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:14:56.065 15:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:14:56.065 15:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:14:56.065 15:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:14:56.065 15:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:56.065 15:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:56.065 15:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:56.065 15:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:56.065 15:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:14:56.065 15:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:56.065 15:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:56.065 15:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:56.065 15:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:56.065 15:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:56.065 15:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:56.323 15:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:56.323 "name": "Existed_Raid", 00:14:56.323 "uuid": "78853e21-a8ca-46a4-8b5d-cc34964e3f34", 00:14:56.324 "strip_size_kb": 0, 00:14:56.324 "state": "online", 00:14:56.324 "raid_level": "raid1", 00:14:56.324 "superblock": false, 00:14:56.324 "num_base_bdevs": 3, 00:14:56.324 "num_base_bdevs_discovered": 3, 00:14:56.324 "num_base_bdevs_operational": 3, 00:14:56.324 "base_bdevs_list": [ 00:14:56.324 { 00:14:56.324 "name": "BaseBdev1", 00:14:56.324 "uuid": "754c8f7f-2197-4e99-ac8a-4a1f4153318a", 00:14:56.324 "is_configured": true, 00:14:56.324 "data_offset": 0, 00:14:56.324 "data_size": 65536 00:14:56.324 }, 00:14:56.324 { 00:14:56.324 "name": "BaseBdev2", 00:14:56.324 "uuid": "83492614-1d2c-4b61-bf4e-3a19f3172c42", 00:14:56.324 "is_configured": true, 00:14:56.324 "data_offset": 0, 00:14:56.324 "data_size": 65536 00:14:56.324 }, 00:14:56.324 { 00:14:56.324 "name": "BaseBdev3", 00:14:56.324 "uuid": "7cf6d91e-87de-43ea-abe1-d97d0e6813a0", 00:14:56.324 "is_configured": true, 00:14:56.324 "data_offset": 0, 00:14:56.324 "data_size": 65536 00:14:56.324 } 00:14:56.324 ] 00:14:56.324 }' 00:14:56.324 15:54:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:56.324 15:54:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:14:56.891 15:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:14:56.891 15:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:14:56.891 15:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:14:56.891 15:54:02 bdev_raid.raid_state_function_test 
-- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:14:56.891 15:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:14:56.891 15:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:14:56.891 15:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:14:56.891 15:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:14:57.150 [2024-06-10 15:54:02.589355] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:14:57.150 15:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:14:57.150 "name": "Existed_Raid", 00:14:57.150 "aliases": [ 00:14:57.150 "78853e21-a8ca-46a4-8b5d-cc34964e3f34" 00:14:57.150 ], 00:14:57.150 "product_name": "Raid Volume", 00:14:57.150 "block_size": 512, 00:14:57.150 "num_blocks": 65536, 00:14:57.150 "uuid": "78853e21-a8ca-46a4-8b5d-cc34964e3f34", 00:14:57.150 "assigned_rate_limits": { 00:14:57.150 "rw_ios_per_sec": 0, 00:14:57.150 "rw_mbytes_per_sec": 0, 00:14:57.150 "r_mbytes_per_sec": 0, 00:14:57.150 "w_mbytes_per_sec": 0 00:14:57.150 }, 00:14:57.150 "claimed": false, 00:14:57.150 "zoned": false, 00:14:57.150 "supported_io_types": { 00:14:57.150 "read": true, 00:14:57.150 "write": true, 00:14:57.150 "unmap": false, 00:14:57.150 "write_zeroes": true, 00:14:57.150 "flush": false, 00:14:57.150 "reset": true, 00:14:57.150 "compare": false, 00:14:57.150 "compare_and_write": false, 00:14:57.150 "abort": false, 00:14:57.150 "nvme_admin": false, 00:14:57.150 "nvme_io": false 00:14:57.150 }, 00:14:57.150 "memory_domains": [ 00:14:57.150 { 00:14:57.150 "dma_device_id": "system", 00:14:57.150 "dma_device_type": 1 00:14:57.150 }, 00:14:57.150 { 00:14:57.150 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.150 "dma_device_type": 2 00:14:57.150 }, 
00:14:57.150 { 00:14:57.150 "dma_device_id": "system", 00:14:57.150 "dma_device_type": 1 00:14:57.150 }, 00:14:57.150 { 00:14:57.150 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.150 "dma_device_type": 2 00:14:57.150 }, 00:14:57.150 { 00:14:57.150 "dma_device_id": "system", 00:14:57.150 "dma_device_type": 1 00:14:57.150 }, 00:14:57.150 { 00:14:57.150 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.150 "dma_device_type": 2 00:14:57.150 } 00:14:57.150 ], 00:14:57.150 "driver_specific": { 00:14:57.150 "raid": { 00:14:57.150 "uuid": "78853e21-a8ca-46a4-8b5d-cc34964e3f34", 00:14:57.150 "strip_size_kb": 0, 00:14:57.150 "state": "online", 00:14:57.150 "raid_level": "raid1", 00:14:57.150 "superblock": false, 00:14:57.150 "num_base_bdevs": 3, 00:14:57.150 "num_base_bdevs_discovered": 3, 00:14:57.150 "num_base_bdevs_operational": 3, 00:14:57.150 "base_bdevs_list": [ 00:14:57.150 { 00:14:57.150 "name": "BaseBdev1", 00:14:57.150 "uuid": "754c8f7f-2197-4e99-ac8a-4a1f4153318a", 00:14:57.150 "is_configured": true, 00:14:57.150 "data_offset": 0, 00:14:57.150 "data_size": 65536 00:14:57.150 }, 00:14:57.150 { 00:14:57.150 "name": "BaseBdev2", 00:14:57.150 "uuid": "83492614-1d2c-4b61-bf4e-3a19f3172c42", 00:14:57.150 "is_configured": true, 00:14:57.150 "data_offset": 0, 00:14:57.150 "data_size": 65536 00:14:57.150 }, 00:14:57.150 { 00:14:57.150 "name": "BaseBdev3", 00:14:57.150 "uuid": "7cf6d91e-87de-43ea-abe1-d97d0e6813a0", 00:14:57.150 "is_configured": true, 00:14:57.150 "data_offset": 0, 00:14:57.150 "data_size": 65536 00:14:57.150 } 00:14:57.150 ] 00:14:57.150 } 00:14:57.150 } 00:14:57.150 }' 00:14:57.150 15:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:14:57.409 15:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:14:57.409 BaseBdev2 00:14:57.409 BaseBdev3' 00:14:57.409 15:54:02 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:57.409 15:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:14:57.409 15:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:57.409 15:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:57.409 "name": "BaseBdev1", 00:14:57.409 "aliases": [ 00:14:57.409 "754c8f7f-2197-4e99-ac8a-4a1f4153318a" 00:14:57.409 ], 00:14:57.409 "product_name": "Malloc disk", 00:14:57.409 "block_size": 512, 00:14:57.409 "num_blocks": 65536, 00:14:57.409 "uuid": "754c8f7f-2197-4e99-ac8a-4a1f4153318a", 00:14:57.409 "assigned_rate_limits": { 00:14:57.409 "rw_ios_per_sec": 0, 00:14:57.409 "rw_mbytes_per_sec": 0, 00:14:57.409 "r_mbytes_per_sec": 0, 00:14:57.409 "w_mbytes_per_sec": 0 00:14:57.409 }, 00:14:57.409 "claimed": true, 00:14:57.409 "claim_type": "exclusive_write", 00:14:57.409 "zoned": false, 00:14:57.409 "supported_io_types": { 00:14:57.409 "read": true, 00:14:57.409 "write": true, 00:14:57.409 "unmap": true, 00:14:57.409 "write_zeroes": true, 00:14:57.409 "flush": true, 00:14:57.409 "reset": true, 00:14:57.409 "compare": false, 00:14:57.409 "compare_and_write": false, 00:14:57.409 "abort": true, 00:14:57.409 "nvme_admin": false, 00:14:57.409 "nvme_io": false 00:14:57.409 }, 00:14:57.409 "memory_domains": [ 00:14:57.409 { 00:14:57.409 "dma_device_id": "system", 00:14:57.409 "dma_device_type": 1 00:14:57.409 }, 00:14:57.409 { 00:14:57.409 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:57.409 "dma_device_type": 2 00:14:57.409 } 00:14:57.409 ], 00:14:57.409 "driver_specific": {} 00:14:57.409 }' 00:14:57.409 15:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:57.668 15:54:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 
-- # jq .block_size 00:14:57.668 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:57.668 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:57.668 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:57.668 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:57.668 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:57.668 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:57.927 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:57.927 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:57.927 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:57.927 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:57.927 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:57.927 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:14:57.927 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:58.187 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:58.187 "name": "BaseBdev2", 00:14:58.187 "aliases": [ 00:14:58.187 "83492614-1d2c-4b61-bf4e-3a19f3172c42" 00:14:58.187 ], 00:14:58.187 "product_name": "Malloc disk", 00:14:58.187 "block_size": 512, 00:14:58.187 "num_blocks": 65536, 00:14:58.187 "uuid": "83492614-1d2c-4b61-bf4e-3a19f3172c42", 00:14:58.187 "assigned_rate_limits": { 00:14:58.187 "rw_ios_per_sec": 0, 00:14:58.187 "rw_mbytes_per_sec": 0, 00:14:58.187 
"r_mbytes_per_sec": 0, 00:14:58.187 "w_mbytes_per_sec": 0 00:14:58.187 }, 00:14:58.187 "claimed": true, 00:14:58.187 "claim_type": "exclusive_write", 00:14:58.187 "zoned": false, 00:14:58.187 "supported_io_types": { 00:14:58.187 "read": true, 00:14:58.187 "write": true, 00:14:58.187 "unmap": true, 00:14:58.187 "write_zeroes": true, 00:14:58.187 "flush": true, 00:14:58.187 "reset": true, 00:14:58.187 "compare": false, 00:14:58.187 "compare_and_write": false, 00:14:58.187 "abort": true, 00:14:58.187 "nvme_admin": false, 00:14:58.187 "nvme_io": false 00:14:58.187 }, 00:14:58.187 "memory_domains": [ 00:14:58.187 { 00:14:58.187 "dma_device_id": "system", 00:14:58.187 "dma_device_type": 1 00:14:58.187 }, 00:14:58.187 { 00:14:58.187 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.187 "dma_device_type": 2 00:14:58.187 } 00:14:58.187 ], 00:14:58.187 "driver_specific": {} 00:14:58.187 }' 00:14:58.187 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.187 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.187 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:58.187 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.187 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.446 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:58.446 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.446 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.446 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:58.446 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.446 15:54:03 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:58.446 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:58.446 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:14:58.446 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:14:58.446 15:54:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:14:58.705 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:14:58.705 "name": "BaseBdev3", 00:14:58.705 "aliases": [ 00:14:58.705 "7cf6d91e-87de-43ea-abe1-d97d0e6813a0" 00:14:58.705 ], 00:14:58.705 "product_name": "Malloc disk", 00:14:58.705 "block_size": 512, 00:14:58.705 "num_blocks": 65536, 00:14:58.705 "uuid": "7cf6d91e-87de-43ea-abe1-d97d0e6813a0", 00:14:58.705 "assigned_rate_limits": { 00:14:58.705 "rw_ios_per_sec": 0, 00:14:58.705 "rw_mbytes_per_sec": 0, 00:14:58.705 "r_mbytes_per_sec": 0, 00:14:58.705 "w_mbytes_per_sec": 0 00:14:58.705 }, 00:14:58.705 "claimed": true, 00:14:58.705 "claim_type": "exclusive_write", 00:14:58.705 "zoned": false, 00:14:58.705 "supported_io_types": { 00:14:58.705 "read": true, 00:14:58.705 "write": true, 00:14:58.705 "unmap": true, 00:14:58.705 "write_zeroes": true, 00:14:58.705 "flush": true, 00:14:58.705 "reset": true, 00:14:58.705 "compare": false, 00:14:58.705 "compare_and_write": false, 00:14:58.705 "abort": true, 00:14:58.705 "nvme_admin": false, 00:14:58.705 "nvme_io": false 00:14:58.705 }, 00:14:58.705 "memory_domains": [ 00:14:58.705 { 00:14:58.705 "dma_device_id": "system", 00:14:58.705 "dma_device_type": 1 00:14:58.705 }, 00:14:58.705 { 00:14:58.705 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:14:58.705 "dma_device_type": 2 00:14:58.705 } 00:14:58.705 ], 00:14:58.705 "driver_specific": {} 00:14:58.705 }' 00:14:58.705 
15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.705 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:14:58.964 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:14:58.964 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.964 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:14:58.964 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:14:58.964 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.964 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:14:58.964 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:14:58.964 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:59.223 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:14:59.223 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:14:59.223 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:14:59.482 [2024-06-10 15:54:04.734870] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:14:59.482 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:14:59.483 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:14:59.483 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:14:59.483 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:14:59.483 15:54:04 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:14:59.483 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:14:59.483 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:14:59.483 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:14:59.483 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:14:59.483 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:14:59.483 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:14:59.483 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:14:59.483 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:14:59.483 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:14:59.483 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:14:59.483 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:14:59.483 15:54:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:14:59.742 15:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:14:59.742 "name": "Existed_Raid", 00:14:59.742 "uuid": "78853e21-a8ca-46a4-8b5d-cc34964e3f34", 00:14:59.742 "strip_size_kb": 0, 00:14:59.742 "state": "online", 00:14:59.742 "raid_level": "raid1", 00:14:59.742 "superblock": false, 00:14:59.742 "num_base_bdevs": 3, 00:14:59.742 "num_base_bdevs_discovered": 2, 00:14:59.742 
"num_base_bdevs_operational": 2, 00:14:59.742 "base_bdevs_list": [ 00:14:59.742 { 00:14:59.742 "name": null, 00:14:59.742 "uuid": "00000000-0000-0000-0000-000000000000", 00:14:59.742 "is_configured": false, 00:14:59.742 "data_offset": 0, 00:14:59.742 "data_size": 65536 00:14:59.742 }, 00:14:59.742 { 00:14:59.742 "name": "BaseBdev2", 00:14:59.742 "uuid": "83492614-1d2c-4b61-bf4e-3a19f3172c42", 00:14:59.742 "is_configured": true, 00:14:59.742 "data_offset": 0, 00:14:59.742 "data_size": 65536 00:14:59.742 }, 00:14:59.742 { 00:14:59.742 "name": "BaseBdev3", 00:14:59.742 "uuid": "7cf6d91e-87de-43ea-abe1-d97d0e6813a0", 00:14:59.742 "is_configured": true, 00:14:59.742 "data_offset": 0, 00:14:59.742 "data_size": 65536 00:14:59.742 } 00:14:59.742 ] 00:14:59.742 }' 00:14:59.742 15:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:14:59.742 15:54:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:00.311 15:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:00.311 15:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:00.311 15:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.311 15:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:00.570 15:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:00.570 15:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:00.570 15:54:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:00.828 [2024-06-10 15:54:06.095585] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:00.828 15:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:00.828 15:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:00.828 15:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:00.828 15:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:01.088 15:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:01.088 15:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:01.088 15:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:01.347 [2024-06-10 15:54:06.615384] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:01.347 [2024-06-10 15:54:06.615463] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:01.347 [2024-06-10 15:54:06.626204] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:01.347 [2024-06-10 15:54:06.626237] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:01.347 [2024-06-10 15:54:06.626245] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12c58c0 name Existed_Raid, state offline 00:15:01.347 15:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:01.347 15:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:01.347 15:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:01.347 15:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:01.606 15:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:01.606 15:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:01.606 15:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:01.606 15:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:01.606 15:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:01.606 15:54:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:01.865 BaseBdev2 00:15:01.865 15:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:01.865 15:54:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:15:01.865 15:54:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:01.865 15:54:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:15:01.865 15:54:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:01.865 15:54:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:01.865 15:54:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:02.123 15:54:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:02.382 [ 00:15:02.382 { 00:15:02.382 "name": "BaseBdev2", 00:15:02.382 "aliases": [ 00:15:02.382 "4292489d-7677-4278-8dd6-e7a5e0f60b7d" 00:15:02.382 ], 00:15:02.382 "product_name": "Malloc disk", 00:15:02.382 "block_size": 512, 00:15:02.382 "num_blocks": 65536, 00:15:02.382 "uuid": "4292489d-7677-4278-8dd6-e7a5e0f60b7d", 00:15:02.382 "assigned_rate_limits": { 00:15:02.382 "rw_ios_per_sec": 0, 00:15:02.382 "rw_mbytes_per_sec": 0, 00:15:02.382 "r_mbytes_per_sec": 0, 00:15:02.382 "w_mbytes_per_sec": 0 00:15:02.382 }, 00:15:02.382 "claimed": false, 00:15:02.382 "zoned": false, 00:15:02.382 "supported_io_types": { 00:15:02.382 "read": true, 00:15:02.382 "write": true, 00:15:02.382 "unmap": true, 00:15:02.382 "write_zeroes": true, 00:15:02.382 "flush": true, 00:15:02.382 "reset": true, 00:15:02.382 "compare": false, 00:15:02.382 "compare_and_write": false, 00:15:02.382 "abort": true, 00:15:02.382 "nvme_admin": false, 00:15:02.382 "nvme_io": false 00:15:02.382 }, 00:15:02.382 "memory_domains": [ 00:15:02.382 { 00:15:02.382 "dma_device_id": "system", 00:15:02.382 "dma_device_type": 1 00:15:02.382 }, 00:15:02.382 { 00:15:02.382 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:02.382 "dma_device_type": 2 00:15:02.382 } 00:15:02.382 ], 00:15:02.382 "driver_specific": {} 00:15:02.382 } 00:15:02.382 ] 00:15:02.382 15:54:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:15:02.382 15:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:02.382 15:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:02.382 15:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:02.640 BaseBdev3 
00:15:02.640 15:54:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:02.640 15:54:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:15:02.641 15:54:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:02.641 15:54:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:15:02.641 15:54:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:02.641 15:54:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:02.641 15:54:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:02.899 15:54:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:03.158 [ 00:15:03.158 { 00:15:03.158 "name": "BaseBdev3", 00:15:03.158 "aliases": [ 00:15:03.158 "b05e4a6e-7b47-4bc4-9470-ca02a000445a" 00:15:03.158 ], 00:15:03.158 "product_name": "Malloc disk", 00:15:03.158 "block_size": 512, 00:15:03.158 "num_blocks": 65536, 00:15:03.158 "uuid": "b05e4a6e-7b47-4bc4-9470-ca02a000445a", 00:15:03.158 "assigned_rate_limits": { 00:15:03.158 "rw_ios_per_sec": 0, 00:15:03.158 "rw_mbytes_per_sec": 0, 00:15:03.158 "r_mbytes_per_sec": 0, 00:15:03.158 "w_mbytes_per_sec": 0 00:15:03.158 }, 00:15:03.158 "claimed": false, 00:15:03.158 "zoned": false, 00:15:03.158 "supported_io_types": { 00:15:03.158 "read": true, 00:15:03.158 "write": true, 00:15:03.158 "unmap": true, 00:15:03.158 "write_zeroes": true, 00:15:03.158 "flush": true, 00:15:03.158 "reset": true, 00:15:03.158 "compare": false, 00:15:03.158 "compare_and_write": false, 00:15:03.158 "abort": true, 
00:15:03.158 "nvme_admin": false, 00:15:03.158 "nvme_io": false 00:15:03.158 }, 00:15:03.158 "memory_domains": [ 00:15:03.158 { 00:15:03.158 "dma_device_id": "system", 00:15:03.158 "dma_device_type": 1 00:15:03.158 }, 00:15:03.158 { 00:15:03.158 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:03.158 "dma_device_type": 2 00:15:03.158 } 00:15:03.158 ], 00:15:03.158 "driver_specific": {} 00:15:03.158 } 00:15:03.158 ] 00:15:03.158 15:54:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:15:03.158 15:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:03.158 15:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:03.158 15:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:03.417 [2024-06-10 15:54:08.667982] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:03.417 [2024-06-10 15:54:08.668023] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:03.417 [2024-06-10 15:54:08.668042] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:03.417 [2024-06-10 15:54:08.669478] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:03.417 15:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:03.417 15:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:03.417 15:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:03.417 15:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:15:03.417 15:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:03.417 15:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:03.417 15:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:03.417 15:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:03.417 15:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:03.417 15:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:03.417 15:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:03.417 15:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:03.677 15:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:03.677 "name": "Existed_Raid", 00:15:03.677 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:03.677 "strip_size_kb": 0, 00:15:03.677 "state": "configuring", 00:15:03.677 "raid_level": "raid1", 00:15:03.677 "superblock": false, 00:15:03.677 "num_base_bdevs": 3, 00:15:03.677 "num_base_bdevs_discovered": 2, 00:15:03.677 "num_base_bdevs_operational": 3, 00:15:03.677 "base_bdevs_list": [ 00:15:03.677 { 00:15:03.677 "name": "BaseBdev1", 00:15:03.677 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:03.677 "is_configured": false, 00:15:03.677 "data_offset": 0, 00:15:03.677 "data_size": 0 00:15:03.677 }, 00:15:03.677 { 00:15:03.677 "name": "BaseBdev2", 00:15:03.677 "uuid": "4292489d-7677-4278-8dd6-e7a5e0f60b7d", 00:15:03.677 "is_configured": true, 00:15:03.677 "data_offset": 0, 00:15:03.677 "data_size": 65536 00:15:03.677 }, 00:15:03.677 { 00:15:03.677 "name": "BaseBdev3", 
00:15:03.677 "uuid": "b05e4a6e-7b47-4bc4-9470-ca02a000445a", 00:15:03.677 "is_configured": true, 00:15:03.677 "data_offset": 0, 00:15:03.677 "data_size": 65536 00:15:03.677 } 00:15:03.677 ] 00:15:03.677 }' 00:15:03.677 15:54:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:03.677 15:54:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:04.244 15:54:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:04.244 [2024-06-10 15:54:09.714762] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:04.244 15:54:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:04.244 15:54:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:04.244 15:54:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:04.244 15:54:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:04.244 15:54:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:04.244 15:54:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:04.245 15:54:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:04.245 15:54:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:04.245 15:54:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:04.245 15:54:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:04.245 15:54:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:04.245 15:54:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:04.505 15:54:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:04.505 "name": "Existed_Raid", 00:15:04.505 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:04.505 "strip_size_kb": 0, 00:15:04.505 "state": "configuring", 00:15:04.505 "raid_level": "raid1", 00:15:04.505 "superblock": false, 00:15:04.505 "num_base_bdevs": 3, 00:15:04.505 "num_base_bdevs_discovered": 1, 00:15:04.505 "num_base_bdevs_operational": 3, 00:15:04.505 "base_bdevs_list": [ 00:15:04.505 { 00:15:04.505 "name": "BaseBdev1", 00:15:04.505 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:04.505 "is_configured": false, 00:15:04.505 "data_offset": 0, 00:15:04.505 "data_size": 0 00:15:04.505 }, 00:15:04.505 { 00:15:04.505 "name": null, 00:15:04.505 "uuid": "4292489d-7677-4278-8dd6-e7a5e0f60b7d", 00:15:04.505 "is_configured": false, 00:15:04.505 "data_offset": 0, 00:15:04.505 "data_size": 65536 00:15:04.505 }, 00:15:04.505 { 00:15:04.505 "name": "BaseBdev3", 00:15:04.505 "uuid": "b05e4a6e-7b47-4bc4-9470-ca02a000445a", 00:15:04.505 "is_configured": true, 00:15:04.505 "data_offset": 0, 00:15:04.505 "data_size": 65536 00:15:04.505 } 00:15:04.505 ] 00:15:04.505 }' 00:15:04.505 15:54:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:04.505 15:54:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:05.074 15:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:05.074 15:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:05.333 
15:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:05.333 15:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:05.591 [2024-06-10 15:54:10.973283] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:05.591 BaseBdev1 00:15:05.591 15:54:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:05.591 15:54:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:15:05.591 15:54:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:05.591 15:54:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:15:05.591 15:54:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:05.591 15:54:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:05.592 15:54:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:05.850 15:54:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:06.109 [ 00:15:06.109 { 00:15:06.109 "name": "BaseBdev1", 00:15:06.109 "aliases": [ 00:15:06.110 "1c52c9c0-1f9a-4903-a93b-8d195f9a38fc" 00:15:06.110 ], 00:15:06.110 "product_name": "Malloc disk", 00:15:06.110 "block_size": 512, 00:15:06.110 "num_blocks": 65536, 00:15:06.110 "uuid": "1c52c9c0-1f9a-4903-a93b-8d195f9a38fc", 00:15:06.110 "assigned_rate_limits": { 00:15:06.110 "rw_ios_per_sec": 0, 00:15:06.110 "rw_mbytes_per_sec": 0, 
00:15:06.110 "r_mbytes_per_sec": 0, 00:15:06.110 "w_mbytes_per_sec": 0 00:15:06.110 }, 00:15:06.110 "claimed": true, 00:15:06.110 "claim_type": "exclusive_write", 00:15:06.110 "zoned": false, 00:15:06.110 "supported_io_types": { 00:15:06.110 "read": true, 00:15:06.110 "write": true, 00:15:06.110 "unmap": true, 00:15:06.110 "write_zeroes": true, 00:15:06.110 "flush": true, 00:15:06.110 "reset": true, 00:15:06.110 "compare": false, 00:15:06.110 "compare_and_write": false, 00:15:06.110 "abort": true, 00:15:06.110 "nvme_admin": false, 00:15:06.110 "nvme_io": false 00:15:06.110 }, 00:15:06.110 "memory_domains": [ 00:15:06.110 { 00:15:06.110 "dma_device_id": "system", 00:15:06.110 "dma_device_type": 1 00:15:06.110 }, 00:15:06.110 { 00:15:06.110 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:06.110 "dma_device_type": 2 00:15:06.110 } 00:15:06.110 ], 00:15:06.110 "driver_specific": {} 00:15:06.110 } 00:15:06.110 ] 00:15:06.110 15:54:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:15:06.110 15:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:06.110 15:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:06.110 15:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:06.110 15:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:06.110 15:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:06.110 15:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:06.110 15:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:06.110 15:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:06.110 15:54:11 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:06.110 15:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:06.110 15:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:06.110 15:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:06.369 15:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:06.369 "name": "Existed_Raid", 00:15:06.369 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:06.369 "strip_size_kb": 0, 00:15:06.369 "state": "configuring", 00:15:06.369 "raid_level": "raid1", 00:15:06.369 "superblock": false, 00:15:06.369 "num_base_bdevs": 3, 00:15:06.369 "num_base_bdevs_discovered": 2, 00:15:06.369 "num_base_bdevs_operational": 3, 00:15:06.369 "base_bdevs_list": [ 00:15:06.369 { 00:15:06.369 "name": "BaseBdev1", 00:15:06.369 "uuid": "1c52c9c0-1f9a-4903-a93b-8d195f9a38fc", 00:15:06.369 "is_configured": true, 00:15:06.369 "data_offset": 0, 00:15:06.369 "data_size": 65536 00:15:06.369 }, 00:15:06.369 { 00:15:06.369 "name": null, 00:15:06.369 "uuid": "4292489d-7677-4278-8dd6-e7a5e0f60b7d", 00:15:06.369 "is_configured": false, 00:15:06.369 "data_offset": 0, 00:15:06.369 "data_size": 65536 00:15:06.369 }, 00:15:06.369 { 00:15:06.369 "name": "BaseBdev3", 00:15:06.369 "uuid": "b05e4a6e-7b47-4bc4-9470-ca02a000445a", 00:15:06.369 "is_configured": true, 00:15:06.369 "data_offset": 0, 00:15:06.369 "data_size": 65536 00:15:06.369 } 00:15:06.369 ] 00:15:06.369 }' 00:15:06.369 15:54:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:06.369 15:54:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:06.937 15:54:12 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:06.937 15:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:07.227 15:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:07.227 15:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:07.486 [2024-06-10 15:54:12.830292] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:07.486 15:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:07.486 15:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:07.486 15:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:07.486 15:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:07.486 15:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:07.486 15:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:07.486 15:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:07.486 15:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:07.486 15:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:07.486 15:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:07.486 15:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:07.486 15:54:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:07.745 15:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:07.745 "name": "Existed_Raid", 00:15:07.745 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:07.745 "strip_size_kb": 0, 00:15:07.746 "state": "configuring", 00:15:07.746 "raid_level": "raid1", 00:15:07.746 "superblock": false, 00:15:07.746 "num_base_bdevs": 3, 00:15:07.746 "num_base_bdevs_discovered": 1, 00:15:07.746 "num_base_bdevs_operational": 3, 00:15:07.746 "base_bdevs_list": [ 00:15:07.746 { 00:15:07.746 "name": "BaseBdev1", 00:15:07.746 "uuid": "1c52c9c0-1f9a-4903-a93b-8d195f9a38fc", 00:15:07.746 "is_configured": true, 00:15:07.746 "data_offset": 0, 00:15:07.746 "data_size": 65536 00:15:07.746 }, 00:15:07.746 { 00:15:07.746 "name": null, 00:15:07.746 "uuid": "4292489d-7677-4278-8dd6-e7a5e0f60b7d", 00:15:07.746 "is_configured": false, 00:15:07.746 "data_offset": 0, 00:15:07.746 "data_size": 65536 00:15:07.746 }, 00:15:07.746 { 00:15:07.746 "name": null, 00:15:07.746 "uuid": "b05e4a6e-7b47-4bc4-9470-ca02a000445a", 00:15:07.746 "is_configured": false, 00:15:07.746 "data_offset": 0, 00:15:07.746 "data_size": 65536 00:15:07.746 } 00:15:07.746 ] 00:15:07.746 }' 00:15:07.746 15:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:07.746 15:54:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:08.313 15:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:08.313 15:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:08.572 15:54:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:08.572 15:54:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:08.830 [2024-06-10 15:54:14.153846] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:08.830 15:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:08.830 15:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:08.830 15:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:08.831 15:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:08.831 15:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:08.831 15:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:08.831 15:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:08.831 15:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:08.831 15:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:08.831 15:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:08.831 15:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:08.831 15:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:09.089 15:54:14 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:09.089 "name": "Existed_Raid", 00:15:09.089 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:09.089 "strip_size_kb": 0, 00:15:09.089 "state": "configuring", 00:15:09.089 "raid_level": "raid1", 00:15:09.089 "superblock": false, 00:15:09.089 "num_base_bdevs": 3, 00:15:09.089 "num_base_bdevs_discovered": 2, 00:15:09.089 "num_base_bdevs_operational": 3, 00:15:09.089 "base_bdevs_list": [ 00:15:09.089 { 00:15:09.089 "name": "BaseBdev1", 00:15:09.089 "uuid": "1c52c9c0-1f9a-4903-a93b-8d195f9a38fc", 00:15:09.089 "is_configured": true, 00:15:09.089 "data_offset": 0, 00:15:09.089 "data_size": 65536 00:15:09.089 }, 00:15:09.089 { 00:15:09.089 "name": null, 00:15:09.089 "uuid": "4292489d-7677-4278-8dd6-e7a5e0f60b7d", 00:15:09.089 "is_configured": false, 00:15:09.089 "data_offset": 0, 00:15:09.089 "data_size": 65536 00:15:09.089 }, 00:15:09.089 { 00:15:09.089 "name": "BaseBdev3", 00:15:09.089 "uuid": "b05e4a6e-7b47-4bc4-9470-ca02a000445a", 00:15:09.089 "is_configured": true, 00:15:09.089 "data_offset": 0, 00:15:09.089 "data_size": 65536 00:15:09.089 } 00:15:09.089 ] 00:15:09.089 }' 00:15:09.089 15:54:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:09.089 15:54:14 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:09.657 15:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:09.657 15:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:09.917 15:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:09.917 15:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 
00:15:10.175 [2024-06-10 15:54:15.509490] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:10.176 15:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:10.176 15:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:10.176 15:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:10.176 15:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:10.176 15:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:10.176 15:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:10.176 15:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:10.176 15:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:10.176 15:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:10.176 15:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:10.176 15:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:10.176 15:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:10.434 15:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:10.434 "name": "Existed_Raid", 00:15:10.434 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:10.434 "strip_size_kb": 0, 00:15:10.434 "state": "configuring", 00:15:10.434 "raid_level": "raid1", 00:15:10.434 "superblock": false, 00:15:10.434 "num_base_bdevs": 3, 00:15:10.434 
"num_base_bdevs_discovered": 1, 00:15:10.434 "num_base_bdevs_operational": 3, 00:15:10.434 "base_bdevs_list": [ 00:15:10.434 { 00:15:10.434 "name": null, 00:15:10.434 "uuid": "1c52c9c0-1f9a-4903-a93b-8d195f9a38fc", 00:15:10.434 "is_configured": false, 00:15:10.434 "data_offset": 0, 00:15:10.434 "data_size": 65536 00:15:10.434 }, 00:15:10.434 { 00:15:10.434 "name": null, 00:15:10.434 "uuid": "4292489d-7677-4278-8dd6-e7a5e0f60b7d", 00:15:10.434 "is_configured": false, 00:15:10.434 "data_offset": 0, 00:15:10.434 "data_size": 65536 00:15:10.434 }, 00:15:10.434 { 00:15:10.434 "name": "BaseBdev3", 00:15:10.434 "uuid": "b05e4a6e-7b47-4bc4-9470-ca02a000445a", 00:15:10.434 "is_configured": true, 00:15:10.434 "data_offset": 0, 00:15:10.434 "data_size": 65536 00:15:10.434 } 00:15:10.434 ] 00:15:10.434 }' 00:15:10.434 15:54:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:10.434 15:54:15 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:11.001 15:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.001 15:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:11.259 15:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:11.259 15:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:11.518 [2024-06-10 15:54:16.895819] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:11.518 15:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:11.518 15:54:16 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:11.518 15:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:11.518 15:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:11.518 15:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:11.518 15:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:11.518 15:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:11.518 15:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:11.518 15:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:11.518 15:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:11.518 15:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:11.518 15:54:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:11.777 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:11.777 "name": "Existed_Raid", 00:15:11.777 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:11.777 "strip_size_kb": 0, 00:15:11.777 "state": "configuring", 00:15:11.777 "raid_level": "raid1", 00:15:11.777 "superblock": false, 00:15:11.777 "num_base_bdevs": 3, 00:15:11.777 "num_base_bdevs_discovered": 2, 00:15:11.777 "num_base_bdevs_operational": 3, 00:15:11.777 "base_bdevs_list": [ 00:15:11.777 { 00:15:11.777 "name": null, 00:15:11.777 "uuid": "1c52c9c0-1f9a-4903-a93b-8d195f9a38fc", 00:15:11.777 "is_configured": false, 00:15:11.777 "data_offset": 0, 
00:15:11.777 "data_size": 65536 00:15:11.777 }, 00:15:11.777 { 00:15:11.777 "name": "BaseBdev2", 00:15:11.777 "uuid": "4292489d-7677-4278-8dd6-e7a5e0f60b7d", 00:15:11.777 "is_configured": true, 00:15:11.777 "data_offset": 0, 00:15:11.777 "data_size": 65536 00:15:11.777 }, 00:15:11.777 { 00:15:11.777 "name": "BaseBdev3", 00:15:11.777 "uuid": "b05e4a6e-7b47-4bc4-9470-ca02a000445a", 00:15:11.777 "is_configured": true, 00:15:11.777 "data_offset": 0, 00:15:11.777 "data_size": 65536 00:15:11.777 } 00:15:11.777 ] 00:15:11.777 }' 00:15:11.777 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:11.777 15:54:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:12.344 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.344 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:12.603 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:12.603 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:12.603 15:54:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:12.862 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 1c52c9c0-1f9a-4903-a93b-8d195f9a38fc 00:15:13.120 [2024-06-10 15:54:18.411186] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:13.121 [2024-06-10 15:54:18.411223] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 
0x12c66f0 00:15:13.121 [2024-06-10 15:54:18.411230] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:15:13.121 [2024-06-10 15:54:18.411433] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x146f280 00:15:13.121 [2024-06-10 15:54:18.411560] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12c66f0 00:15:13.121 [2024-06-10 15:54:18.411573] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12c66f0 00:15:13.121 [2024-06-10 15:54:18.411732] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:13.121 NewBaseBdev 00:15:13.121 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:13.121 15:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:15:13.121 15:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:13.121 15:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:15:13.121 15:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:13.121 15:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:13.121 15:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:13.379 15:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:13.379 [ 00:15:13.379 { 00:15:13.379 "name": "NewBaseBdev", 00:15:13.379 "aliases": [ 00:15:13.379 "1c52c9c0-1f9a-4903-a93b-8d195f9a38fc" 00:15:13.379 ], 00:15:13.379 "product_name": "Malloc disk", 00:15:13.379 
"block_size": 512, 00:15:13.379 "num_blocks": 65536, 00:15:13.379 "uuid": "1c52c9c0-1f9a-4903-a93b-8d195f9a38fc", 00:15:13.379 "assigned_rate_limits": { 00:15:13.379 "rw_ios_per_sec": 0, 00:15:13.379 "rw_mbytes_per_sec": 0, 00:15:13.379 "r_mbytes_per_sec": 0, 00:15:13.379 "w_mbytes_per_sec": 0 00:15:13.379 }, 00:15:13.379 "claimed": true, 00:15:13.379 "claim_type": "exclusive_write", 00:15:13.379 "zoned": false, 00:15:13.379 "supported_io_types": { 00:15:13.379 "read": true, 00:15:13.379 "write": true, 00:15:13.379 "unmap": true, 00:15:13.379 "write_zeroes": true, 00:15:13.379 "flush": true, 00:15:13.379 "reset": true, 00:15:13.379 "compare": false, 00:15:13.379 "compare_and_write": false, 00:15:13.379 "abort": true, 00:15:13.379 "nvme_admin": false, 00:15:13.380 "nvme_io": false 00:15:13.380 }, 00:15:13.380 "memory_domains": [ 00:15:13.380 { 00:15:13.380 "dma_device_id": "system", 00:15:13.380 "dma_device_type": 1 00:15:13.380 }, 00:15:13.380 { 00:15:13.380 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:13.380 "dma_device_type": 2 00:15:13.380 } 00:15:13.380 ], 00:15:13.380 "driver_specific": {} 00:15:13.380 } 00:15:13.380 ] 00:15:13.380 15:54:18 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:15:13.380 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:15:13.380 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:13.380 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:13.380 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:13.380 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:13.380 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:13.380 15:54:18 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:13.380 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:13.380 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:13.380 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:13.380 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:13.380 15:54:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:13.638 15:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:13.638 "name": "Existed_Raid", 00:15:13.638 "uuid": "890d1f6c-f71b-42ca-828b-10ff1c3dd3d1", 00:15:13.638 "strip_size_kb": 0, 00:15:13.638 "state": "online", 00:15:13.638 "raid_level": "raid1", 00:15:13.638 "superblock": false, 00:15:13.638 "num_base_bdevs": 3, 00:15:13.638 "num_base_bdevs_discovered": 3, 00:15:13.638 "num_base_bdevs_operational": 3, 00:15:13.638 "base_bdevs_list": [ 00:15:13.638 { 00:15:13.638 "name": "NewBaseBdev", 00:15:13.638 "uuid": "1c52c9c0-1f9a-4903-a93b-8d195f9a38fc", 00:15:13.638 "is_configured": true, 00:15:13.638 "data_offset": 0, 00:15:13.638 "data_size": 65536 00:15:13.638 }, 00:15:13.638 { 00:15:13.638 "name": "BaseBdev2", 00:15:13.638 "uuid": "4292489d-7677-4278-8dd6-e7a5e0f60b7d", 00:15:13.638 "is_configured": true, 00:15:13.638 "data_offset": 0, 00:15:13.638 "data_size": 65536 00:15:13.638 }, 00:15:13.638 { 00:15:13.638 "name": "BaseBdev3", 00:15:13.638 "uuid": "b05e4a6e-7b47-4bc4-9470-ca02a000445a", 00:15:13.638 "is_configured": true, 00:15:13.638 "data_offset": 0, 00:15:13.638 "data_size": 65536 00:15:13.639 } 00:15:13.639 ] 00:15:13.639 }' 00:15:13.639 15:54:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:13.639 15:54:19 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:14.575 15:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:14.575 15:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:14.575 15:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:14.575 15:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:14.575 15:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:14.575 15:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:14.575 15:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:14.575 15:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:14.575 [2024-06-10 15:54:19.955685] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:14.575 15:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:14.575 "name": "Existed_Raid", 00:15:14.575 "aliases": [ 00:15:14.575 "890d1f6c-f71b-42ca-828b-10ff1c3dd3d1" 00:15:14.575 ], 00:15:14.575 "product_name": "Raid Volume", 00:15:14.575 "block_size": 512, 00:15:14.575 "num_blocks": 65536, 00:15:14.575 "uuid": "890d1f6c-f71b-42ca-828b-10ff1c3dd3d1", 00:15:14.575 "assigned_rate_limits": { 00:15:14.575 "rw_ios_per_sec": 0, 00:15:14.575 "rw_mbytes_per_sec": 0, 00:15:14.575 "r_mbytes_per_sec": 0, 00:15:14.575 "w_mbytes_per_sec": 0 00:15:14.575 }, 00:15:14.575 "claimed": false, 00:15:14.575 "zoned": false, 00:15:14.575 "supported_io_types": { 00:15:14.575 "read": 
true, 00:15:14.575 "write": true, 00:15:14.575 "unmap": false, 00:15:14.575 "write_zeroes": true, 00:15:14.575 "flush": false, 00:15:14.575 "reset": true, 00:15:14.575 "compare": false, 00:15:14.575 "compare_and_write": false, 00:15:14.575 "abort": false, 00:15:14.575 "nvme_admin": false, 00:15:14.575 "nvme_io": false 00:15:14.575 }, 00:15:14.575 "memory_domains": [ 00:15:14.575 { 00:15:14.575 "dma_device_id": "system", 00:15:14.575 "dma_device_type": 1 00:15:14.575 }, 00:15:14.575 { 00:15:14.575 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.575 "dma_device_type": 2 00:15:14.575 }, 00:15:14.575 { 00:15:14.575 "dma_device_id": "system", 00:15:14.575 "dma_device_type": 1 00:15:14.575 }, 00:15:14.575 { 00:15:14.576 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.576 "dma_device_type": 2 00:15:14.576 }, 00:15:14.576 { 00:15:14.576 "dma_device_id": "system", 00:15:14.576 "dma_device_type": 1 00:15:14.576 }, 00:15:14.576 { 00:15:14.576 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.576 "dma_device_type": 2 00:15:14.576 } 00:15:14.576 ], 00:15:14.576 "driver_specific": { 00:15:14.576 "raid": { 00:15:14.576 "uuid": "890d1f6c-f71b-42ca-828b-10ff1c3dd3d1", 00:15:14.576 "strip_size_kb": 0, 00:15:14.576 "state": "online", 00:15:14.576 "raid_level": "raid1", 00:15:14.576 "superblock": false, 00:15:14.576 "num_base_bdevs": 3, 00:15:14.576 "num_base_bdevs_discovered": 3, 00:15:14.576 "num_base_bdevs_operational": 3, 00:15:14.576 "base_bdevs_list": [ 00:15:14.576 { 00:15:14.576 "name": "NewBaseBdev", 00:15:14.576 "uuid": "1c52c9c0-1f9a-4903-a93b-8d195f9a38fc", 00:15:14.576 "is_configured": true, 00:15:14.576 "data_offset": 0, 00:15:14.576 "data_size": 65536 00:15:14.576 }, 00:15:14.576 { 00:15:14.576 "name": "BaseBdev2", 00:15:14.576 "uuid": "4292489d-7677-4278-8dd6-e7a5e0f60b7d", 00:15:14.576 "is_configured": true, 00:15:14.576 "data_offset": 0, 00:15:14.576 "data_size": 65536 00:15:14.576 }, 00:15:14.576 { 00:15:14.576 "name": "BaseBdev3", 00:15:14.576 "uuid": 
"b05e4a6e-7b47-4bc4-9470-ca02a000445a", 00:15:14.576 "is_configured": true, 00:15:14.576 "data_offset": 0, 00:15:14.576 "data_size": 65536 00:15:14.576 } 00:15:14.576 ] 00:15:14.576 } 00:15:14.576 } 00:15:14.576 }' 00:15:14.576 15:54:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:14.576 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:14.576 BaseBdev2 00:15:14.576 BaseBdev3' 00:15:14.576 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:14.576 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:14.576 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:14.835 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:14.835 "name": "NewBaseBdev", 00:15:14.835 "aliases": [ 00:15:14.835 "1c52c9c0-1f9a-4903-a93b-8d195f9a38fc" 00:15:14.835 ], 00:15:14.835 "product_name": "Malloc disk", 00:15:14.835 "block_size": 512, 00:15:14.835 "num_blocks": 65536, 00:15:14.835 "uuid": "1c52c9c0-1f9a-4903-a93b-8d195f9a38fc", 00:15:14.835 "assigned_rate_limits": { 00:15:14.835 "rw_ios_per_sec": 0, 00:15:14.835 "rw_mbytes_per_sec": 0, 00:15:14.835 "r_mbytes_per_sec": 0, 00:15:14.835 "w_mbytes_per_sec": 0 00:15:14.835 }, 00:15:14.835 "claimed": true, 00:15:14.835 "claim_type": "exclusive_write", 00:15:14.835 "zoned": false, 00:15:14.835 "supported_io_types": { 00:15:14.835 "read": true, 00:15:14.835 "write": true, 00:15:14.835 "unmap": true, 00:15:14.835 "write_zeroes": true, 00:15:14.835 "flush": true, 00:15:14.835 "reset": true, 00:15:14.835 "compare": false, 00:15:14.835 "compare_and_write": false, 00:15:14.835 "abort": true, 
00:15:14.835 "nvme_admin": false, 00:15:14.835 "nvme_io": false 00:15:14.835 }, 00:15:14.835 "memory_domains": [ 00:15:14.835 { 00:15:14.835 "dma_device_id": "system", 00:15:14.835 "dma_device_type": 1 00:15:14.835 }, 00:15:14.835 { 00:15:14.835 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:14.835 "dma_device_type": 2 00:15:14.835 } 00:15:14.835 ], 00:15:14.835 "driver_specific": {} 00:15:14.835 }' 00:15:14.835 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:14.835 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:15.095 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:15.095 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:15.095 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:15.095 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:15.095 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:15.095 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:15.095 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:15.095 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:15.095 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:15.095 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:15.095 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:15.095 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:15.095 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:15.354 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:15.354 "name": "BaseBdev2", 00:15:15.354 "aliases": [ 00:15:15.354 "4292489d-7677-4278-8dd6-e7a5e0f60b7d" 00:15:15.354 ], 00:15:15.354 "product_name": "Malloc disk", 00:15:15.354 "block_size": 512, 00:15:15.354 "num_blocks": 65536, 00:15:15.354 "uuid": "4292489d-7677-4278-8dd6-e7a5e0f60b7d", 00:15:15.354 "assigned_rate_limits": { 00:15:15.354 "rw_ios_per_sec": 0, 00:15:15.354 "rw_mbytes_per_sec": 0, 00:15:15.354 "r_mbytes_per_sec": 0, 00:15:15.354 "w_mbytes_per_sec": 0 00:15:15.354 }, 00:15:15.354 "claimed": true, 00:15:15.354 "claim_type": "exclusive_write", 00:15:15.354 "zoned": false, 00:15:15.354 "supported_io_types": { 00:15:15.354 "read": true, 00:15:15.354 "write": true, 00:15:15.354 "unmap": true, 00:15:15.354 "write_zeroes": true, 00:15:15.354 "flush": true, 00:15:15.354 "reset": true, 00:15:15.354 "compare": false, 00:15:15.354 "compare_and_write": false, 00:15:15.354 "abort": true, 00:15:15.354 "nvme_admin": false, 00:15:15.354 "nvme_io": false 00:15:15.354 }, 00:15:15.354 "memory_domains": [ 00:15:15.354 { 00:15:15.354 "dma_device_id": "system", 00:15:15.354 "dma_device_type": 1 00:15:15.354 }, 00:15:15.354 { 00:15:15.354 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:15.354 "dma_device_type": 2 00:15:15.354 } 00:15:15.354 ], 00:15:15.354 "driver_specific": {} 00:15:15.354 }' 00:15:15.354 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:15.612 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:15.612 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:15.612 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:15.612 15:54:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:15.612 
15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:15.612 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:15.613 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:15.613 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:15.613 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:15.871 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:15.871 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:15.871 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:15.871 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:15.871 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:16.130 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:16.130 "name": "BaseBdev3", 00:15:16.130 "aliases": [ 00:15:16.130 "b05e4a6e-7b47-4bc4-9470-ca02a000445a" 00:15:16.130 ], 00:15:16.130 "product_name": "Malloc disk", 00:15:16.130 "block_size": 512, 00:15:16.130 "num_blocks": 65536, 00:15:16.130 "uuid": "b05e4a6e-7b47-4bc4-9470-ca02a000445a", 00:15:16.130 "assigned_rate_limits": { 00:15:16.130 "rw_ios_per_sec": 0, 00:15:16.130 "rw_mbytes_per_sec": 0, 00:15:16.130 "r_mbytes_per_sec": 0, 00:15:16.130 "w_mbytes_per_sec": 0 00:15:16.130 }, 00:15:16.130 "claimed": true, 00:15:16.130 "claim_type": "exclusive_write", 00:15:16.130 "zoned": false, 00:15:16.130 "supported_io_types": { 00:15:16.130 "read": true, 00:15:16.130 "write": true, 00:15:16.130 "unmap": true, 00:15:16.130 "write_zeroes": true, 
00:15:16.130 "flush": true, 00:15:16.130 "reset": true, 00:15:16.130 "compare": false, 00:15:16.130 "compare_and_write": false, 00:15:16.130 "abort": true, 00:15:16.130 "nvme_admin": false, 00:15:16.130 "nvme_io": false 00:15:16.130 }, 00:15:16.130 "memory_domains": [ 00:15:16.130 { 00:15:16.130 "dma_device_id": "system", 00:15:16.130 "dma_device_type": 1 00:15:16.130 }, 00:15:16.130 { 00:15:16.130 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:16.130 "dma_device_type": 2 00:15:16.130 } 00:15:16.130 ], 00:15:16.130 "driver_specific": {} 00:15:16.130 }' 00:15:16.130 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:16.130 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:16.130 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:16.130 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:16.130 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:16.130 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:16.130 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:16.130 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:16.389 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:16.389 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:16.389 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:16.389 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:16.389 15:54:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete 
Existed_Raid 00:15:16.648 [2024-06-10 15:54:21.988851] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:16.648 [2024-06-10 15:54:21.988874] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:16.648 [2024-06-10 15:54:21.988923] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:16.648 [2024-06-10 15:54:21.989201] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:16.648 [2024-06-10 15:54:21.989216] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12c66f0 name Existed_Raid, state offline 00:15:16.648 15:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2688845 00:15:16.648 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 2688845 ']' 00:15:16.648 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 2688845 00:15:16.648 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:15:16.648 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:15:16.648 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2688845 00:15:16.648 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:15:16.648 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:15:16.648 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2688845' 00:15:16.648 killing process with pid 2688845 00:15:16.648 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 2688845 00:15:16.648 [2024-06-10 15:54:22.053213] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: 
raid_bdev_fini_start 00:15:16.648 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 2688845 00:15:16.648 [2024-06-10 15:54:22.078330] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:15:16.907 00:15:16.907 real 0m28.741s 00:15:16.907 user 0m53.884s 00:15:16.907 sys 0m4.022s 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:15:16.907 ************************************ 00:15:16.907 END TEST raid_state_function_test 00:15:16.907 ************************************ 00:15:16.907 15:54:22 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 3 true 00:15:16.907 15:54:22 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:15:16.907 15:54:22 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:16.907 15:54:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:16.907 ************************************ 00:15:16.907 START TEST raid_state_function_test_sb 00:15:16.907 ************************************ 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 3 true 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=3 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 
00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # 
strip_size=0 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2694727 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2694727' 00:15:16.907 Process raid pid: 2694727 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2694727 /var/tmp/spdk-raid.sock 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 2694727 ']' 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:16.907 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:15:16.907 15:54:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:16.907 [2024-06-10 15:54:22.399249] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:15:16.907 [2024-06-10 15:54:22.399302] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:15:17.166 [2024-06-10 15:54:22.500426] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:17.166 [2024-06-10 15:54:22.594641] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:15:17.166 [2024-06-10 15:54:22.655534] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:17.166 [2024-06-10 15:54:22.655552] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:18.102 15:54:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:15:18.102 15:54:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:15:18.102 15:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:18.102 [2024-06-10 15:54:23.598078] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:18.102 [2024-06-10 15:54:23.598116] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:18.102 [2024-06-10 15:54:23.598125] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:18.102 [2024-06-10 15:54:23.598134] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:18.102 [2024-06-10 15:54:23.598141] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:18.102 [2024-06-10 15:54:23.598149] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:18.361 15:54:23 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:18.362 15:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:18.362 15:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:18.362 15:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:18.362 15:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:18.362 15:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:18.362 15:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:18.362 15:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:18.362 15:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:18.362 15:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:18.362 15:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:18.362 15:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:18.362 15:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:18.362 "name": "Existed_Raid", 00:15:18.362 "uuid": "3b37a9c3-7b7a-489f-9378-018065733515", 00:15:18.362 "strip_size_kb": 0, 00:15:18.362 "state": "configuring", 00:15:18.362 "raid_level": "raid1", 00:15:18.362 "superblock": true, 00:15:18.362 "num_base_bdevs": 3, 00:15:18.362 "num_base_bdevs_discovered": 0, 00:15:18.362 "num_base_bdevs_operational": 3, 00:15:18.362 
"base_bdevs_list": [ 00:15:18.362 { 00:15:18.362 "name": "BaseBdev1", 00:15:18.362 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:18.362 "is_configured": false, 00:15:18.362 "data_offset": 0, 00:15:18.362 "data_size": 0 00:15:18.362 }, 00:15:18.362 { 00:15:18.362 "name": "BaseBdev2", 00:15:18.362 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:18.362 "is_configured": false, 00:15:18.362 "data_offset": 0, 00:15:18.362 "data_size": 0 00:15:18.362 }, 00:15:18.362 { 00:15:18.362 "name": "BaseBdev3", 00:15:18.362 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:18.362 "is_configured": false, 00:15:18.362 "data_offset": 0, 00:15:18.362 "data_size": 0 00:15:18.362 } 00:15:18.362 ] 00:15:18.362 }' 00:15:18.362 15:54:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:18.362 15:54:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:18.930 15:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:19.188 [2024-06-10 15:54:24.512531] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:19.188 [2024-06-10 15:54:24.512560] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8b4120 name Existed_Raid, state configuring 00:15:19.188 15:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:19.448 [2024-06-10 15:54:24.765224] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:19.448 [2024-06-10 15:54:24.765254] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:19.448 [2024-06-10 15:54:24.765262] bdev.c:8114:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:19.448 [2024-06-10 15:54:24.765271] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:19.448 [2024-06-10 15:54:24.765277] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:19.448 [2024-06-10 15:54:24.765286] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:19.448 15:54:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:19.707 [2024-06-10 15:54:25.031509] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:19.707 BaseBdev1 00:15:19.707 15:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:15:19.707 15:54:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:15:19.707 15:54:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:19.707 15:54:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:19.707 15:54:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:19.707 15:54:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:19.707 15:54:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:19.966 15:54:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:20.225 [ 00:15:20.225 { 00:15:20.225 "name": 
"BaseBdev1", 00:15:20.225 "aliases": [ 00:15:20.225 "3e52470a-caec-470d-942b-671541ac43a5" 00:15:20.225 ], 00:15:20.225 "product_name": "Malloc disk", 00:15:20.225 "block_size": 512, 00:15:20.225 "num_blocks": 65536, 00:15:20.225 "uuid": "3e52470a-caec-470d-942b-671541ac43a5", 00:15:20.225 "assigned_rate_limits": { 00:15:20.225 "rw_ios_per_sec": 0, 00:15:20.225 "rw_mbytes_per_sec": 0, 00:15:20.225 "r_mbytes_per_sec": 0, 00:15:20.225 "w_mbytes_per_sec": 0 00:15:20.225 }, 00:15:20.225 "claimed": true, 00:15:20.225 "claim_type": "exclusive_write", 00:15:20.225 "zoned": false, 00:15:20.225 "supported_io_types": { 00:15:20.225 "read": true, 00:15:20.225 "write": true, 00:15:20.225 "unmap": true, 00:15:20.225 "write_zeroes": true, 00:15:20.225 "flush": true, 00:15:20.225 "reset": true, 00:15:20.225 "compare": false, 00:15:20.225 "compare_and_write": false, 00:15:20.225 "abort": true, 00:15:20.225 "nvme_admin": false, 00:15:20.225 "nvme_io": false 00:15:20.225 }, 00:15:20.225 "memory_domains": [ 00:15:20.225 { 00:15:20.225 "dma_device_id": "system", 00:15:20.225 "dma_device_type": 1 00:15:20.225 }, 00:15:20.225 { 00:15:20.225 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:20.225 "dma_device_type": 2 00:15:20.225 } 00:15:20.225 ], 00:15:20.225 "driver_specific": {} 00:15:20.225 } 00:15:20.226 ] 00:15:20.226 15:54:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:20.226 15:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:20.226 15:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:20.226 15:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:20.226 15:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:20.226 15:54:25 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:20.226 15:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:20.226 15:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:20.226 15:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:20.226 15:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:20.226 15:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:20.226 15:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:20.226 15:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:20.226 15:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:20.226 "name": "Existed_Raid", 00:15:20.226 "uuid": "0fd55209-f0ef-4328-87e9-194bebf4c7dc", 00:15:20.226 "strip_size_kb": 0, 00:15:20.226 "state": "configuring", 00:15:20.226 "raid_level": "raid1", 00:15:20.226 "superblock": true, 00:15:20.226 "num_base_bdevs": 3, 00:15:20.226 "num_base_bdevs_discovered": 1, 00:15:20.226 "num_base_bdevs_operational": 3, 00:15:20.226 "base_bdevs_list": [ 00:15:20.226 { 00:15:20.226 "name": "BaseBdev1", 00:15:20.226 "uuid": "3e52470a-caec-470d-942b-671541ac43a5", 00:15:20.226 "is_configured": true, 00:15:20.226 "data_offset": 2048, 00:15:20.226 "data_size": 63488 00:15:20.226 }, 00:15:20.226 { 00:15:20.226 "name": "BaseBdev2", 00:15:20.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:20.226 "is_configured": false, 00:15:20.226 "data_offset": 0, 00:15:20.226 "data_size": 0 00:15:20.226 }, 00:15:20.226 { 00:15:20.226 "name": "BaseBdev3", 00:15:20.226 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:15:20.226 "is_configured": false, 00:15:20.226 "data_offset": 0, 00:15:20.226 "data_size": 0 00:15:20.226 } 00:15:20.226 ] 00:15:20.226 }' 00:15:20.226 15:54:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:20.226 15:54:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:21.162 15:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:21.162 [2024-06-10 15:54:26.555576] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:21.163 [2024-06-10 15:54:26.555612] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8b39b0 name Existed_Raid, state configuring 00:15:21.163 15:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:21.421 [2024-06-10 15:54:26.812296] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:21.421 [2024-06-10 15:54:26.813814] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:15:21.421 [2024-06-10 15:54:26.813844] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:15:21.421 [2024-06-10 15:54:26.813853] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:15:21.421 [2024-06-10 15:54:26.813861] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:15:21.421 15:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:15:21.421 15:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 
00:15:21.421 15:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:21.421 15:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:21.421 15:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:21.421 15:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:21.421 15:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:21.421 15:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:21.421 15:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:21.421 15:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:21.421 15:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:21.421 15:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:21.421 15:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:21.421 15:54:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:21.683 15:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:21.683 "name": "Existed_Raid", 00:15:21.683 "uuid": "81db7daa-be11-4eac-9c7c-927caf907a04", 00:15:21.683 "strip_size_kb": 0, 00:15:21.683 "state": "configuring", 00:15:21.683 "raid_level": "raid1", 00:15:21.683 "superblock": true, 00:15:21.683 "num_base_bdevs": 3, 00:15:21.683 "num_base_bdevs_discovered": 1, 00:15:21.683 "num_base_bdevs_operational": 3, 
00:15:21.683 "base_bdevs_list": [ 00:15:21.683 { 00:15:21.683 "name": "BaseBdev1", 00:15:21.683 "uuid": "3e52470a-caec-470d-942b-671541ac43a5", 00:15:21.683 "is_configured": true, 00:15:21.683 "data_offset": 2048, 00:15:21.683 "data_size": 63488 00:15:21.683 }, 00:15:21.683 { 00:15:21.683 "name": "BaseBdev2", 00:15:21.683 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:21.683 "is_configured": false, 00:15:21.683 "data_offset": 0, 00:15:21.683 "data_size": 0 00:15:21.683 }, 00:15:21.683 { 00:15:21.683 "name": "BaseBdev3", 00:15:21.683 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:21.683 "is_configured": false, 00:15:21.683 "data_offset": 0, 00:15:21.683 "data_size": 0 00:15:21.683 } 00:15:21.683 ] 00:15:21.683 }' 00:15:21.683 15:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:21.683 15:54:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:22.251 15:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:22.510 [2024-06-10 15:54:27.930562] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:22.510 BaseBdev2 00:15:22.510 15:54:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:15:22.510 15:54:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:15:22.510 15:54:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:22.510 15:54:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:22.510 15:54:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:22.510 15:54:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 
00:15:22.510 15:54:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:22.769 15:54:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:23.028 [ 00:15:23.028 { 00:15:23.028 "name": "BaseBdev2", 00:15:23.028 "aliases": [ 00:15:23.028 "c2f813e9-7dc3-4fd9-9652-db918e1889c5" 00:15:23.028 ], 00:15:23.028 "product_name": "Malloc disk", 00:15:23.028 "block_size": 512, 00:15:23.028 "num_blocks": 65536, 00:15:23.028 "uuid": "c2f813e9-7dc3-4fd9-9652-db918e1889c5", 00:15:23.028 "assigned_rate_limits": { 00:15:23.028 "rw_ios_per_sec": 0, 00:15:23.028 "rw_mbytes_per_sec": 0, 00:15:23.028 "r_mbytes_per_sec": 0, 00:15:23.028 "w_mbytes_per_sec": 0 00:15:23.028 }, 00:15:23.028 "claimed": true, 00:15:23.028 "claim_type": "exclusive_write", 00:15:23.028 "zoned": false, 00:15:23.028 "supported_io_types": { 00:15:23.028 "read": true, 00:15:23.028 "write": true, 00:15:23.028 "unmap": true, 00:15:23.028 "write_zeroes": true, 00:15:23.028 "flush": true, 00:15:23.028 "reset": true, 00:15:23.028 "compare": false, 00:15:23.028 "compare_and_write": false, 00:15:23.028 "abort": true, 00:15:23.028 "nvme_admin": false, 00:15:23.028 "nvme_io": false 00:15:23.028 }, 00:15:23.028 "memory_domains": [ 00:15:23.028 { 00:15:23.028 "dma_device_id": "system", 00:15:23.028 "dma_device_type": 1 00:15:23.028 }, 00:15:23.028 { 00:15:23.028 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:23.028 "dma_device_type": 2 00:15:23.028 } 00:15:23.028 ], 00:15:23.028 "driver_specific": {} 00:15:23.028 } 00:15:23.028 ] 00:15:23.028 15:54:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:23.028 15:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ 
)) 00:15:23.028 15:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:23.028 15:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:23.028 15:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:23.028 15:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:23.028 15:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:23.028 15:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:23.028 15:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:23.028 15:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:23.028 15:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:23.028 15:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:23.028 15:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:23.028 15:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:23.028 15:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:23.286 15:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:23.286 "name": "Existed_Raid", 00:15:23.286 "uuid": "81db7daa-be11-4eac-9c7c-927caf907a04", 00:15:23.286 "strip_size_kb": 0, 00:15:23.286 "state": "configuring", 00:15:23.286 "raid_level": "raid1", 00:15:23.287 "superblock": true, 
00:15:23.287 "num_base_bdevs": 3, 00:15:23.287 "num_base_bdevs_discovered": 2, 00:15:23.287 "num_base_bdevs_operational": 3, 00:15:23.287 "base_bdevs_list": [ 00:15:23.287 { 00:15:23.287 "name": "BaseBdev1", 00:15:23.287 "uuid": "3e52470a-caec-470d-942b-671541ac43a5", 00:15:23.287 "is_configured": true, 00:15:23.287 "data_offset": 2048, 00:15:23.287 "data_size": 63488 00:15:23.287 }, 00:15:23.287 { 00:15:23.287 "name": "BaseBdev2", 00:15:23.287 "uuid": "c2f813e9-7dc3-4fd9-9652-db918e1889c5", 00:15:23.287 "is_configured": true, 00:15:23.287 "data_offset": 2048, 00:15:23.287 "data_size": 63488 00:15:23.287 }, 00:15:23.287 { 00:15:23.287 "name": "BaseBdev3", 00:15:23.287 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:23.287 "is_configured": false, 00:15:23.287 "data_offset": 0, 00:15:23.287 "data_size": 0 00:15:23.287 } 00:15:23.287 ] 00:15:23.287 }' 00:15:23.287 15:54:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:23.287 15:54:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:23.854 15:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:24.113 [2024-06-10 15:54:29.558254] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:24.113 [2024-06-10 15:54:29.558414] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x8b48c0 00:15:24.113 [2024-06-10 15:54:29.558427] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:24.113 [2024-06-10 15:54:29.558604] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x580d30 00:15:24.113 [2024-06-10 15:54:29.558736] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8b48c0 00:15:24.113 [2024-06-10 15:54:29.558745] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: 
raid bdev is created with name Existed_Raid, raid_bdev 0x8b48c0 00:15:24.113 [2024-06-10 15:54:29.558842] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:24.113 BaseBdev3 00:15:24.113 15:54:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:15:24.113 15:54:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:15:24.113 15:54:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:24.113 15:54:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:24.113 15:54:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:24.113 15:54:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:24.113 15:54:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:24.373 15:54:29 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:24.632 [ 00:15:24.632 { 00:15:24.632 "name": "BaseBdev3", 00:15:24.632 "aliases": [ 00:15:24.632 "25851d14-c13f-40af-b98f-8d02610c7bdb" 00:15:24.632 ], 00:15:24.632 "product_name": "Malloc disk", 00:15:24.632 "block_size": 512, 00:15:24.632 "num_blocks": 65536, 00:15:24.632 "uuid": "25851d14-c13f-40af-b98f-8d02610c7bdb", 00:15:24.632 "assigned_rate_limits": { 00:15:24.632 "rw_ios_per_sec": 0, 00:15:24.632 "rw_mbytes_per_sec": 0, 00:15:24.632 "r_mbytes_per_sec": 0, 00:15:24.632 "w_mbytes_per_sec": 0 00:15:24.632 }, 00:15:24.632 "claimed": true, 00:15:24.632 "claim_type": "exclusive_write", 00:15:24.632 "zoned": false, 00:15:24.632 "supported_io_types": { 00:15:24.632 
"read": true, 00:15:24.632 "write": true, 00:15:24.632 "unmap": true, 00:15:24.632 "write_zeroes": true, 00:15:24.632 "flush": true, 00:15:24.632 "reset": true, 00:15:24.632 "compare": false, 00:15:24.632 "compare_and_write": false, 00:15:24.632 "abort": true, 00:15:24.632 "nvme_admin": false, 00:15:24.632 "nvme_io": false 00:15:24.632 }, 00:15:24.632 "memory_domains": [ 00:15:24.632 { 00:15:24.632 "dma_device_id": "system", 00:15:24.632 "dma_device_type": 1 00:15:24.632 }, 00:15:24.632 { 00:15:24.632 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:24.632 "dma_device_type": 2 00:15:24.632 } 00:15:24.632 ], 00:15:24.632 "driver_specific": {} 00:15:24.632 } 00:15:24.632 ] 00:15:24.632 15:54:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:24.632 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:15:24.632 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:15:24.632 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:15:24.632 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:24.632 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:24.632 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:24.632 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:24.632 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:24.632 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:24.632 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:24.632 15:54:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:24.632 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:24.632 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:24.632 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:24.891 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:24.891 "name": "Existed_Raid", 00:15:24.891 "uuid": "81db7daa-be11-4eac-9c7c-927caf907a04", 00:15:24.891 "strip_size_kb": 0, 00:15:24.891 "state": "online", 00:15:24.891 "raid_level": "raid1", 00:15:24.891 "superblock": true, 00:15:24.891 "num_base_bdevs": 3, 00:15:24.891 "num_base_bdevs_discovered": 3, 00:15:24.891 "num_base_bdevs_operational": 3, 00:15:24.891 "base_bdevs_list": [ 00:15:24.891 { 00:15:24.891 "name": "BaseBdev1", 00:15:24.891 "uuid": "3e52470a-caec-470d-942b-671541ac43a5", 00:15:24.891 "is_configured": true, 00:15:24.891 "data_offset": 2048, 00:15:24.891 "data_size": 63488 00:15:24.891 }, 00:15:24.891 { 00:15:24.891 "name": "BaseBdev2", 00:15:24.891 "uuid": "c2f813e9-7dc3-4fd9-9652-db918e1889c5", 00:15:24.891 "is_configured": true, 00:15:24.891 "data_offset": 2048, 00:15:24.891 "data_size": 63488 00:15:24.891 }, 00:15:24.891 { 00:15:24.891 "name": "BaseBdev3", 00:15:24.891 "uuid": "25851d14-c13f-40af-b98f-8d02610c7bdb", 00:15:24.891 "is_configured": true, 00:15:24.891 "data_offset": 2048, 00:15:24.891 "data_size": 63488 00:15:24.891 } 00:15:24.891 ] 00:15:24.891 }' 00:15:24.891 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:24.891 15:54:30 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:25.458 15:54:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:15:25.458 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:25.458 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:25.458 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:25.458 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:25.458 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:15:25.458 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:25.458 15:54:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:25.716 [2024-06-10 15:54:31.190898] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:25.716 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:25.716 "name": "Existed_Raid", 00:15:25.716 "aliases": [ 00:15:25.716 "81db7daa-be11-4eac-9c7c-927caf907a04" 00:15:25.716 ], 00:15:25.716 "product_name": "Raid Volume", 00:15:25.716 "block_size": 512, 00:15:25.716 "num_blocks": 63488, 00:15:25.716 "uuid": "81db7daa-be11-4eac-9c7c-927caf907a04", 00:15:25.716 "assigned_rate_limits": { 00:15:25.716 "rw_ios_per_sec": 0, 00:15:25.716 "rw_mbytes_per_sec": 0, 00:15:25.716 "r_mbytes_per_sec": 0, 00:15:25.716 "w_mbytes_per_sec": 0 00:15:25.716 }, 00:15:25.716 "claimed": false, 00:15:25.716 "zoned": false, 00:15:25.716 "supported_io_types": { 00:15:25.716 "read": true, 00:15:25.716 "write": true, 00:15:25.716 "unmap": false, 00:15:25.716 "write_zeroes": true, 00:15:25.716 "flush": false, 00:15:25.716 "reset": true, 00:15:25.716 
"compare": false, 00:15:25.716 "compare_and_write": false, 00:15:25.716 "abort": false, 00:15:25.716 "nvme_admin": false, 00:15:25.716 "nvme_io": false 00:15:25.716 }, 00:15:25.716 "memory_domains": [ 00:15:25.716 { 00:15:25.716 "dma_device_id": "system", 00:15:25.716 "dma_device_type": 1 00:15:25.716 }, 00:15:25.716 { 00:15:25.716 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.716 "dma_device_type": 2 00:15:25.716 }, 00:15:25.717 { 00:15:25.717 "dma_device_id": "system", 00:15:25.717 "dma_device_type": 1 00:15:25.717 }, 00:15:25.717 { 00:15:25.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.717 "dma_device_type": 2 00:15:25.717 }, 00:15:25.717 { 00:15:25.717 "dma_device_id": "system", 00:15:25.717 "dma_device_type": 1 00:15:25.717 }, 00:15:25.717 { 00:15:25.717 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:25.717 "dma_device_type": 2 00:15:25.717 } 00:15:25.717 ], 00:15:25.717 "driver_specific": { 00:15:25.717 "raid": { 00:15:25.717 "uuid": "81db7daa-be11-4eac-9c7c-927caf907a04", 00:15:25.717 "strip_size_kb": 0, 00:15:25.717 "state": "online", 00:15:25.717 "raid_level": "raid1", 00:15:25.717 "superblock": true, 00:15:25.717 "num_base_bdevs": 3, 00:15:25.717 "num_base_bdevs_discovered": 3, 00:15:25.717 "num_base_bdevs_operational": 3, 00:15:25.717 "base_bdevs_list": [ 00:15:25.717 { 00:15:25.717 "name": "BaseBdev1", 00:15:25.717 "uuid": "3e52470a-caec-470d-942b-671541ac43a5", 00:15:25.717 "is_configured": true, 00:15:25.717 "data_offset": 2048, 00:15:25.717 "data_size": 63488 00:15:25.717 }, 00:15:25.717 { 00:15:25.717 "name": "BaseBdev2", 00:15:25.717 "uuid": "c2f813e9-7dc3-4fd9-9652-db918e1889c5", 00:15:25.717 "is_configured": true, 00:15:25.717 "data_offset": 2048, 00:15:25.717 "data_size": 63488 00:15:25.717 }, 00:15:25.717 { 00:15:25.717 "name": "BaseBdev3", 00:15:25.717 "uuid": "25851d14-c13f-40af-b98f-8d02610c7bdb", 00:15:25.717 "is_configured": true, 00:15:25.717 "data_offset": 2048, 00:15:25.717 "data_size": 63488 00:15:25.717 } 00:15:25.717 ] 
00:15:25.717 } 00:15:25.717 } 00:15:25.717 }' 00:15:25.717 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:25.975 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:15:25.975 BaseBdev2 00:15:25.975 BaseBdev3' 00:15:25.975 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:25.975 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:15:25.975 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:26.234 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:26.234 "name": "BaseBdev1", 00:15:26.234 "aliases": [ 00:15:26.234 "3e52470a-caec-470d-942b-671541ac43a5" 00:15:26.234 ], 00:15:26.234 "product_name": "Malloc disk", 00:15:26.234 "block_size": 512, 00:15:26.234 "num_blocks": 65536, 00:15:26.234 "uuid": "3e52470a-caec-470d-942b-671541ac43a5", 00:15:26.234 "assigned_rate_limits": { 00:15:26.234 "rw_ios_per_sec": 0, 00:15:26.234 "rw_mbytes_per_sec": 0, 00:15:26.234 "r_mbytes_per_sec": 0, 00:15:26.234 "w_mbytes_per_sec": 0 00:15:26.234 }, 00:15:26.234 "claimed": true, 00:15:26.234 "claim_type": "exclusive_write", 00:15:26.234 "zoned": false, 00:15:26.234 "supported_io_types": { 00:15:26.234 "read": true, 00:15:26.234 "write": true, 00:15:26.234 "unmap": true, 00:15:26.234 "write_zeroes": true, 00:15:26.234 "flush": true, 00:15:26.234 "reset": true, 00:15:26.234 "compare": false, 00:15:26.234 "compare_and_write": false, 00:15:26.234 "abort": true, 00:15:26.234 "nvme_admin": false, 00:15:26.234 "nvme_io": false 00:15:26.234 }, 00:15:26.234 "memory_domains": [ 00:15:26.234 { 00:15:26.234 "dma_device_id": "system", 
00:15:26.234 "dma_device_type": 1 00:15:26.234 }, 00:15:26.234 { 00:15:26.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.234 "dma_device_type": 2 00:15:26.234 } 00:15:26.234 ], 00:15:26.234 "driver_specific": {} 00:15:26.234 }' 00:15:26.234 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.234 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.234 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:26.234 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.234 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.234 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:26.234 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.234 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:26.493 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:26.493 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.493 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:26.493 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:26.493 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:26.493 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:26.493 15:54:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:26.751 15:54:32 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:26.751 "name": "BaseBdev2", 00:15:26.751 "aliases": [ 00:15:26.751 "c2f813e9-7dc3-4fd9-9652-db918e1889c5" 00:15:26.751 ], 00:15:26.751 "product_name": "Malloc disk", 00:15:26.751 "block_size": 512, 00:15:26.751 "num_blocks": 65536, 00:15:26.751 "uuid": "c2f813e9-7dc3-4fd9-9652-db918e1889c5", 00:15:26.751 "assigned_rate_limits": { 00:15:26.751 "rw_ios_per_sec": 0, 00:15:26.751 "rw_mbytes_per_sec": 0, 00:15:26.751 "r_mbytes_per_sec": 0, 00:15:26.751 "w_mbytes_per_sec": 0 00:15:26.751 }, 00:15:26.751 "claimed": true, 00:15:26.751 "claim_type": "exclusive_write", 00:15:26.751 "zoned": false, 00:15:26.751 "supported_io_types": { 00:15:26.751 "read": true, 00:15:26.751 "write": true, 00:15:26.751 "unmap": true, 00:15:26.751 "write_zeroes": true, 00:15:26.751 "flush": true, 00:15:26.751 "reset": true, 00:15:26.751 "compare": false, 00:15:26.751 "compare_and_write": false, 00:15:26.751 "abort": true, 00:15:26.751 "nvme_admin": false, 00:15:26.751 "nvme_io": false 00:15:26.751 }, 00:15:26.751 "memory_domains": [ 00:15:26.751 { 00:15:26.751 "dma_device_id": "system", 00:15:26.751 "dma_device_type": 1 00:15:26.751 }, 00:15:26.752 { 00:15:26.752 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:26.752 "dma_device_type": 2 00:15:26.752 } 00:15:26.752 ], 00:15:26.752 "driver_specific": {} 00:15:26.752 }' 00:15:26.752 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.752 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:26.752 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:26.752 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.752 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:26.752 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:15:27.010 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.010 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.010 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:27.010 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.010 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.010 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:27.010 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:27.010 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:27.010 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:27.269 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:27.269 "name": "BaseBdev3", 00:15:27.269 "aliases": [ 00:15:27.269 "25851d14-c13f-40af-b98f-8d02610c7bdb" 00:15:27.269 ], 00:15:27.269 "product_name": "Malloc disk", 00:15:27.269 "block_size": 512, 00:15:27.269 "num_blocks": 65536, 00:15:27.269 "uuid": "25851d14-c13f-40af-b98f-8d02610c7bdb", 00:15:27.269 "assigned_rate_limits": { 00:15:27.269 "rw_ios_per_sec": 0, 00:15:27.269 "rw_mbytes_per_sec": 0, 00:15:27.269 "r_mbytes_per_sec": 0, 00:15:27.269 "w_mbytes_per_sec": 0 00:15:27.269 }, 00:15:27.269 "claimed": true, 00:15:27.269 "claim_type": "exclusive_write", 00:15:27.269 "zoned": false, 00:15:27.269 "supported_io_types": { 00:15:27.269 "read": true, 00:15:27.269 "write": true, 00:15:27.269 "unmap": true, 00:15:27.269 "write_zeroes": true, 00:15:27.269 "flush": true, 00:15:27.269 "reset": true, 00:15:27.269 
"compare": false, 00:15:27.269 "compare_and_write": false, 00:15:27.269 "abort": true, 00:15:27.269 "nvme_admin": false, 00:15:27.269 "nvme_io": false 00:15:27.269 }, 00:15:27.269 "memory_domains": [ 00:15:27.269 { 00:15:27.269 "dma_device_id": "system", 00:15:27.269 "dma_device_type": 1 00:15:27.269 }, 00:15:27.269 { 00:15:27.269 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:27.269 "dma_device_type": 2 00:15:27.269 } 00:15:27.269 ], 00:15:27.269 "driver_specific": {} 00:15:27.269 }' 00:15:27.269 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:27.269 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:27.269 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:27.269 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.527 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:27.527 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:27.527 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.527 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:27.527 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:27.527 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.527 15:54:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:27.527 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:27.527 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:27.786 [2024-06-10 
15:54:33.252249] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:27.786 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:15:27.786 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:15:27.786 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:27.786 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:15:27.786 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:15:27.786 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:15:27.786 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:27.786 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:27.786 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:27.786 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:27.786 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:27.786 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:27.786 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:27.786 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:27.786 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:27.786 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:15:27.786 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:28.045 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:28.045 "name": "Existed_Raid", 00:15:28.045 "uuid": "81db7daa-be11-4eac-9c7c-927caf907a04", 00:15:28.045 "strip_size_kb": 0, 00:15:28.045 "state": "online", 00:15:28.045 "raid_level": "raid1", 00:15:28.045 "superblock": true, 00:15:28.045 "num_base_bdevs": 3, 00:15:28.045 "num_base_bdevs_discovered": 2, 00:15:28.045 "num_base_bdevs_operational": 2, 00:15:28.045 "base_bdevs_list": [ 00:15:28.045 { 00:15:28.045 "name": null, 00:15:28.045 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:28.045 "is_configured": false, 00:15:28.045 "data_offset": 2048, 00:15:28.045 "data_size": 63488 00:15:28.045 }, 00:15:28.045 { 00:15:28.045 "name": "BaseBdev2", 00:15:28.045 "uuid": "c2f813e9-7dc3-4fd9-9652-db918e1889c5", 00:15:28.045 "is_configured": true, 00:15:28.045 "data_offset": 2048, 00:15:28.045 "data_size": 63488 00:15:28.045 }, 00:15:28.045 { 00:15:28.045 "name": "BaseBdev3", 00:15:28.045 "uuid": "25851d14-c13f-40af-b98f-8d02610c7bdb", 00:15:28.045 "is_configured": true, 00:15:28.045 "data_offset": 2048, 00:15:28.045 "data_size": 63488 00:15:28.045 } 00:15:28.045 ] 00:15:28.045 }' 00:15:28.045 15:54:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:28.045 15:54:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:28.982 15:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:15:28.982 15:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:28.982 15:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:28.982 15:54:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:28.982 15:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:28.982 15:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:28.982 15:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:15:29.241 [2024-06-10 15:54:34.645216] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:29.241 15:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:29.241 15:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:29.241 15:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.241 15:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:15:29.500 15:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:15:29.500 15:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:15:29.500 15:54:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:15:29.759 [2024-06-10 15:54:35.065083] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:29.759 [2024-06-10 15:54:35.065165] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:29.759 [2024-06-10 15:54:35.076005] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:29.759 
[2024-06-10 15:54:35.076038] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:29.759 [2024-06-10 15:54:35.076047] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8b48c0 name Existed_Raid, state offline 00:15:29.759 15:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:15:29.759 15:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:15:29.759 15:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:29.759 15:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:15:30.020 15:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:15:30.020 15:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:15:30.020 15:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 3 -gt 2 ']' 00:15:30.020 15:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:15:30.020 15:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:30.020 15:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:15:30.279 BaseBdev2 00:15:30.279 15:54:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:15:30.279 15:54:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:15:30.279 15:54:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:30.279 15:54:35 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:30.279 15:54:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:30.279 15:54:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:30.279 15:54:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:30.538 15:54:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:15:30.797 [ 00:15:30.797 { 00:15:30.797 "name": "BaseBdev2", 00:15:30.797 "aliases": [ 00:15:30.797 "ccfdfed5-ce0b-4348-b40f-f0035cd7a93a" 00:15:30.797 ], 00:15:30.797 "product_name": "Malloc disk", 00:15:30.797 "block_size": 512, 00:15:30.797 "num_blocks": 65536, 00:15:30.798 "uuid": "ccfdfed5-ce0b-4348-b40f-f0035cd7a93a", 00:15:30.798 "assigned_rate_limits": { 00:15:30.798 "rw_ios_per_sec": 0, 00:15:30.798 "rw_mbytes_per_sec": 0, 00:15:30.798 "r_mbytes_per_sec": 0, 00:15:30.798 "w_mbytes_per_sec": 0 00:15:30.798 }, 00:15:30.798 "claimed": false, 00:15:30.798 "zoned": false, 00:15:30.798 "supported_io_types": { 00:15:30.798 "read": true, 00:15:30.798 "write": true, 00:15:30.798 "unmap": true, 00:15:30.798 "write_zeroes": true, 00:15:30.798 "flush": true, 00:15:30.798 "reset": true, 00:15:30.798 "compare": false, 00:15:30.798 "compare_and_write": false, 00:15:30.798 "abort": true, 00:15:30.798 "nvme_admin": false, 00:15:30.798 "nvme_io": false 00:15:30.798 }, 00:15:30.798 "memory_domains": [ 00:15:30.798 { 00:15:30.798 "dma_device_id": "system", 00:15:30.798 "dma_device_type": 1 00:15:30.798 }, 00:15:30.798 { 00:15:30.798 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:30.798 "dma_device_type": 2 00:15:30.798 } 00:15:30.798 ], 
00:15:30.798 "driver_specific": {} 00:15:30.798 } 00:15:30.798 ] 00:15:30.798 15:54:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:30.798 15:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:30.798 15:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:30.798 15:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:15:31.057 BaseBdev3 00:15:31.057 15:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:15:31.057 15:54:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:15:31.057 15:54:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:31.057 15:54:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:31.057 15:54:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:31.057 15:54:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:31.057 15:54:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:31.315 15:54:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:15:31.574 [ 00:15:31.574 { 00:15:31.574 "name": "BaseBdev3", 00:15:31.574 "aliases": [ 00:15:31.574 "cbf03ce5-91a9-44c0-9a5a-daff1c850d02" 00:15:31.574 ], 00:15:31.574 "product_name": "Malloc disk", 00:15:31.574 "block_size": 512, 00:15:31.574 
"num_blocks": 65536, 00:15:31.574 "uuid": "cbf03ce5-91a9-44c0-9a5a-daff1c850d02", 00:15:31.574 "assigned_rate_limits": { 00:15:31.574 "rw_ios_per_sec": 0, 00:15:31.574 "rw_mbytes_per_sec": 0, 00:15:31.574 "r_mbytes_per_sec": 0, 00:15:31.574 "w_mbytes_per_sec": 0 00:15:31.574 }, 00:15:31.574 "claimed": false, 00:15:31.574 "zoned": false, 00:15:31.574 "supported_io_types": { 00:15:31.574 "read": true, 00:15:31.574 "write": true, 00:15:31.574 "unmap": true, 00:15:31.574 "write_zeroes": true, 00:15:31.574 "flush": true, 00:15:31.574 "reset": true, 00:15:31.574 "compare": false, 00:15:31.574 "compare_and_write": false, 00:15:31.574 "abort": true, 00:15:31.574 "nvme_admin": false, 00:15:31.574 "nvme_io": false 00:15:31.574 }, 00:15:31.574 "memory_domains": [ 00:15:31.574 { 00:15:31.574 "dma_device_id": "system", 00:15:31.574 "dma_device_type": 1 00:15:31.574 }, 00:15:31.574 { 00:15:31.574 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:31.574 "dma_device_type": 2 00:15:31.574 } 00:15:31.574 ], 00:15:31.574 "driver_specific": {} 00:15:31.574 } 00:15:31.574 ] 00:15:31.574 15:54:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:31.574 15:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:15:31.574 15:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:15:31.574 15:54:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n Existed_Raid 00:15:31.833 [2024-06-10 15:54:37.089796] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:15:31.833 [2024-06-10 15:54:37.089832] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:15:31.833 [2024-06-10 15:54:37.089849] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:31.833 [2024-06-10 15:54:37.091239] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:31.833 15:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:31.833 15:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:31.833 15:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:31.833 15:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:31.833 15:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:31.833 15:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:31.833 15:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:31.833 15:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:31.833 15:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:31.833 15:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:31.833 15:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:31.833 15:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:31.833 15:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:31.833 "name": "Existed_Raid", 00:15:31.833 "uuid": "2e6c39c7-24ad-4dbb-a378-4f8db84e1400", 00:15:31.833 "strip_size_kb": 0, 00:15:31.833 "state": 
"configuring", 00:15:31.833 "raid_level": "raid1", 00:15:31.833 "superblock": true, 00:15:31.833 "num_base_bdevs": 3, 00:15:31.833 "num_base_bdevs_discovered": 2, 00:15:31.833 "num_base_bdevs_operational": 3, 00:15:31.833 "base_bdevs_list": [ 00:15:31.833 { 00:15:31.833 "name": "BaseBdev1", 00:15:31.833 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:31.833 "is_configured": false, 00:15:31.833 "data_offset": 0, 00:15:31.833 "data_size": 0 00:15:31.833 }, 00:15:31.833 { 00:15:31.833 "name": "BaseBdev2", 00:15:31.833 "uuid": "ccfdfed5-ce0b-4348-b40f-f0035cd7a93a", 00:15:31.833 "is_configured": true, 00:15:31.833 "data_offset": 2048, 00:15:31.833 "data_size": 63488 00:15:31.833 }, 00:15:31.833 { 00:15:31.833 "name": "BaseBdev3", 00:15:31.833 "uuid": "cbf03ce5-91a9-44c0-9a5a-daff1c850d02", 00:15:31.833 "is_configured": true, 00:15:31.833 "data_offset": 2048, 00:15:31.833 "data_size": 63488 00:15:31.833 } 00:15:31.833 ] 00:15:31.833 }' 00:15:31.833 15:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:31.833 15:54:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:32.402 15:54:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:15:32.661 [2024-06-10 15:54:38.020257] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:15:32.661 15:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:32.661 15:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:32.661 15:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:32.661 15:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 
00:15:32.661 15:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:32.661 15:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:32.661 15:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:32.661 15:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:32.661 15:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:32.661 15:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:32.661 15:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:32.661 15:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:32.921 15:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:32.921 "name": "Existed_Raid", 00:15:32.921 "uuid": "2e6c39c7-24ad-4dbb-a378-4f8db84e1400", 00:15:32.921 "strip_size_kb": 0, 00:15:32.921 "state": "configuring", 00:15:32.921 "raid_level": "raid1", 00:15:32.921 "superblock": true, 00:15:32.921 "num_base_bdevs": 3, 00:15:32.921 "num_base_bdevs_discovered": 1, 00:15:32.921 "num_base_bdevs_operational": 3, 00:15:32.921 "base_bdevs_list": [ 00:15:32.921 { 00:15:32.921 "name": "BaseBdev1", 00:15:32.921 "uuid": "00000000-0000-0000-0000-000000000000", 00:15:32.921 "is_configured": false, 00:15:32.921 "data_offset": 0, 00:15:32.921 "data_size": 0 00:15:32.921 }, 00:15:32.921 { 00:15:32.921 "name": null, 00:15:32.921 "uuid": "ccfdfed5-ce0b-4348-b40f-f0035cd7a93a", 00:15:32.921 "is_configured": false, 00:15:32.921 "data_offset": 2048, 00:15:32.921 "data_size": 63488 00:15:32.921 }, 00:15:32.921 { 00:15:32.921 
"name": "BaseBdev3", 00:15:32.921 "uuid": "cbf03ce5-91a9-44c0-9a5a-daff1c850d02", 00:15:32.921 "is_configured": true, 00:15:32.921 "data_offset": 2048, 00:15:32.921 "data_size": 63488 00:15:32.921 } 00:15:32.921 ] 00:15:32.921 }' 00:15:32.921 15:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:32.921 15:54:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:33.489 15:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:33.489 15:54:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:33.749 15:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:15:33.749 15:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:15:34.008 [2024-06-10 15:54:39.427296] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:15:34.008 BaseBdev1 00:15:34.008 15:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:15:34.008 15:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:15:34.008 15:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:34.008 15:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:34.008 15:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:34.008 15:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:34.008 15:54:39 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:34.266 15:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:15:34.525 [ 00:15:34.525 { 00:15:34.525 "name": "BaseBdev1", 00:15:34.525 "aliases": [ 00:15:34.525 "2b1a4f2d-b86f-43e6-8d90-10e31d376787" 00:15:34.525 ], 00:15:34.525 "product_name": "Malloc disk", 00:15:34.525 "block_size": 512, 00:15:34.525 "num_blocks": 65536, 00:15:34.525 "uuid": "2b1a4f2d-b86f-43e6-8d90-10e31d376787", 00:15:34.525 "assigned_rate_limits": { 00:15:34.525 "rw_ios_per_sec": 0, 00:15:34.525 "rw_mbytes_per_sec": 0, 00:15:34.525 "r_mbytes_per_sec": 0, 00:15:34.525 "w_mbytes_per_sec": 0 00:15:34.525 }, 00:15:34.525 "claimed": true, 00:15:34.525 "claim_type": "exclusive_write", 00:15:34.525 "zoned": false, 00:15:34.525 "supported_io_types": { 00:15:34.525 "read": true, 00:15:34.525 "write": true, 00:15:34.525 "unmap": true, 00:15:34.525 "write_zeroes": true, 00:15:34.525 "flush": true, 00:15:34.525 "reset": true, 00:15:34.525 "compare": false, 00:15:34.525 "compare_and_write": false, 00:15:34.525 "abort": true, 00:15:34.525 "nvme_admin": false, 00:15:34.525 "nvme_io": false 00:15:34.525 }, 00:15:34.525 "memory_domains": [ 00:15:34.525 { 00:15:34.525 "dma_device_id": "system", 00:15:34.525 "dma_device_type": 1 00:15:34.525 }, 00:15:34.525 { 00:15:34.525 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:34.525 "dma_device_type": 2 00:15:34.525 } 00:15:34.525 ], 00:15:34.525 "driver_specific": {} 00:15:34.525 } 00:15:34.525 ] 00:15:34.525 15:54:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:34.525 15:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state 
Existed_Raid configuring raid1 0 3 00:15:34.525 15:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:34.525 15:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:34.525 15:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:34.525 15:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:34.525 15:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:34.525 15:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:34.525 15:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:34.525 15:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:34.525 15:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:34.525 15:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:34.525 15:54:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:34.784 15:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:34.784 "name": "Existed_Raid", 00:15:34.784 "uuid": "2e6c39c7-24ad-4dbb-a378-4f8db84e1400", 00:15:34.784 "strip_size_kb": 0, 00:15:34.784 "state": "configuring", 00:15:34.784 "raid_level": "raid1", 00:15:34.784 "superblock": true, 00:15:34.784 "num_base_bdevs": 3, 00:15:34.784 "num_base_bdevs_discovered": 2, 00:15:34.784 "num_base_bdevs_operational": 3, 00:15:34.784 "base_bdevs_list": [ 00:15:34.784 { 00:15:34.784 "name": "BaseBdev1", 00:15:34.784 "uuid": 
"2b1a4f2d-b86f-43e6-8d90-10e31d376787", 00:15:34.784 "is_configured": true, 00:15:34.784 "data_offset": 2048, 00:15:34.784 "data_size": 63488 00:15:34.784 }, 00:15:34.784 { 00:15:34.784 "name": null, 00:15:34.784 "uuid": "ccfdfed5-ce0b-4348-b40f-f0035cd7a93a", 00:15:34.784 "is_configured": false, 00:15:34.784 "data_offset": 2048, 00:15:34.784 "data_size": 63488 00:15:34.784 }, 00:15:34.784 { 00:15:34.784 "name": "BaseBdev3", 00:15:34.784 "uuid": "cbf03ce5-91a9-44c0-9a5a-daff1c850d02", 00:15:34.784 "is_configured": true, 00:15:34.784 "data_offset": 2048, 00:15:34.784 "data_size": 63488 00:15:34.784 } 00:15:34.784 ] 00:15:34.784 }' 00:15:34.784 15:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:34.784 15:54:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:35.721 15:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:35.721 15:54:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:35.721 15:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:15:35.721 15:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:15:36.014 [2024-06-10 15:54:41.356508] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:15:36.014 15:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:36.014 15:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:36.014 15:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local 
expected_state=configuring 00:15:36.014 15:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:36.014 15:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:36.014 15:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:36.014 15:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:36.014 15:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:36.014 15:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:36.014 15:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:36.014 15:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.014 15:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:36.273 15:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:36.273 "name": "Existed_Raid", 00:15:36.273 "uuid": "2e6c39c7-24ad-4dbb-a378-4f8db84e1400", 00:15:36.273 "strip_size_kb": 0, 00:15:36.273 "state": "configuring", 00:15:36.273 "raid_level": "raid1", 00:15:36.273 "superblock": true, 00:15:36.273 "num_base_bdevs": 3, 00:15:36.273 "num_base_bdevs_discovered": 1, 00:15:36.273 "num_base_bdevs_operational": 3, 00:15:36.273 "base_bdevs_list": [ 00:15:36.273 { 00:15:36.273 "name": "BaseBdev1", 00:15:36.273 "uuid": "2b1a4f2d-b86f-43e6-8d90-10e31d376787", 00:15:36.273 "is_configured": true, 00:15:36.273 "data_offset": 2048, 00:15:36.273 "data_size": 63488 00:15:36.273 }, 00:15:36.273 { 00:15:36.273 "name": null, 00:15:36.273 "uuid": "ccfdfed5-ce0b-4348-b40f-f0035cd7a93a", 
00:15:36.273 "is_configured": false, 00:15:36.273 "data_offset": 2048, 00:15:36.273 "data_size": 63488 00:15:36.273 }, 00:15:36.273 { 00:15:36.273 "name": null, 00:15:36.273 "uuid": "cbf03ce5-91a9-44c0-9a5a-daff1c850d02", 00:15:36.273 "is_configured": false, 00:15:36.273 "data_offset": 2048, 00:15:36.273 "data_size": 63488 00:15:36.273 } 00:15:36.273 ] 00:15:36.273 }' 00:15:36.273 15:54:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:36.273 15:54:41 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:36.842 15:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:36.842 15:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:37.101 15:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:15:37.101 15:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:15:37.360 [2024-06-10 15:54:42.712180] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:15:37.360 15:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:37.360 15:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:37.360 15:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:37.360 15:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:37.360 15:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:15:37.360 15:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:37.360 15:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:37.360 15:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:37.360 15:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:37.360 15:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:37.360 15:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:37.360 15:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:37.618 15:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:37.618 "name": "Existed_Raid", 00:15:37.618 "uuid": "2e6c39c7-24ad-4dbb-a378-4f8db84e1400", 00:15:37.618 "strip_size_kb": 0, 00:15:37.618 "state": "configuring", 00:15:37.618 "raid_level": "raid1", 00:15:37.618 "superblock": true, 00:15:37.618 "num_base_bdevs": 3, 00:15:37.618 "num_base_bdevs_discovered": 2, 00:15:37.618 "num_base_bdevs_operational": 3, 00:15:37.618 "base_bdevs_list": [ 00:15:37.618 { 00:15:37.618 "name": "BaseBdev1", 00:15:37.618 "uuid": "2b1a4f2d-b86f-43e6-8d90-10e31d376787", 00:15:37.618 "is_configured": true, 00:15:37.618 "data_offset": 2048, 00:15:37.618 "data_size": 63488 00:15:37.618 }, 00:15:37.618 { 00:15:37.618 "name": null, 00:15:37.618 "uuid": "ccfdfed5-ce0b-4348-b40f-f0035cd7a93a", 00:15:37.618 "is_configured": false, 00:15:37.618 "data_offset": 2048, 00:15:37.618 "data_size": 63488 00:15:37.618 }, 00:15:37.618 { 00:15:37.618 "name": "BaseBdev3", 00:15:37.618 "uuid": "cbf03ce5-91a9-44c0-9a5a-daff1c850d02", 00:15:37.618 
"is_configured": true, 00:15:37.618 "data_offset": 2048, 00:15:37.618 "data_size": 63488 00:15:37.618 } 00:15:37.618 ] 00:15:37.618 }' 00:15:37.618 15:54:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:37.618 15:54:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:38.186 15:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.186 15:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:15:38.445 15:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:15:38.445 15:54:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:15:38.704 [2024-06-10 15:54:44.099916] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:15:38.704 15:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:38.704 15:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:38.704 15:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:38.704 15:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:38.704 15:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:38.704 15:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:38.704 15:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:38.704 15:54:44 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:38.704 15:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:38.704 15:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:38.704 15:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:38.704 15:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:38.963 15:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:38.963 "name": "Existed_Raid", 00:15:38.963 "uuid": "2e6c39c7-24ad-4dbb-a378-4f8db84e1400", 00:15:38.963 "strip_size_kb": 0, 00:15:38.963 "state": "configuring", 00:15:38.963 "raid_level": "raid1", 00:15:38.963 "superblock": true, 00:15:38.963 "num_base_bdevs": 3, 00:15:38.963 "num_base_bdevs_discovered": 1, 00:15:38.963 "num_base_bdevs_operational": 3, 00:15:38.963 "base_bdevs_list": [ 00:15:38.963 { 00:15:38.963 "name": null, 00:15:38.963 "uuid": "2b1a4f2d-b86f-43e6-8d90-10e31d376787", 00:15:38.963 "is_configured": false, 00:15:38.963 "data_offset": 2048, 00:15:38.963 "data_size": 63488 00:15:38.963 }, 00:15:38.963 { 00:15:38.963 "name": null, 00:15:38.963 "uuid": "ccfdfed5-ce0b-4348-b40f-f0035cd7a93a", 00:15:38.963 "is_configured": false, 00:15:38.963 "data_offset": 2048, 00:15:38.963 "data_size": 63488 00:15:38.963 }, 00:15:38.963 { 00:15:38.963 "name": "BaseBdev3", 00:15:38.963 "uuid": "cbf03ce5-91a9-44c0-9a5a-daff1c850d02", 00:15:38.963 "is_configured": true, 00:15:38.963 "data_offset": 2048, 00:15:38.963 "data_size": 63488 00:15:38.963 } 00:15:38.963 ] 00:15:38.963 }' 00:15:38.963 15:54:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:38.963 15:54:44 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:39.530 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:39.530 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:15:39.789 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:15:39.789 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:15:40.048 [2024-06-10 15:54:45.502117] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:15:40.048 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 3 00:15:40.048 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:40.048 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:40.048 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:40.048 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:40.048 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:40.048 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:40.048 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:40.048 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:40.048 15:54:45 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:40.048 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:40.048 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:40.306 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:40.306 "name": "Existed_Raid", 00:15:40.306 "uuid": "2e6c39c7-24ad-4dbb-a378-4f8db84e1400", 00:15:40.306 "strip_size_kb": 0, 00:15:40.306 "state": "configuring", 00:15:40.306 "raid_level": "raid1", 00:15:40.306 "superblock": true, 00:15:40.306 "num_base_bdevs": 3, 00:15:40.306 "num_base_bdevs_discovered": 2, 00:15:40.306 "num_base_bdevs_operational": 3, 00:15:40.306 "base_bdevs_list": [ 00:15:40.306 { 00:15:40.306 "name": null, 00:15:40.306 "uuid": "2b1a4f2d-b86f-43e6-8d90-10e31d376787", 00:15:40.306 "is_configured": false, 00:15:40.306 "data_offset": 2048, 00:15:40.306 "data_size": 63488 00:15:40.306 }, 00:15:40.306 { 00:15:40.306 "name": "BaseBdev2", 00:15:40.306 "uuid": "ccfdfed5-ce0b-4348-b40f-f0035cd7a93a", 00:15:40.306 "is_configured": true, 00:15:40.306 "data_offset": 2048, 00:15:40.306 "data_size": 63488 00:15:40.306 }, 00:15:40.306 { 00:15:40.306 "name": "BaseBdev3", 00:15:40.306 "uuid": "cbf03ce5-91a9-44c0-9a5a-daff1c850d02", 00:15:40.306 "is_configured": true, 00:15:40.306 "data_offset": 2048, 00:15:40.306 "data_size": 63488 00:15:40.306 } 00:15:40.306 ] 00:15:40.306 }' 00:15:40.306 15:54:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:40.306 15:54:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:41.243 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.243 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:15:41.243 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:15:41.243 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:41.243 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:15:41.502 15:54:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 2b1a4f2d-b86f-43e6-8d90-10e31d376787 00:15:41.761 [2024-06-10 15:54:47.065482] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:15:41.761 [2024-06-10 15:54:47.065628] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x8b2e40 00:15:41.761 [2024-06-10 15:54:47.065639] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:41.761 [2024-06-10 15:54:47.065818] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x580d30 00:15:41.761 [2024-06-10 15:54:47.065941] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8b2e40 00:15:41.761 [2024-06-10 15:54:47.065949] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x8b2e40 00:15:41.761 [2024-06-10 15:54:47.066054] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:41.761 NewBaseBdev 00:15:41.761 15:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:15:41.761 15:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 
-- # local bdev_name=NewBaseBdev 00:15:41.761 15:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:15:41.761 15:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:15:41.761 15:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:15:41.761 15:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:15:41.761 15:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:15:41.761 15:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:15:42.020 [ 00:15:42.020 { 00:15:42.020 "name": "NewBaseBdev", 00:15:42.020 "aliases": [ 00:15:42.020 "2b1a4f2d-b86f-43e6-8d90-10e31d376787" 00:15:42.020 ], 00:15:42.020 "product_name": "Malloc disk", 00:15:42.020 "block_size": 512, 00:15:42.020 "num_blocks": 65536, 00:15:42.020 "uuid": "2b1a4f2d-b86f-43e6-8d90-10e31d376787", 00:15:42.020 "assigned_rate_limits": { 00:15:42.020 "rw_ios_per_sec": 0, 00:15:42.020 "rw_mbytes_per_sec": 0, 00:15:42.020 "r_mbytes_per_sec": 0, 00:15:42.020 "w_mbytes_per_sec": 0 00:15:42.020 }, 00:15:42.020 "claimed": true, 00:15:42.020 "claim_type": "exclusive_write", 00:15:42.020 "zoned": false, 00:15:42.020 "supported_io_types": { 00:15:42.020 "read": true, 00:15:42.020 "write": true, 00:15:42.020 "unmap": true, 00:15:42.020 "write_zeroes": true, 00:15:42.020 "flush": true, 00:15:42.020 "reset": true, 00:15:42.020 "compare": false, 00:15:42.020 "compare_and_write": false, 00:15:42.020 "abort": true, 00:15:42.020 "nvme_admin": false, 00:15:42.020 "nvme_io": false 00:15:42.020 }, 00:15:42.020 "memory_domains": [ 00:15:42.020 { 
00:15:42.020 "dma_device_id": "system", 00:15:42.020 "dma_device_type": 1 00:15:42.020 }, 00:15:42.020 { 00:15:42.020 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:42.020 "dma_device_type": 2 00:15:42.020 } 00:15:42.020 ], 00:15:42.020 "driver_specific": {} 00:15:42.020 } 00:15:42.020 ] 00:15:42.020 15:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:15:42.020 15:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:15:42.020 15:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:15:42.020 15:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:42.020 15:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:42.020 15:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:42.020 15:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:42.020 15:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:42.020 15:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:42.020 15:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:42.020 15:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:42.020 15:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:42.020 15:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:15:42.279 15:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 
-- # raid_bdev_info='{ 00:15:42.279 "name": "Existed_Raid", 00:15:42.279 "uuid": "2e6c39c7-24ad-4dbb-a378-4f8db84e1400", 00:15:42.279 "strip_size_kb": 0, 00:15:42.279 "state": "online", 00:15:42.279 "raid_level": "raid1", 00:15:42.279 "superblock": true, 00:15:42.279 "num_base_bdevs": 3, 00:15:42.279 "num_base_bdevs_discovered": 3, 00:15:42.279 "num_base_bdevs_operational": 3, 00:15:42.279 "base_bdevs_list": [ 00:15:42.279 { 00:15:42.279 "name": "NewBaseBdev", 00:15:42.279 "uuid": "2b1a4f2d-b86f-43e6-8d90-10e31d376787", 00:15:42.279 "is_configured": true, 00:15:42.279 "data_offset": 2048, 00:15:42.279 "data_size": 63488 00:15:42.279 }, 00:15:42.279 { 00:15:42.279 "name": "BaseBdev2", 00:15:42.279 "uuid": "ccfdfed5-ce0b-4348-b40f-f0035cd7a93a", 00:15:42.279 "is_configured": true, 00:15:42.279 "data_offset": 2048, 00:15:42.279 "data_size": 63488 00:15:42.279 }, 00:15:42.279 { 00:15:42.279 "name": "BaseBdev3", 00:15:42.279 "uuid": "cbf03ce5-91a9-44c0-9a5a-daff1c850d02", 00:15:42.279 "is_configured": true, 00:15:42.279 "data_offset": 2048, 00:15:42.279 "data_size": 63488 00:15:42.279 } 00:15:42.279 ] 00:15:42.279 }' 00:15:42.279 15:54:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:42.280 15:54:47 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:42.848 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:15:42.848 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:15:42.848 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:42.848 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:42.848 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:42.848 15:54:48 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@198 -- # local name 00:15:42.848 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:15:42.848 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:43.107 [2024-06-10 15:54:48.493580] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:43.107 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:43.107 "name": "Existed_Raid", 00:15:43.107 "aliases": [ 00:15:43.107 "2e6c39c7-24ad-4dbb-a378-4f8db84e1400" 00:15:43.107 ], 00:15:43.107 "product_name": "Raid Volume", 00:15:43.107 "block_size": 512, 00:15:43.107 "num_blocks": 63488, 00:15:43.107 "uuid": "2e6c39c7-24ad-4dbb-a378-4f8db84e1400", 00:15:43.107 "assigned_rate_limits": { 00:15:43.107 "rw_ios_per_sec": 0, 00:15:43.107 "rw_mbytes_per_sec": 0, 00:15:43.107 "r_mbytes_per_sec": 0, 00:15:43.107 "w_mbytes_per_sec": 0 00:15:43.107 }, 00:15:43.107 "claimed": false, 00:15:43.107 "zoned": false, 00:15:43.107 "supported_io_types": { 00:15:43.107 "read": true, 00:15:43.107 "write": true, 00:15:43.107 "unmap": false, 00:15:43.107 "write_zeroes": true, 00:15:43.107 "flush": false, 00:15:43.107 "reset": true, 00:15:43.107 "compare": false, 00:15:43.107 "compare_and_write": false, 00:15:43.107 "abort": false, 00:15:43.107 "nvme_admin": false, 00:15:43.107 "nvme_io": false 00:15:43.107 }, 00:15:43.107 "memory_domains": [ 00:15:43.107 { 00:15:43.107 "dma_device_id": "system", 00:15:43.107 "dma_device_type": 1 00:15:43.107 }, 00:15:43.107 { 00:15:43.107 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.107 "dma_device_type": 2 00:15:43.107 }, 00:15:43.107 { 00:15:43.107 "dma_device_id": "system", 00:15:43.107 "dma_device_type": 1 00:15:43.107 }, 00:15:43.107 { 00:15:43.107 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.107 "dma_device_type": 2 
00:15:43.107 }, 00:15:43.107 { 00:15:43.107 "dma_device_id": "system", 00:15:43.107 "dma_device_type": 1 00:15:43.107 }, 00:15:43.107 { 00:15:43.107 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.107 "dma_device_type": 2 00:15:43.107 } 00:15:43.107 ], 00:15:43.107 "driver_specific": { 00:15:43.107 "raid": { 00:15:43.107 "uuid": "2e6c39c7-24ad-4dbb-a378-4f8db84e1400", 00:15:43.107 "strip_size_kb": 0, 00:15:43.107 "state": "online", 00:15:43.107 "raid_level": "raid1", 00:15:43.107 "superblock": true, 00:15:43.107 "num_base_bdevs": 3, 00:15:43.107 "num_base_bdevs_discovered": 3, 00:15:43.107 "num_base_bdevs_operational": 3, 00:15:43.107 "base_bdevs_list": [ 00:15:43.107 { 00:15:43.107 "name": "NewBaseBdev", 00:15:43.107 "uuid": "2b1a4f2d-b86f-43e6-8d90-10e31d376787", 00:15:43.107 "is_configured": true, 00:15:43.107 "data_offset": 2048, 00:15:43.107 "data_size": 63488 00:15:43.107 }, 00:15:43.107 { 00:15:43.107 "name": "BaseBdev2", 00:15:43.107 "uuid": "ccfdfed5-ce0b-4348-b40f-f0035cd7a93a", 00:15:43.107 "is_configured": true, 00:15:43.107 "data_offset": 2048, 00:15:43.107 "data_size": 63488 00:15:43.107 }, 00:15:43.107 { 00:15:43.107 "name": "BaseBdev3", 00:15:43.107 "uuid": "cbf03ce5-91a9-44c0-9a5a-daff1c850d02", 00:15:43.107 "is_configured": true, 00:15:43.107 "data_offset": 2048, 00:15:43.107 "data_size": 63488 00:15:43.107 } 00:15:43.107 ] 00:15:43.107 } 00:15:43.107 } 00:15:43.107 }' 00:15:43.107 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:43.107 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:15:43.107 BaseBdev2 00:15:43.107 BaseBdev3' 00:15:43.107 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:43.107 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:15:43.107 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:43.366 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:43.366 "name": "NewBaseBdev", 00:15:43.366 "aliases": [ 00:15:43.366 "2b1a4f2d-b86f-43e6-8d90-10e31d376787" 00:15:43.366 ], 00:15:43.366 "product_name": "Malloc disk", 00:15:43.366 "block_size": 512, 00:15:43.366 "num_blocks": 65536, 00:15:43.366 "uuid": "2b1a4f2d-b86f-43e6-8d90-10e31d376787", 00:15:43.366 "assigned_rate_limits": { 00:15:43.366 "rw_ios_per_sec": 0, 00:15:43.366 "rw_mbytes_per_sec": 0, 00:15:43.366 "r_mbytes_per_sec": 0, 00:15:43.366 "w_mbytes_per_sec": 0 00:15:43.366 }, 00:15:43.366 "claimed": true, 00:15:43.366 "claim_type": "exclusive_write", 00:15:43.366 "zoned": false, 00:15:43.366 "supported_io_types": { 00:15:43.366 "read": true, 00:15:43.366 "write": true, 00:15:43.366 "unmap": true, 00:15:43.366 "write_zeroes": true, 00:15:43.366 "flush": true, 00:15:43.366 "reset": true, 00:15:43.366 "compare": false, 00:15:43.366 "compare_and_write": false, 00:15:43.366 "abort": true, 00:15:43.366 "nvme_admin": false, 00:15:43.366 "nvme_io": false 00:15:43.366 }, 00:15:43.366 "memory_domains": [ 00:15:43.366 { 00:15:43.366 "dma_device_id": "system", 00:15:43.366 "dma_device_type": 1 00:15:43.366 }, 00:15:43.366 { 00:15:43.366 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:43.366 "dma_device_type": 2 00:15:43.366 } 00:15:43.366 ], 00:15:43.366 "driver_specific": {} 00:15:43.366 }' 00:15:43.366 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.625 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:43.625 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:43.625 15:54:48 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.625 15:54:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:43.625 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:43.625 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.625 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:43.625 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:43.625 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.884 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:43.884 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:43.884 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:43.884 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:15:43.884 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:44.143 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:44.143 "name": "BaseBdev2", 00:15:44.143 "aliases": [ 00:15:44.143 "ccfdfed5-ce0b-4348-b40f-f0035cd7a93a" 00:15:44.143 ], 00:15:44.143 "product_name": "Malloc disk", 00:15:44.143 "block_size": 512, 00:15:44.143 "num_blocks": 65536, 00:15:44.143 "uuid": "ccfdfed5-ce0b-4348-b40f-f0035cd7a93a", 00:15:44.143 "assigned_rate_limits": { 00:15:44.143 "rw_ios_per_sec": 0, 00:15:44.143 "rw_mbytes_per_sec": 0, 00:15:44.143 "r_mbytes_per_sec": 0, 00:15:44.143 "w_mbytes_per_sec": 0 00:15:44.143 }, 00:15:44.143 "claimed": true, 
00:15:44.143 "claim_type": "exclusive_write", 00:15:44.143 "zoned": false, 00:15:44.143 "supported_io_types": { 00:15:44.143 "read": true, 00:15:44.143 "write": true, 00:15:44.143 "unmap": true, 00:15:44.143 "write_zeroes": true, 00:15:44.143 "flush": true, 00:15:44.143 "reset": true, 00:15:44.143 "compare": false, 00:15:44.143 "compare_and_write": false, 00:15:44.143 "abort": true, 00:15:44.143 "nvme_admin": false, 00:15:44.143 "nvme_io": false 00:15:44.143 }, 00:15:44.143 "memory_domains": [ 00:15:44.143 { 00:15:44.143 "dma_device_id": "system", 00:15:44.143 "dma_device_type": 1 00:15:44.143 }, 00:15:44.143 { 00:15:44.143 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:44.143 "dma_device_type": 2 00:15:44.143 } 00:15:44.143 ], 00:15:44.143 "driver_specific": {} 00:15:44.143 }' 00:15:44.143 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:44.143 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:44.143 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:44.143 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:44.143 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:44.143 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:44.143 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:44.403 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:44.403 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:44.403 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:44.403 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:44.403 15:54:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:44.403 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:44.403 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:15:44.403 15:54:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:44.662 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:44.662 "name": "BaseBdev3", 00:15:44.662 "aliases": [ 00:15:44.662 "cbf03ce5-91a9-44c0-9a5a-daff1c850d02" 00:15:44.662 ], 00:15:44.662 "product_name": "Malloc disk", 00:15:44.662 "block_size": 512, 00:15:44.662 "num_blocks": 65536, 00:15:44.662 "uuid": "cbf03ce5-91a9-44c0-9a5a-daff1c850d02", 00:15:44.662 "assigned_rate_limits": { 00:15:44.662 "rw_ios_per_sec": 0, 00:15:44.662 "rw_mbytes_per_sec": 0, 00:15:44.662 "r_mbytes_per_sec": 0, 00:15:44.662 "w_mbytes_per_sec": 0 00:15:44.662 }, 00:15:44.662 "claimed": true, 00:15:44.662 "claim_type": "exclusive_write", 00:15:44.662 "zoned": false, 00:15:44.662 "supported_io_types": { 00:15:44.662 "read": true, 00:15:44.662 "write": true, 00:15:44.662 "unmap": true, 00:15:44.662 "write_zeroes": true, 00:15:44.662 "flush": true, 00:15:44.662 "reset": true, 00:15:44.662 "compare": false, 00:15:44.662 "compare_and_write": false, 00:15:44.662 "abort": true, 00:15:44.662 "nvme_admin": false, 00:15:44.662 "nvme_io": false 00:15:44.662 }, 00:15:44.662 "memory_domains": [ 00:15:44.662 { 00:15:44.662 "dma_device_id": "system", 00:15:44.662 "dma_device_type": 1 00:15:44.662 }, 00:15:44.662 { 00:15:44.662 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:44.662 "dma_device_type": 2 00:15:44.662 } 00:15:44.662 ], 00:15:44.662 "driver_specific": {} 00:15:44.662 }' 00:15:44.662 15:54:50 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:44.662 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:44.921 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:44.921 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:44.921 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:44.921 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:44.921 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:44.921 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:44.921 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:44.921 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:44.921 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:45.180 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:45.180 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:15:45.439 [2024-06-10 15:54:50.703249] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:15:45.439 [2024-06-10 15:54:50.703274] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:45.439 [2024-06-10 15:54:50.703325] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:45.439 [2024-06-10 15:54:50.703594] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:45.439 [2024-06-10 15:54:50.703604] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8b2e40 name Existed_Raid, state offline 00:15:45.439 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2694727 00:15:45.439 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 2694727 ']' 00:15:45.439 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 2694727 00:15:45.439 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:15:45.439 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:15:45.439 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2694727 00:15:45.439 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:15:45.439 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:15:45.439 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2694727' 00:15:45.439 killing process with pid 2694727 00:15:45.439 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 2694727 00:15:45.439 [2024-06-10 15:54:50.771442] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:15:45.439 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 2694727 00:15:45.439 [2024-06-10 15:54:50.796485] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:15:45.698 15:54:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:15:45.698 00:15:45.698 real 0m28.660s 00:15:45.698 user 0m53.839s 00:15:45.698 sys 0m3.899s 00:15:45.698 15:54:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:15:45.698 15:54:50 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:15:45.698 ************************************ 00:15:45.698 END TEST raid_state_function_test_sb 00:15:45.698 ************************************ 00:15:45.698 15:54:51 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 3 00:15:45.698 15:54:51 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:15:45.698 15:54:51 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:15:45.698 15:54:51 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:15:45.698 ************************************ 00:15:45.698 START TEST raid_superblock_test 00:15:45.698 ************************************ 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 3 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=3 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 
-- # local strip_size_create_arg 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2700088 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2700088 /var/tmp/spdk-raid.sock 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 2700088 ']' 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:15:45.698 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:15:45.698 15:54:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:45.698 [2024-06-10 15:54:51.119739] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:15:45.698 [2024-06-10 15:54:51.119791] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2700088 ] 00:15:45.957 [2024-06-10 15:54:51.219179] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:15:45.957 [2024-06-10 15:54:51.312591] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:15:45.957 [2024-06-10 15:54:51.373560] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:45.957 [2024-06-10 15:54:51.373593] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:15:46.892 15:54:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:15:46.892 15:54:52 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:15:46.892 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:15:46.892 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:46.892 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:15:46.892 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:15:46.892 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:15:46.892 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:46.892 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:46.892 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:46.892 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:15:46.892 malloc1 00:15:46.892 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:47.150 [2024-06-10 15:54:52.588326] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:47.150 [2024-06-10 15:54:52.588371] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:47.150 [2024-06-10 15:54:52.588390] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x24710f0 00:15:47.150 [2024-06-10 15:54:52.588400] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:47.150 [2024-06-10 15:54:52.590128] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:47.150 [2024-06-10 15:54:52.590155] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:47.150 pt1 00:15:47.150 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:47.150 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:47.150 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:15:47.150 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:15:47.150 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:15:47.150 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:47.150 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:47.150 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:47.150 15:54:52 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:15:47.409 malloc2 00:15:47.409 15:54:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:47.668 [2024-06-10 15:54:53.102558] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:47.668 [2024-06-10 15:54:53.102603] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:47.668 [2024-06-10 15:54:53.102619] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2472400 00:15:47.668 [2024-06-10 15:54:53.102628] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:47.668 [2024-06-10 15:54:53.104194] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:47.668 [2024-06-10 15:54:53.104222] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:47.668 pt2 00:15:47.668 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:47.668 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:47.668 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:15:47.668 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:15:47.668 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:15:47.668 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:15:47.668 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:15:47.668 15:54:53 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:15:47.668 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:15:47.927 malloc3 00:15:47.927 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:48.186 [2024-06-10 15:54:53.624575] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:48.186 [2024-06-10 15:54:53.624619] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:48.186 [2024-06-10 15:54:53.624637] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x261e200 00:15:48.186 [2024-06-10 15:54:53.624646] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:48.186 [2024-06-10 15:54:53.626229] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:48.186 [2024-06-10 15:54:53.626259] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:48.186 pt3 00:15:48.186 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:15:48.186 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:15:48.186 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3' -n raid_bdev1 -s 00:15:48.445 [2024-06-10 15:54:53.881265] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:48.445 [2024-06-10 15:54:53.882609] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:48.445 [2024-06-10 15:54:53.882668] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:48.445 [2024-06-10 15:54:53.882834] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x261c9f0 00:15:48.445 [2024-06-10 15:54:53.882847] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:48.445 [2024-06-10 15:54:53.883058] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x261d830 00:15:48.445 [2024-06-10 15:54:53.883212] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x261c9f0 00:15:48.445 [2024-06-10 15:54:53.883221] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x261c9f0 00:15:48.445 [2024-06-10 15:54:53.883322] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:48.445 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:15:48.445 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:48.445 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:48.445 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:48.445 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:48.445 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:48.445 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:48.445 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:48.445 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:48.445 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:48.445 15:54:53 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:48.445 15:54:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:48.704 15:54:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:48.704 "name": "raid_bdev1", 00:15:48.704 "uuid": "4b24eb3e-7a6e-4346-914a-11501685e5c0", 00:15:48.704 "strip_size_kb": 0, 00:15:48.704 "state": "online", 00:15:48.704 "raid_level": "raid1", 00:15:48.704 "superblock": true, 00:15:48.704 "num_base_bdevs": 3, 00:15:48.704 "num_base_bdevs_discovered": 3, 00:15:48.704 "num_base_bdevs_operational": 3, 00:15:48.704 "base_bdevs_list": [ 00:15:48.704 { 00:15:48.704 "name": "pt1", 00:15:48.704 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:48.704 "is_configured": true, 00:15:48.704 "data_offset": 2048, 00:15:48.704 "data_size": 63488 00:15:48.704 }, 00:15:48.704 { 00:15:48.704 "name": "pt2", 00:15:48.704 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:48.704 "is_configured": true, 00:15:48.704 "data_offset": 2048, 00:15:48.704 "data_size": 63488 00:15:48.704 }, 00:15:48.704 { 00:15:48.704 "name": "pt3", 00:15:48.704 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:48.704 "is_configured": true, 00:15:48.704 "data_offset": 2048, 00:15:48.704 "data_size": 63488 00:15:48.704 } 00:15:48.704 ] 00:15:48.704 }' 00:15:48.704 15:54:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:48.704 15:54:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:49.640 15:54:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:15:49.640 15:54:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:49.640 15:54:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 
00:15:49.640 15:54:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:49.640 15:54:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:49.640 15:54:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:49.640 15:54:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:49.640 15:54:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:49.640 [2024-06-10 15:54:54.948356] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:49.640 15:54:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:49.640 "name": "raid_bdev1", 00:15:49.640 "aliases": [ 00:15:49.640 "4b24eb3e-7a6e-4346-914a-11501685e5c0" 00:15:49.640 ], 00:15:49.640 "product_name": "Raid Volume", 00:15:49.640 "block_size": 512, 00:15:49.640 "num_blocks": 63488, 00:15:49.640 "uuid": "4b24eb3e-7a6e-4346-914a-11501685e5c0", 00:15:49.640 "assigned_rate_limits": { 00:15:49.640 "rw_ios_per_sec": 0, 00:15:49.640 "rw_mbytes_per_sec": 0, 00:15:49.640 "r_mbytes_per_sec": 0, 00:15:49.640 "w_mbytes_per_sec": 0 00:15:49.640 }, 00:15:49.640 "claimed": false, 00:15:49.640 "zoned": false, 00:15:49.640 "supported_io_types": { 00:15:49.640 "read": true, 00:15:49.640 "write": true, 00:15:49.640 "unmap": false, 00:15:49.640 "write_zeroes": true, 00:15:49.640 "flush": false, 00:15:49.640 "reset": true, 00:15:49.640 "compare": false, 00:15:49.640 "compare_and_write": false, 00:15:49.640 "abort": false, 00:15:49.640 "nvme_admin": false, 00:15:49.640 "nvme_io": false 00:15:49.640 }, 00:15:49.641 "memory_domains": [ 00:15:49.641 { 00:15:49.641 "dma_device_id": "system", 00:15:49.641 "dma_device_type": 1 00:15:49.641 }, 00:15:49.641 { 00:15:49.641 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:49.641 
"dma_device_type": 2 00:15:49.641 }, 00:15:49.641 { 00:15:49.641 "dma_device_id": "system", 00:15:49.641 "dma_device_type": 1 00:15:49.641 }, 00:15:49.641 { 00:15:49.641 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:49.641 "dma_device_type": 2 00:15:49.641 }, 00:15:49.641 { 00:15:49.641 "dma_device_id": "system", 00:15:49.641 "dma_device_type": 1 00:15:49.641 }, 00:15:49.641 { 00:15:49.641 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:49.641 "dma_device_type": 2 00:15:49.641 } 00:15:49.641 ], 00:15:49.641 "driver_specific": { 00:15:49.641 "raid": { 00:15:49.641 "uuid": "4b24eb3e-7a6e-4346-914a-11501685e5c0", 00:15:49.641 "strip_size_kb": 0, 00:15:49.641 "state": "online", 00:15:49.641 "raid_level": "raid1", 00:15:49.641 "superblock": true, 00:15:49.641 "num_base_bdevs": 3, 00:15:49.641 "num_base_bdevs_discovered": 3, 00:15:49.641 "num_base_bdevs_operational": 3, 00:15:49.641 "base_bdevs_list": [ 00:15:49.641 { 00:15:49.641 "name": "pt1", 00:15:49.641 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:49.641 "is_configured": true, 00:15:49.641 "data_offset": 2048, 00:15:49.641 "data_size": 63488 00:15:49.641 }, 00:15:49.641 { 00:15:49.641 "name": "pt2", 00:15:49.641 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:49.641 "is_configured": true, 00:15:49.641 "data_offset": 2048, 00:15:49.641 "data_size": 63488 00:15:49.641 }, 00:15:49.641 { 00:15:49.641 "name": "pt3", 00:15:49.641 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:49.641 "is_configured": true, 00:15:49.641 "data_offset": 2048, 00:15:49.641 "data_size": 63488 00:15:49.641 } 00:15:49.641 ] 00:15:49.641 } 00:15:49.641 } 00:15:49.641 }' 00:15:49.641 15:54:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:49.641 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:49.641 pt2 00:15:49.641 pt3' 00:15:49.641 15:54:55 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:49.641 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:49.641 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:49.900 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:49.900 "name": "pt1", 00:15:49.900 "aliases": [ 00:15:49.900 "00000000-0000-0000-0000-000000000001" 00:15:49.900 ], 00:15:49.900 "product_name": "passthru", 00:15:49.900 "block_size": 512, 00:15:49.900 "num_blocks": 65536, 00:15:49.900 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:49.900 "assigned_rate_limits": { 00:15:49.900 "rw_ios_per_sec": 0, 00:15:49.900 "rw_mbytes_per_sec": 0, 00:15:49.900 "r_mbytes_per_sec": 0, 00:15:49.900 "w_mbytes_per_sec": 0 00:15:49.900 }, 00:15:49.900 "claimed": true, 00:15:49.900 "claim_type": "exclusive_write", 00:15:49.900 "zoned": false, 00:15:49.900 "supported_io_types": { 00:15:49.900 "read": true, 00:15:49.900 "write": true, 00:15:49.900 "unmap": true, 00:15:49.900 "write_zeroes": true, 00:15:49.900 "flush": true, 00:15:49.900 "reset": true, 00:15:49.900 "compare": false, 00:15:49.900 "compare_and_write": false, 00:15:49.900 "abort": true, 00:15:49.900 "nvme_admin": false, 00:15:49.900 "nvme_io": false 00:15:49.900 }, 00:15:49.900 "memory_domains": [ 00:15:49.900 { 00:15:49.900 "dma_device_id": "system", 00:15:49.900 "dma_device_type": 1 00:15:49.900 }, 00:15:49.900 { 00:15:49.900 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:49.900 "dma_device_type": 2 00:15:49.900 } 00:15:49.900 ], 00:15:49.900 "driver_specific": { 00:15:49.900 "passthru": { 00:15:49.900 "name": "pt1", 00:15:49.900 "base_bdev_name": "malloc1" 00:15:49.900 } 00:15:49.900 } 00:15:49.900 }' 00:15:49.900 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:49.900 15:54:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:49.900 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:49.900 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:50.160 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:50.160 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:50.160 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:50.160 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:50.160 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:50.160 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:50.160 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:50.160 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:50.160 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:50.160 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:50.160 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:50.461 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:50.461 "name": "pt2", 00:15:50.461 "aliases": [ 00:15:50.461 "00000000-0000-0000-0000-000000000002" 00:15:50.461 ], 00:15:50.461 "product_name": "passthru", 00:15:50.461 "block_size": 512, 00:15:50.461 "num_blocks": 65536, 00:15:50.461 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:50.461 "assigned_rate_limits": { 00:15:50.461 "rw_ios_per_sec": 0, 00:15:50.461 "rw_mbytes_per_sec": 0, 00:15:50.461 "r_mbytes_per_sec": 
0, 00:15:50.461 "w_mbytes_per_sec": 0 00:15:50.461 }, 00:15:50.461 "claimed": true, 00:15:50.461 "claim_type": "exclusive_write", 00:15:50.461 "zoned": false, 00:15:50.461 "supported_io_types": { 00:15:50.461 "read": true, 00:15:50.461 "write": true, 00:15:50.461 "unmap": true, 00:15:50.461 "write_zeroes": true, 00:15:50.461 "flush": true, 00:15:50.461 "reset": true, 00:15:50.461 "compare": false, 00:15:50.461 "compare_and_write": false, 00:15:50.461 "abort": true, 00:15:50.461 "nvme_admin": false, 00:15:50.461 "nvme_io": false 00:15:50.461 }, 00:15:50.461 "memory_domains": [ 00:15:50.461 { 00:15:50.461 "dma_device_id": "system", 00:15:50.461 "dma_device_type": 1 00:15:50.461 }, 00:15:50.461 { 00:15:50.461 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:50.461 "dma_device_type": 2 00:15:50.461 } 00:15:50.462 ], 00:15:50.462 "driver_specific": { 00:15:50.462 "passthru": { 00:15:50.462 "name": "pt2", 00:15:50.462 "base_bdev_name": "malloc2" 00:15:50.462 } 00:15:50.462 } 00:15:50.462 }' 00:15:50.462 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:50.462 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:50.726 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:50.726 15:54:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:50.726 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:50.726 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:50.726 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:50.726 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:50.726 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:50.726 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:50.726 
15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:50.985 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:50.985 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:50.985 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:50.985 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:51.244 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:51.244 "name": "pt3", 00:15:51.244 "aliases": [ 00:15:51.244 "00000000-0000-0000-0000-000000000003" 00:15:51.244 ], 00:15:51.244 "product_name": "passthru", 00:15:51.244 "block_size": 512, 00:15:51.244 "num_blocks": 65536, 00:15:51.244 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:51.244 "assigned_rate_limits": { 00:15:51.244 "rw_ios_per_sec": 0, 00:15:51.244 "rw_mbytes_per_sec": 0, 00:15:51.244 "r_mbytes_per_sec": 0, 00:15:51.244 "w_mbytes_per_sec": 0 00:15:51.244 }, 00:15:51.244 "claimed": true, 00:15:51.244 "claim_type": "exclusive_write", 00:15:51.244 "zoned": false, 00:15:51.244 "supported_io_types": { 00:15:51.244 "read": true, 00:15:51.244 "write": true, 00:15:51.244 "unmap": true, 00:15:51.244 "write_zeroes": true, 00:15:51.244 "flush": true, 00:15:51.244 "reset": true, 00:15:51.244 "compare": false, 00:15:51.244 "compare_and_write": false, 00:15:51.244 "abort": true, 00:15:51.244 "nvme_admin": false, 00:15:51.244 "nvme_io": false 00:15:51.244 }, 00:15:51.244 "memory_domains": [ 00:15:51.244 { 00:15:51.244 "dma_device_id": "system", 00:15:51.244 "dma_device_type": 1 00:15:51.244 }, 00:15:51.244 { 00:15:51.244 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:51.244 "dma_device_type": 2 00:15:51.244 } 00:15:51.244 ], 00:15:51.244 "driver_specific": { 00:15:51.244 "passthru": 
{ 00:15:51.244 "name": "pt3", 00:15:51.244 "base_bdev_name": "malloc3" 00:15:51.244 } 00:15:51.244 } 00:15:51.244 }' 00:15:51.244 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.244 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:51.244 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:51.244 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:51.244 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:51.244 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:51.244 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.244 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:51.503 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:51.503 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:51.503 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:51.503 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:51.503 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:51.503 15:54:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:15:51.761 [2024-06-10 15:54:57.110173] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:51.761 15:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=4b24eb3e-7a6e-4346-914a-11501685e5c0 00:15:51.762 15:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 4b24eb3e-7a6e-4346-914a-11501685e5c0 ']' 
00:15:51.762 15:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:15:52.021 [2024-06-10 15:54:57.366583] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:52.021 [2024-06-10 15:54:57.366603] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:15:52.021 [2024-06-10 15:54:57.366653] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:15:52.021 [2024-06-10 15:54:57.366720] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:15:52.021 [2024-06-10 15:54:57.366729] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x261c9f0 name raid_bdev1, state offline 00:15:52.021 15:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:52.021 15:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:15:52.280 15:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:15:52.280 15:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:15:52.280 15:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:52.280 15:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:52.539 15:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:52.539 15:54:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:52.798 15:54:58 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:15:52.798 15:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:15:53.057 15:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:15:53.057 15:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:15:53.316 15:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:15:53.316 15:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:53.316 15:54:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:15:53.316 15:54:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:53.316 15:54:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:53.316 15:54:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:15:53.316 15:54:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:53.316 15:54:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:15:53.316 15:54:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:53.316 15:54:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:15:53.316 15:54:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:15:53.316 15:54:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:15:53.316 15:54:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3' -n raid_bdev1 00:15:53.575 [2024-06-10 15:54:58.882560] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:15:53.575 [2024-06-10 15:54:58.883984] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:15:53.575 [2024-06-10 15:54:58.884027] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:15:53.575 [2024-06-10 15:54:58.884072] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:15:53.575 [2024-06-10 15:54:58.884109] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:15:53.575 [2024-06-10 15:54:58.884129] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:15:53.575 [2024-06-10 15:54:58.884144] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:15:53.575 [2024-06-10 15:54:58.884152] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x261a3a0 name raid_bdev1, state configuring 00:15:53.575 request: 00:15:53.575 { 00:15:53.575 "name": "raid_bdev1", 00:15:53.575 "raid_level": 
"raid1", 00:15:53.575 "base_bdevs": [ 00:15:53.575 "malloc1", 00:15:53.575 "malloc2", 00:15:53.575 "malloc3" 00:15:53.575 ], 00:15:53.575 "superblock": false, 00:15:53.575 "method": "bdev_raid_create", 00:15:53.575 "req_id": 1 00:15:53.575 } 00:15:53.575 Got JSON-RPC error response 00:15:53.575 response: 00:15:53.575 { 00:15:53.575 "code": -17, 00:15:53.575 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:15:53.575 } 00:15:53.575 15:54:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:15:53.575 15:54:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:15:53.575 15:54:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:15:53.575 15:54:58 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:15:53.575 15:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:53.575 15:54:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:15:53.835 15:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:15:53.835 15:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:15:53.835 15:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:15:54.094 [2024-06-10 15:54:59.387850] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:15:54.094 [2024-06-10 15:54:59.387892] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:54.094 [2024-06-10 15:54:59.387908] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2620320 00:15:54.094 [2024-06-10 15:54:59.387918] 
vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:54.094 [2024-06-10 15:54:59.389588] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:54.094 [2024-06-10 15:54:59.389623] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:15:54.094 [2024-06-10 15:54:59.389684] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:15:54.094 [2024-06-10 15:54:59.389710] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:15:54.094 pt1 00:15:54.094 15:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:15:54.094 15:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:54.094 15:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:54.094 15:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:54.094 15:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:54.094 15:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:54.094 15:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:54.094 15:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:54.094 15:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:54.094 15:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:54.094 15:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:54.094 15:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:15:54.353 15:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:54.353 "name": "raid_bdev1", 00:15:54.353 "uuid": "4b24eb3e-7a6e-4346-914a-11501685e5c0", 00:15:54.353 "strip_size_kb": 0, 00:15:54.353 "state": "configuring", 00:15:54.353 "raid_level": "raid1", 00:15:54.353 "superblock": true, 00:15:54.353 "num_base_bdevs": 3, 00:15:54.353 "num_base_bdevs_discovered": 1, 00:15:54.353 "num_base_bdevs_operational": 3, 00:15:54.353 "base_bdevs_list": [ 00:15:54.353 { 00:15:54.353 "name": "pt1", 00:15:54.353 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:54.353 "is_configured": true, 00:15:54.353 "data_offset": 2048, 00:15:54.353 "data_size": 63488 00:15:54.353 }, 00:15:54.353 { 00:15:54.353 "name": null, 00:15:54.353 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:54.353 "is_configured": false, 00:15:54.353 "data_offset": 2048, 00:15:54.353 "data_size": 63488 00:15:54.353 }, 00:15:54.353 { 00:15:54.353 "name": null, 00:15:54.353 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:54.353 "is_configured": false, 00:15:54.353 "data_offset": 2048, 00:15:54.353 "data_size": 63488 00:15:54.353 } 00:15:54.353 ] 00:15:54.353 }' 00:15:54.353 15:54:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:54.353 15:54:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:54.922 15:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 3 -gt 2 ']' 00:15:54.922 15:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:55.180 [2024-06-10 15:55:00.486800] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:55.180 [2024-06-10 15:55:00.486845] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:55.180 
[2024-06-10 15:55:00.486862] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x261fd20 00:15:55.180 [2024-06-10 15:55:00.486872] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:55.180 [2024-06-10 15:55:00.487213] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:55.180 [2024-06-10 15:55:00.487231] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:55.180 [2024-06-10 15:55:00.487288] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:55.180 [2024-06-10 15:55:00.487307] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:55.180 pt2 00:15:55.180 15:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:15:55.439 [2024-06-10 15:55:00.739496] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:15:55.440 15:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:15:55.440 15:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:55.440 15:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:15:55.440 15:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:55.440 15:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:55.440 15:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:55.440 15:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:55.440 15:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:55.440 15:55:00 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:55.440 15:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:55.440 15:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:55.440 15:55:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:55.699 15:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:55.699 "name": "raid_bdev1", 00:15:55.699 "uuid": "4b24eb3e-7a6e-4346-914a-11501685e5c0", 00:15:55.699 "strip_size_kb": 0, 00:15:55.699 "state": "configuring", 00:15:55.699 "raid_level": "raid1", 00:15:55.699 "superblock": true, 00:15:55.699 "num_base_bdevs": 3, 00:15:55.699 "num_base_bdevs_discovered": 1, 00:15:55.699 "num_base_bdevs_operational": 3, 00:15:55.699 "base_bdevs_list": [ 00:15:55.699 { 00:15:55.699 "name": "pt1", 00:15:55.699 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:55.699 "is_configured": true, 00:15:55.699 "data_offset": 2048, 00:15:55.699 "data_size": 63488 00:15:55.699 }, 00:15:55.699 { 00:15:55.699 "name": null, 00:15:55.699 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:55.699 "is_configured": false, 00:15:55.699 "data_offset": 2048, 00:15:55.699 "data_size": 63488 00:15:55.699 }, 00:15:55.699 { 00:15:55.699 "name": null, 00:15:55.699 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:55.699 "is_configured": false, 00:15:55.699 "data_offset": 2048, 00:15:55.699 "data_size": 63488 00:15:55.699 } 00:15:55.699 ] 00:15:55.699 }' 00:15:55.699 15:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:55.699 15:55:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:56.267 15:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:15:56.267 15:55:01 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:56.267 15:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:15:56.267 [2024-06-10 15:55:01.742172] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:15:56.267 [2024-06-10 15:55:01.742220] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:15:56.267 [2024-06-10 15:55:01.742236] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x261f240 00:15:56.267 [2024-06-10 15:55:01.742246] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:56.267 [2024-06-10 15:55:01.742593] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:56.267 [2024-06-10 15:55:01.742608] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:15:56.267 [2024-06-10 15:55:01.742666] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:15:56.267 [2024-06-10 15:55:01.742683] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:15:56.267 pt2 00:15:56.267 15:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:56.267 15:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:56.267 15:55:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:15:56.527 [2024-06-10 15:55:01.998850] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:15:56.527 [2024-06-10 15:55:01.998892] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:15:56.527 [2024-06-10 15:55:01.998908] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2620b60 00:15:56.527 [2024-06-10 15:55:01.998917] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:15:56.527 [2024-06-10 15:55:01.999261] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:15:56.527 [2024-06-10 15:55:01.999277] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:15:56.527 [2024-06-10 15:55:01.999333] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:15:56.527 [2024-06-10 15:55:01.999351] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:15:56.527 [2024-06-10 15:55:01.999465] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x261f700 00:15:56.527 [2024-06-10 15:55:01.999474] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:15:56.527 [2024-06-10 15:55:01.999647] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26222d0 00:15:56.527 [2024-06-10 15:55:01.999782] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x261f700 00:15:56.527 [2024-06-10 15:55:01.999791] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x261f700 00:15:56.527 [2024-06-10 15:55:01.999890] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:15:56.527 pt3 00:15:56.527 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:15:56.527 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:15:56.527 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:15:56.527 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:56.527 15:55:02 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:56.527 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:56.527 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:56.527 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:15:56.527 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:56.527 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:56.527 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:56.527 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:56.527 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:56.527 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:15:56.791 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:15:56.791 "name": "raid_bdev1", 00:15:56.791 "uuid": "4b24eb3e-7a6e-4346-914a-11501685e5c0", 00:15:56.791 "strip_size_kb": 0, 00:15:56.791 "state": "online", 00:15:56.791 "raid_level": "raid1", 00:15:56.791 "superblock": true, 00:15:56.791 "num_base_bdevs": 3, 00:15:56.791 "num_base_bdevs_discovered": 3, 00:15:56.791 "num_base_bdevs_operational": 3, 00:15:56.791 "base_bdevs_list": [ 00:15:56.791 { 00:15:56.791 "name": "pt1", 00:15:56.791 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:56.791 "is_configured": true, 00:15:56.791 "data_offset": 2048, 00:15:56.791 "data_size": 63488 00:15:56.791 }, 00:15:56.791 { 00:15:56.791 "name": "pt2", 00:15:56.791 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:56.791 
"is_configured": true, 00:15:56.791 "data_offset": 2048, 00:15:56.791 "data_size": 63488 00:15:56.791 }, 00:15:56.791 { 00:15:56.791 "name": "pt3", 00:15:56.791 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:56.791 "is_configured": true, 00:15:56.791 "data_offset": 2048, 00:15:56.791 "data_size": 63488 00:15:56.791 } 00:15:56.791 ] 00:15:56.791 }' 00:15:56.791 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:15:56.791 15:55:02 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:15:57.362 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:15:57.621 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:15:57.621 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:15:57.621 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:15:57.621 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:15:57.621 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:15:57.621 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:57.621 15:55:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:15:57.621 [2024-06-10 15:55:03.021843] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:57.621 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:15:57.621 "name": "raid_bdev1", 00:15:57.621 "aliases": [ 00:15:57.621 "4b24eb3e-7a6e-4346-914a-11501685e5c0" 00:15:57.621 ], 00:15:57.621 "product_name": "Raid Volume", 00:15:57.621 "block_size": 512, 00:15:57.621 "num_blocks": 63488, 00:15:57.621 "uuid": 
"4b24eb3e-7a6e-4346-914a-11501685e5c0", 00:15:57.621 "assigned_rate_limits": { 00:15:57.621 "rw_ios_per_sec": 0, 00:15:57.621 "rw_mbytes_per_sec": 0, 00:15:57.621 "r_mbytes_per_sec": 0, 00:15:57.621 "w_mbytes_per_sec": 0 00:15:57.621 }, 00:15:57.621 "claimed": false, 00:15:57.621 "zoned": false, 00:15:57.621 "supported_io_types": { 00:15:57.621 "read": true, 00:15:57.621 "write": true, 00:15:57.621 "unmap": false, 00:15:57.621 "write_zeroes": true, 00:15:57.621 "flush": false, 00:15:57.621 "reset": true, 00:15:57.621 "compare": false, 00:15:57.621 "compare_and_write": false, 00:15:57.621 "abort": false, 00:15:57.621 "nvme_admin": false, 00:15:57.621 "nvme_io": false 00:15:57.621 }, 00:15:57.621 "memory_domains": [ 00:15:57.621 { 00:15:57.621 "dma_device_id": "system", 00:15:57.621 "dma_device_type": 1 00:15:57.621 }, 00:15:57.621 { 00:15:57.621 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:57.621 "dma_device_type": 2 00:15:57.621 }, 00:15:57.621 { 00:15:57.621 "dma_device_id": "system", 00:15:57.621 "dma_device_type": 1 00:15:57.621 }, 00:15:57.621 { 00:15:57.621 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:57.621 "dma_device_type": 2 00:15:57.621 }, 00:15:57.621 { 00:15:57.621 "dma_device_id": "system", 00:15:57.621 "dma_device_type": 1 00:15:57.621 }, 00:15:57.621 { 00:15:57.621 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:57.621 "dma_device_type": 2 00:15:57.621 } 00:15:57.621 ], 00:15:57.621 "driver_specific": { 00:15:57.621 "raid": { 00:15:57.621 "uuid": "4b24eb3e-7a6e-4346-914a-11501685e5c0", 00:15:57.621 "strip_size_kb": 0, 00:15:57.621 "state": "online", 00:15:57.621 "raid_level": "raid1", 00:15:57.621 "superblock": true, 00:15:57.621 "num_base_bdevs": 3, 00:15:57.621 "num_base_bdevs_discovered": 3, 00:15:57.621 "num_base_bdevs_operational": 3, 00:15:57.621 "base_bdevs_list": [ 00:15:57.621 { 00:15:57.621 "name": "pt1", 00:15:57.621 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:57.621 "is_configured": true, 00:15:57.621 "data_offset": 2048, 
00:15:57.621 "data_size": 63488 00:15:57.621 }, 00:15:57.621 { 00:15:57.621 "name": "pt2", 00:15:57.621 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:57.621 "is_configured": true, 00:15:57.621 "data_offset": 2048, 00:15:57.621 "data_size": 63488 00:15:57.621 }, 00:15:57.621 { 00:15:57.621 "name": "pt3", 00:15:57.621 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:57.621 "is_configured": true, 00:15:57.621 "data_offset": 2048, 00:15:57.621 "data_size": 63488 00:15:57.621 } 00:15:57.621 ] 00:15:57.621 } 00:15:57.621 } 00:15:57.621 }' 00:15:57.621 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:15:57.621 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:15:57.621 pt2 00:15:57.621 pt3' 00:15:57.621 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:57.621 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:15:57.621 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:57.879 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:57.879 "name": "pt1", 00:15:57.879 "aliases": [ 00:15:57.879 "00000000-0000-0000-0000-000000000001" 00:15:57.879 ], 00:15:57.879 "product_name": "passthru", 00:15:57.879 "block_size": 512, 00:15:57.879 "num_blocks": 65536, 00:15:57.879 "uuid": "00000000-0000-0000-0000-000000000001", 00:15:57.879 "assigned_rate_limits": { 00:15:57.879 "rw_ios_per_sec": 0, 00:15:57.879 "rw_mbytes_per_sec": 0, 00:15:57.879 "r_mbytes_per_sec": 0, 00:15:57.879 "w_mbytes_per_sec": 0 00:15:57.879 }, 00:15:57.879 "claimed": true, 00:15:57.879 "claim_type": "exclusive_write", 00:15:57.879 "zoned": false, 00:15:57.879 "supported_io_types": { 
00:15:57.879 "read": true, 00:15:57.879 "write": true, 00:15:57.879 "unmap": true, 00:15:57.879 "write_zeroes": true, 00:15:57.879 "flush": true, 00:15:57.879 "reset": true, 00:15:57.879 "compare": false, 00:15:57.879 "compare_and_write": false, 00:15:57.879 "abort": true, 00:15:57.879 "nvme_admin": false, 00:15:57.879 "nvme_io": false 00:15:57.879 }, 00:15:57.879 "memory_domains": [ 00:15:57.879 { 00:15:57.879 "dma_device_id": "system", 00:15:57.879 "dma_device_type": 1 00:15:57.879 }, 00:15:57.879 { 00:15:57.879 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:57.879 "dma_device_type": 2 00:15:57.879 } 00:15:57.879 ], 00:15:57.879 "driver_specific": { 00:15:57.879 "passthru": { 00:15:57.879 "name": "pt1", 00:15:57.879 "base_bdev_name": "malloc1" 00:15:57.879 } 00:15:57.879 } 00:15:57.879 }' 00:15:57.879 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:58.138 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:58.138 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:58.138 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:58.138 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:58.138 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:58.138 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:58.138 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:58.138 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:58.138 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:58.138 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:58.397 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 
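The `jq .block_size` / `jq .md_size` / `[[ 512 == 512 ]]` lines above are the per-base-bdev property checks from bdev_raid.sh (lines 205-208 of the script). As a minimal, self-contained sketch of that pattern — using a trimmed sample JSON in place of live `rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1` output, so the field values here are illustrative:

```shell
#!/usr/bin/env bash
# Sketch of the jq-extract-then-bash-compare pattern used by the test above.
# base_bdev_info is a trimmed stand-in for real bdev_get_bdevs output.
base_bdev_info='{
  "name": "pt1",
  "product_name": "passthru",
  "block_size": 512,
  "num_blocks": 65536
}'

# jq prints "null" for absent keys, which the test relies on for md_size,
# md_interleave and dif_type on a plain passthru bdev.
block_size=$(echo "$base_bdev_info" | jq .block_size)
md_size=$(echo "$base_bdev_info" | jq .md_size)

[[ $block_size == 512 ]] && echo "block_size ok"
[[ $md_size == null ]] && echo "md_size ok"
```

The same extract-and-compare shape repeats for each of pt1, pt2 and pt3 in the log.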
00:15:58.397 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:58.397 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:15:58.397 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:58.655 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:58.655 "name": "pt2", 00:15:58.655 "aliases": [ 00:15:58.655 "00000000-0000-0000-0000-000000000002" 00:15:58.655 ], 00:15:58.655 "product_name": "passthru", 00:15:58.655 "block_size": 512, 00:15:58.655 "num_blocks": 65536, 00:15:58.655 "uuid": "00000000-0000-0000-0000-000000000002", 00:15:58.655 "assigned_rate_limits": { 00:15:58.655 "rw_ios_per_sec": 0, 00:15:58.655 "rw_mbytes_per_sec": 0, 00:15:58.655 "r_mbytes_per_sec": 0, 00:15:58.655 "w_mbytes_per_sec": 0 00:15:58.655 }, 00:15:58.655 "claimed": true, 00:15:58.655 "claim_type": "exclusive_write", 00:15:58.655 "zoned": false, 00:15:58.655 "supported_io_types": { 00:15:58.655 "read": true, 00:15:58.655 "write": true, 00:15:58.655 "unmap": true, 00:15:58.655 "write_zeroes": true, 00:15:58.655 "flush": true, 00:15:58.655 "reset": true, 00:15:58.655 "compare": false, 00:15:58.655 "compare_and_write": false, 00:15:58.655 "abort": true, 00:15:58.655 "nvme_admin": false, 00:15:58.655 "nvme_io": false 00:15:58.655 }, 00:15:58.655 "memory_domains": [ 00:15:58.655 { 00:15:58.655 "dma_device_id": "system", 00:15:58.655 "dma_device_type": 1 00:15:58.655 }, 00:15:58.655 { 00:15:58.655 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:58.655 "dma_device_type": 2 00:15:58.655 } 00:15:58.655 ], 00:15:58.655 "driver_specific": { 00:15:58.655 "passthru": { 00:15:58.655 "name": "pt2", 00:15:58.655 "base_bdev_name": "malloc2" 00:15:58.655 } 00:15:58.655 } 00:15:58.655 }' 00:15:58.655 15:55:03 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:58.655 15:55:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:58.655 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:58.655 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:58.655 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:58.655 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:58.655 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:58.655 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:58.913 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:58.913 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:58.914 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:58.914 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:58.914 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:15:58.914 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:15:58.914 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:15:59.172 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:15:59.172 "name": "pt3", 00:15:59.172 "aliases": [ 00:15:59.172 "00000000-0000-0000-0000-000000000003" 00:15:59.172 ], 00:15:59.172 "product_name": "passthru", 00:15:59.172 "block_size": 512, 00:15:59.172 "num_blocks": 65536, 00:15:59.172 "uuid": "00000000-0000-0000-0000-000000000003", 00:15:59.172 "assigned_rate_limits": { 00:15:59.172 "rw_ios_per_sec": 0, 
00:15:59.172 "rw_mbytes_per_sec": 0, 00:15:59.172 "r_mbytes_per_sec": 0, 00:15:59.172 "w_mbytes_per_sec": 0 00:15:59.172 }, 00:15:59.172 "claimed": true, 00:15:59.172 "claim_type": "exclusive_write", 00:15:59.172 "zoned": false, 00:15:59.172 "supported_io_types": { 00:15:59.172 "read": true, 00:15:59.172 "write": true, 00:15:59.172 "unmap": true, 00:15:59.172 "write_zeroes": true, 00:15:59.172 "flush": true, 00:15:59.172 "reset": true, 00:15:59.172 "compare": false, 00:15:59.172 "compare_and_write": false, 00:15:59.172 "abort": true, 00:15:59.172 "nvme_admin": false, 00:15:59.172 "nvme_io": false 00:15:59.172 }, 00:15:59.172 "memory_domains": [ 00:15:59.172 { 00:15:59.172 "dma_device_id": "system", 00:15:59.172 "dma_device_type": 1 00:15:59.172 }, 00:15:59.172 { 00:15:59.172 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:15:59.172 "dma_device_type": 2 00:15:59.172 } 00:15:59.172 ], 00:15:59.172 "driver_specific": { 00:15:59.172 "passthru": { 00:15:59.172 "name": "pt3", 00:15:59.172 "base_bdev_name": "malloc3" 00:15:59.172 } 00:15:59.172 } 00:15:59.172 }' 00:15:59.172 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:59.172 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:15:59.172 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:15:59.172 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:59.172 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:15:59.430 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:15:59.430 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:59.430 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:15:59.430 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:15:59.430 15:55:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:59.430 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:15:59.430 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:15:59.430 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:15:59.430 15:55:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:15:59.688 [2024-06-10 15:55:05.147547] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:15:59.688 15:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 4b24eb3e-7a6e-4346-914a-11501685e5c0 '!=' 4b24eb3e-7a6e-4346-914a-11501685e5c0 ']' 00:15:59.688 15:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:15:59.688 15:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:15:59.688 15:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:15:59.688 15:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:15:59.947 [2024-06-10 15:55:05.408016] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:15:59.947 15:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:15:59.947 15:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:15:59.947 15:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:15:59.947 15:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:15:59.947 15:55:05 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:15:59.947 15:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:15:59.947 15:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:15:59.947 15:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:15:59.947 15:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:15:59.947 15:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:15:59.947 15:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:15:59.947 15:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:00.206 15:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:00.206 "name": "raid_bdev1", 00:16:00.206 "uuid": "4b24eb3e-7a6e-4346-914a-11501685e5c0", 00:16:00.206 "strip_size_kb": 0, 00:16:00.206 "state": "online", 00:16:00.206 "raid_level": "raid1", 00:16:00.206 "superblock": true, 00:16:00.206 "num_base_bdevs": 3, 00:16:00.206 "num_base_bdevs_discovered": 2, 00:16:00.206 "num_base_bdevs_operational": 2, 00:16:00.206 "base_bdevs_list": [ 00:16:00.206 { 00:16:00.206 "name": null, 00:16:00.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:00.206 "is_configured": false, 00:16:00.206 "data_offset": 2048, 00:16:00.206 "data_size": 63488 00:16:00.206 }, 00:16:00.206 { 00:16:00.206 "name": "pt2", 00:16:00.206 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:00.206 "is_configured": true, 00:16:00.206 "data_offset": 2048, 00:16:00.206 "data_size": 63488 00:16:00.206 }, 00:16:00.206 { 00:16:00.206 "name": "pt3", 00:16:00.206 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:00.206 "is_configured": true, 00:16:00.206 
"data_offset": 2048, 00:16:00.206 "data_size": 63488 00:16:00.206 } 00:16:00.206 ] 00:16:00.206 }' 00:16:00.206 15:55:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:00.206 15:55:05 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:00.772 15:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:01.064 [2024-06-10 15:55:06.398762] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:01.064 [2024-06-10 15:55:06.398787] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:01.064 [2024-06-10 15:55:06.398838] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:01.064 [2024-06-10 15:55:06.398893] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:01.064 [2024-06-10 15:55:06.398902] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x261f700 name raid_bdev1, state offline 00:16:01.064 15:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:01.064 15:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:16:01.323 15:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:16:01.323 15:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:16:01.323 15:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:16:01.323 15:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:16:01.323 15:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:16:01.582 15:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:16:01.582 15:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:16:01.582 15:55:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:01.841 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:16:01.841 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:16:01.841 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:16:01.841 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:16:01.841 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:16:01.841 [2024-06-10 15:55:07.345227] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:16:01.841 [2024-06-10 15:55:07.345267] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:01.841 [2024-06-10 15:55:07.345282] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x261f470 00:16:01.841 [2024-06-10 15:55:07.345292] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:01.841 [2024-06-10 15:55:07.346974] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:01.841 [2024-06-10 15:55:07.347001] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:16:01.841 [2024-06-10 15:55:07.347059] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:16:01.841 [2024-06-10 15:55:07.347083] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:01.841 pt2 00:16:02.100 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:16:02.100 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:02.100 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:02.100 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:02.100 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:02.100 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:02.100 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:02.100 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:02.100 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:02.100 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:02.100 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.100 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:02.100 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:02.100 "name": "raid_bdev1", 00:16:02.100 "uuid": "4b24eb3e-7a6e-4346-914a-11501685e5c0", 00:16:02.100 "strip_size_kb": 0, 00:16:02.100 "state": "configuring", 00:16:02.100 "raid_level": "raid1", 00:16:02.100 "superblock": true, 00:16:02.100 "num_base_bdevs": 3, 00:16:02.100 "num_base_bdevs_discovered": 1, 00:16:02.100 "num_base_bdevs_operational": 2, 
00:16:02.100 "base_bdevs_list": [ 00:16:02.100 { 00:16:02.100 "name": null, 00:16:02.100 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:02.100 "is_configured": false, 00:16:02.100 "data_offset": 2048, 00:16:02.100 "data_size": 63488 00:16:02.100 }, 00:16:02.100 { 00:16:02.100 "name": "pt2", 00:16:02.100 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:02.100 "is_configured": true, 00:16:02.100 "data_offset": 2048, 00:16:02.100 "data_size": 63488 00:16:02.100 }, 00:16:02.100 { 00:16:02.100 "name": null, 00:16:02.100 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:02.100 "is_configured": false, 00:16:02.100 "data_offset": 2048, 00:16:02.100 "data_size": 63488 00:16:02.100 } 00:16:02.100 ] 00:16:02.100 }' 00:16:02.100 15:55:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:02.100 15:55:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:02.667 15:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:16:02.667 15:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:16:02.667 15:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=2 00:16:02.667 15:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:02.926 [2024-06-10 15:55:08.376008] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:02.926 [2024-06-10 15:55:08.376050] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:02.926 [2024-06-10 15:55:08.376067] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x246ffc0 00:16:02.926 [2024-06-10 15:55:08.376077] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:02.926 [2024-06-10 15:55:08.376410] 
vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:02.926 [2024-06-10 15:55:08.376426] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:02.926 [2024-06-10 15:55:08.376480] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:02.926 [2024-06-10 15:55:08.376498] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:02.926 [2024-06-10 15:55:08.376602] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x246f960 00:16:02.926 [2024-06-10 15:55:08.376611] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:02.926 [2024-06-10 15:55:08.376781] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x261c220 00:16:02.926 [2024-06-10 15:55:08.376912] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x246f960 00:16:02.926 [2024-06-10 15:55:08.376921] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x246f960 00:16:02.926 [2024-06-10 15:55:08.377035] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:02.926 pt3 00:16:02.926 15:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:16:02.926 15:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:02.926 15:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:02.926 15:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:02.926 15:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:02.926 15:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:02.926 15:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local 
raid_bdev_info 00:16:02.926 15:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:02.926 15:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:02.926 15:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:02.926 15:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:02.926 15:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:03.186 15:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:03.186 "name": "raid_bdev1", 00:16:03.186 "uuid": "4b24eb3e-7a6e-4346-914a-11501685e5c0", 00:16:03.186 "strip_size_kb": 0, 00:16:03.186 "state": "online", 00:16:03.186 "raid_level": "raid1", 00:16:03.186 "superblock": true, 00:16:03.186 "num_base_bdevs": 3, 00:16:03.186 "num_base_bdevs_discovered": 2, 00:16:03.186 "num_base_bdevs_operational": 2, 00:16:03.186 "base_bdevs_list": [ 00:16:03.186 { 00:16:03.186 "name": null, 00:16:03.186 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:03.186 "is_configured": false, 00:16:03.186 "data_offset": 2048, 00:16:03.186 "data_size": 63488 00:16:03.186 }, 00:16:03.186 { 00:16:03.186 "name": "pt2", 00:16:03.186 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:03.186 "is_configured": true, 00:16:03.186 "data_offset": 2048, 00:16:03.186 "data_size": 63488 00:16:03.186 }, 00:16:03.186 { 00:16:03.186 "name": "pt3", 00:16:03.186 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:03.186 "is_configured": true, 00:16:03.186 "data_offset": 2048, 00:16:03.186 "data_size": 63488 00:16:03.186 } 00:16:03.186 ] 00:16:03.186 }' 00:16:03.186 15:55:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:03.186 15:55:08 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@10 -- # set +x 00:16:04.124 15:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:04.124 [2024-06-10 15:55:09.446856] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:04.124 [2024-06-10 15:55:09.446881] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:04.124 [2024-06-10 15:55:09.446930] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:04.124 [2024-06-10 15:55:09.446991] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:04.124 [2024-06-10 15:55:09.447001] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x246f960 name raid_bdev1, state offline 00:16:04.124 15:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.124 15:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:16:04.382 15:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:16:04.382 15:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:16:04.382 15:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 3 -gt 2 ']' 00:16:04.382 15:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=2 00:16:04.382 15:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:16:04.688 15:55:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 
00000000-0000-0000-0000-000000000001 00:16:04.946 [2024-06-10 15:55:10.228909] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:16:04.947 [2024-06-10 15:55:10.228951] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:04.947 [2024-06-10 15:55:10.228974] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x261b1b0 00:16:04.947 [2024-06-10 15:55:10.228984] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:04.947 [2024-06-10 15:55:10.230662] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:04.947 [2024-06-10 15:55:10.230690] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:16:04.947 [2024-06-10 15:55:10.230750] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:16:04.947 [2024-06-10 15:55:10.230775] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:16:04.947 [2024-06-10 15:55:10.230872] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:16:04.947 [2024-06-10 15:55:10.230883] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:04.947 [2024-06-10 15:55:10.230895] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2620c10 name raid_bdev1, state configuring 00:16:04.947 [2024-06-10 15:55:10.230917] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:16:04.947 pt1 00:16:04.947 15:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 3 -gt 2 ']' 00:16:04.947 15:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:16:04.947 15:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:04.947 15:55:10 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:04.947 15:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:04.947 15:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:04.947 15:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:04.947 15:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:04.947 15:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:04.947 15:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:04.947 15:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:04.947 15:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:04.947 15:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:05.206 15:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:05.206 "name": "raid_bdev1", 00:16:05.206 "uuid": "4b24eb3e-7a6e-4346-914a-11501685e5c0", 00:16:05.206 "strip_size_kb": 0, 00:16:05.206 "state": "configuring", 00:16:05.206 "raid_level": "raid1", 00:16:05.206 "superblock": true, 00:16:05.206 "num_base_bdevs": 3, 00:16:05.206 "num_base_bdevs_discovered": 1, 00:16:05.206 "num_base_bdevs_operational": 2, 00:16:05.206 "base_bdevs_list": [ 00:16:05.206 { 00:16:05.206 "name": null, 00:16:05.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:05.206 "is_configured": false, 00:16:05.206 "data_offset": 2048, 00:16:05.206 "data_size": 63488 00:16:05.206 }, 00:16:05.206 { 00:16:05.206 "name": "pt2", 00:16:05.206 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:05.206 "is_configured": true, 00:16:05.206 
"data_offset": 2048, 00:16:05.206 "data_size": 63488 00:16:05.206 }, 00:16:05.206 { 00:16:05.206 "name": null, 00:16:05.206 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:05.206 "is_configured": false, 00:16:05.206 "data_offset": 2048, 00:16:05.206 "data_size": 63488 00:16:05.206 } 00:16:05.206 ] 00:16:05.206 }' 00:16:05.206 15:55:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:05.206 15:55:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:05.774 15:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:16:05.774 15:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:16:05.774 15:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:16:05.774 15:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:16:06.033 [2024-06-10 15:55:11.508327] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:16:06.033 [2024-06-10 15:55:11.508371] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:06.033 [2024-06-10 15:55:11.508390] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26214e0 00:16:06.033 [2024-06-10 15:55:11.508399] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:06.033 [2024-06-10 15:55:11.508742] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:06.033 [2024-06-10 15:55:11.508758] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:16:06.033 [2024-06-10 15:55:11.508812] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:16:06.033 [2024-06-10 15:55:11.508830] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:16:06.033 [2024-06-10 15:55:11.508927] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x26208f0 00:16:06.033 [2024-06-10 15:55:11.508936] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:06.033 [2024-06-10 15:55:11.509121] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x261c9c0 00:16:06.033 [2024-06-10 15:55:11.509253] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x26208f0 00:16:06.033 [2024-06-10 15:55:11.509262] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x26208f0 00:16:06.033 [2024-06-10 15:55:11.509361] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:06.033 pt3 00:16:06.033 15:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:16:06.033 15:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:06.033 15:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:06.033 15:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:06.033 15:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:06.033 15:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:06.033 15:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:06.033 15:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:06.033 15:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:06.033 
15:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:06.033 15:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:06.033 15:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:06.292 15:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:06.292 "name": "raid_bdev1", 00:16:06.292 "uuid": "4b24eb3e-7a6e-4346-914a-11501685e5c0", 00:16:06.292 "strip_size_kb": 0, 00:16:06.292 "state": "online", 00:16:06.292 "raid_level": "raid1", 00:16:06.292 "superblock": true, 00:16:06.292 "num_base_bdevs": 3, 00:16:06.292 "num_base_bdevs_discovered": 2, 00:16:06.292 "num_base_bdevs_operational": 2, 00:16:06.292 "base_bdevs_list": [ 00:16:06.292 { 00:16:06.292 "name": null, 00:16:06.292 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:06.292 "is_configured": false, 00:16:06.292 "data_offset": 2048, 00:16:06.292 "data_size": 63488 00:16:06.292 }, 00:16:06.292 { 00:16:06.292 "name": "pt2", 00:16:06.292 "uuid": "00000000-0000-0000-0000-000000000002", 00:16:06.292 "is_configured": true, 00:16:06.292 "data_offset": 2048, 00:16:06.292 "data_size": 63488 00:16:06.292 }, 00:16:06.292 { 00:16:06.292 "name": "pt3", 00:16:06.292 "uuid": "00000000-0000-0000-0000-000000000003", 00:16:06.292 "is_configured": true, 00:16:06.292 "data_offset": 2048, 00:16:06.292 "data_size": 63488 00:16:06.292 } 00:16:06.292 ] 00:16:06.292 }' 00:16:06.292 15:55:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:06.292 15:55:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:06.860 15:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:16:06.860 
15:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:16:07.118 15:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:16:07.118 15:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:16:07.118 15:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:16:07.377 [2024-06-10 15:55:12.679872] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:07.377 15:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 4b24eb3e-7a6e-4346-914a-11501685e5c0 '!=' 4b24eb3e-7a6e-4346-914a-11501685e5c0 ']' 00:16:07.377 15:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2700088 00:16:07.377 15:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 2700088 ']' 00:16:07.377 15:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 2700088 00:16:07.377 15:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:16:07.377 15:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:16:07.377 15:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2700088 00:16:07.377 15:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:16:07.377 15:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:16:07.377 15:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2700088' 00:16:07.377 killing process with pid 2700088 00:16:07.377 15:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 2700088 00:16:07.377 
[2024-06-10 15:55:12.750906] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:07.377 [2024-06-10 15:55:12.750961] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:07.377 [2024-06-10 15:55:12.751017] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:07.377 [2024-06-10 15:55:12.751032] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x26208f0 name raid_bdev1, state offline 00:16:07.377 15:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 2700088 00:16:07.377 [2024-06-10 15:55:12.776622] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:07.635 15:55:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:16:07.635 00:16:07.635 real 0m21.906s 00:16:07.635 user 0m40.883s 00:16:07.635 sys 0m3.114s 00:16:07.635 15:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:16:07.635 15:55:12 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:16:07.635 ************************************ 00:16:07.635 END TEST raid_superblock_test 00:16:07.635 ************************************ 00:16:07.635 15:55:13 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 3 read 00:16:07.635 15:55:13 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:16:07.635 15:55:13 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:16:07.635 15:55:13 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:07.635 ************************************ 00:16:07.635 START TEST raid_read_error_test 00:16:07.635 ************************************ 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 3 read 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 
00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:07.635 15:55:13 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:07.635 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:16:07.636 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:16:07.636 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:07.636 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.RIC0Hnva1A 00:16:07.636 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2704036 00:16:07.636 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2704036 /var/tmp/spdk-raid.sock 00:16:07.636 15:55:13 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:07.636 15:55:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 2704036 ']' 00:16:07.636 15:55:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:07.636 15:55:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:16:07.636 15:55:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:07.636 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:07.636 15:55:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:16:07.636 15:55:13 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:07.636 [2024-06-10 15:55:13.116205] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:16:07.636 [2024-06-10 15:55:13.116258] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2704036 ] 00:16:07.895 [2024-06-10 15:55:13.213685] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:07.895 [2024-06-10 15:55:13.308315] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:16:07.895 [2024-06-10 15:55:13.377029] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:07.895 [2024-06-10 15:55:13.377068] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:08.833 15:55:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:16:08.833 15:55:14 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:16:08.833 15:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:08.833 15:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:08.833 BaseBdev1_malloc 00:16:08.833 15:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:09.092 true 00:16:09.092 15:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:09.351 [2024-06-10 15:55:14.812624] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:09.351 [2024-06-10 15:55:14.812667] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:09.351 
[2024-06-10 15:55:14.812684] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1748150 00:16:09.351 [2024-06-10 15:55:14.812694] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:09.351 [2024-06-10 15:55:14.814520] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:09.351 [2024-06-10 15:55:14.814548] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:09.351 BaseBdev1 00:16:09.351 15:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:09.351 15:55:14 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:09.610 BaseBdev2_malloc 00:16:09.610 15:55:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:09.869 true 00:16:09.869 15:55:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:10.128 [2024-06-10 15:55:15.579230] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:10.128 [2024-06-10 15:55:15.579271] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:10.128 [2024-06-10 15:55:15.579293] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x174cb50 00:16:10.128 [2024-06-10 15:55:15.579303] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:10.128 [2024-06-10 15:55:15.580888] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:10.128 [2024-06-10 15:55:15.580914] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:10.128 BaseBdev2 00:16:10.128 15:55:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:10.128 15:55:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:10.387 BaseBdev3_malloc 00:16:10.387 15:55:15 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:10.646 true 00:16:10.646 15:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:10.905 [2024-06-10 15:55:16.317773] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:10.905 [2024-06-10 15:55:16.317814] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:10.905 [2024-06-10 15:55:16.317830] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x174d780 00:16:10.905 [2024-06-10 15:55:16.317839] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:10.905 [2024-06-10 15:55:16.319375] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:10.905 [2024-06-10 15:55:16.319403] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:10.905 BaseBdev3 00:16:10.905 15:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:16:11.164 [2024-06-10 15:55:16.566458] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:11.164 [2024-06-10 15:55:16.567824] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:11.164 [2024-06-10 15:55:16.567895] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:11.164 [2024-06-10 15:55:16.568119] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x17505a0 00:16:11.164 [2024-06-10 15:55:16.568131] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:11.164 [2024-06-10 15:55:16.568331] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1749b20 00:16:11.164 [2024-06-10 15:55:16.568492] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x17505a0 00:16:11.164 [2024-06-10 15:55:16.568501] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x17505a0 00:16:11.164 [2024-06-10 15:55:16.568604] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:11.164 15:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:16:11.164 15:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:11.164 15:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:11.164 15:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:11.164 15:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:11.164 15:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:11.164 15:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:11.164 15:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:11.164 
15:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:11.164 15:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:11.164 15:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:11.164 15:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:11.423 15:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:11.423 "name": "raid_bdev1", 00:16:11.423 "uuid": "1ef9564c-b355-40d7-b55f-7ee83f58d8ac", 00:16:11.423 "strip_size_kb": 0, 00:16:11.423 "state": "online", 00:16:11.423 "raid_level": "raid1", 00:16:11.423 "superblock": true, 00:16:11.423 "num_base_bdevs": 3, 00:16:11.423 "num_base_bdevs_discovered": 3, 00:16:11.423 "num_base_bdevs_operational": 3, 00:16:11.423 "base_bdevs_list": [ 00:16:11.423 { 00:16:11.423 "name": "BaseBdev1", 00:16:11.423 "uuid": "4a84f1e9-3614-50dd-b3b4-1411010ea91c", 00:16:11.423 "is_configured": true, 00:16:11.423 "data_offset": 2048, 00:16:11.423 "data_size": 63488 00:16:11.423 }, 00:16:11.423 { 00:16:11.423 "name": "BaseBdev2", 00:16:11.423 "uuid": "24b9578f-2f2f-53cf-93c3-afe34e20ffd3", 00:16:11.423 "is_configured": true, 00:16:11.423 "data_offset": 2048, 00:16:11.423 "data_size": 63488 00:16:11.423 }, 00:16:11.423 { 00:16:11.423 "name": "BaseBdev3", 00:16:11.423 "uuid": "8b16a2ee-95db-5149-990f-84a6a496d6ab", 00:16:11.423 "is_configured": true, 00:16:11.423 "data_offset": 2048, 00:16:11.423 "data_size": 63488 00:16:11.423 } 00:16:11.423 ] 00:16:11.423 }' 00:16:11.423 15:55:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:11.423 15:55:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:11.991 15:55:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- 
# sleep 1 00:16:11.991 15:55:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:12.250 [2024-06-10 15:55:17.589443] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x17494c0 00:16:13.187 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=3 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:13.446 "name": "raid_bdev1", 00:16:13.446 "uuid": "1ef9564c-b355-40d7-b55f-7ee83f58d8ac", 00:16:13.446 "strip_size_kb": 0, 00:16:13.446 "state": "online", 00:16:13.446 "raid_level": "raid1", 00:16:13.446 "superblock": true, 00:16:13.446 "num_base_bdevs": 3, 00:16:13.446 "num_base_bdevs_discovered": 3, 00:16:13.446 "num_base_bdevs_operational": 3, 00:16:13.446 "base_bdevs_list": [ 00:16:13.446 { 00:16:13.446 "name": "BaseBdev1", 00:16:13.446 "uuid": "4a84f1e9-3614-50dd-b3b4-1411010ea91c", 00:16:13.446 "is_configured": true, 00:16:13.446 "data_offset": 2048, 00:16:13.446 "data_size": 63488 00:16:13.446 }, 00:16:13.446 { 00:16:13.446 "name": "BaseBdev2", 00:16:13.446 "uuid": "24b9578f-2f2f-53cf-93c3-afe34e20ffd3", 00:16:13.446 "is_configured": true, 00:16:13.446 "data_offset": 2048, 00:16:13.446 "data_size": 63488 00:16:13.446 }, 00:16:13.446 { 00:16:13.446 "name": "BaseBdev3", 00:16:13.446 "uuid": "8b16a2ee-95db-5149-990f-84a6a496d6ab", 00:16:13.446 "is_configured": true, 00:16:13.446 "data_offset": 2048, 00:16:13.446 "data_size": 63488 00:16:13.446 } 00:16:13.446 ] 00:16:13.446 }' 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:13.446 15:55:18 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:14.017 15:55:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:14.276 [2024-06-10 15:55:19.727557] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:14.276 [2024-06-10 15:55:19.727598] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:14.277 [2024-06-10 15:55:19.731015] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:14.277 [2024-06-10 15:55:19.731048] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:14.277 [2024-06-10 15:55:19.731151] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:14.277 [2024-06-10 15:55:19.731160] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x17505a0 name raid_bdev1, state offline 00:16:14.277 0 00:16:14.277 15:55:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2704036 00:16:14.277 15:55:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 2704036 ']' 00:16:14.277 15:55:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 2704036 00:16:14.277 15:55:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:16:14.277 15:55:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:16:14.277 15:55:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2704036 00:16:14.536 15:55:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:16:14.536 15:55:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:16:14.536 15:55:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2704036' 00:16:14.536 killing process with pid 2704036 00:16:14.536 15:55:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # 
kill 2704036 00:16:14.536 [2024-06-10 15:55:19.790151] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:14.536 15:55:19 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 2704036 00:16:14.536 [2024-06-10 15:55:19.810063] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:14.536 15:55:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:14.536 15:55:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.RIC0Hnva1A 00:16:14.536 15:55:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:14.536 15:55:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:16:14.536 15:55:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:16:14.536 15:55:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:14.536 15:55:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:14.536 15:55:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:16:14.536 00:16:14.536 real 0m6.978s 00:16:14.536 user 0m11.324s 00:16:14.536 sys 0m0.991s 00:16:14.536 15:55:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:16:14.536 15:55:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:14.536 ************************************ 00:16:14.536 END TEST raid_read_error_test 00:16:14.536 ************************************ 00:16:14.795 15:55:20 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 3 write 00:16:14.795 15:55:20 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:16:14.795 15:55:20 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:16:14.795 15:55:20 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:14.795 
************************************ 00:16:14.795 START TEST raid_write_error_test 00:16:14.795 ************************************ 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 3 write 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=3 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3') 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.r3V3sJS6XN 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2705314 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2705314 /var/tmp/spdk-raid.sock 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 2705314 ']' 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 
00:16:14.795 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:16:14.795 15:55:20 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:14.795 [2024-06-10 15:55:20.157170] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:16:14.795 [2024-06-10 15:55:20.157225] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2705314 ] 00:16:14.795 [2024-06-10 15:55:20.255267] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:15.054 [2024-06-10 15:55:20.349508] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:16:15.054 [2024-06-10 15:55:20.409642] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:15.054 [2024-06-10 15:55:20.409678] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:15.623 15:55:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:16:15.623 15:55:21 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:16:15.623 15:55:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:15.623 15:55:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:16:15.882 BaseBdev1_malloc 00:16:15.882 15:55:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:16:16.140 true 00:16:16.140 15:55:21 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:16:16.399 [2024-06-10 15:55:21.847898] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:16:16.399 [2024-06-10 15:55:21.847938] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:16.399 [2024-06-10 15:55:21.847964] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x144e150 00:16:16.399 [2024-06-10 15:55:21.847974] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:16.399 [2024-06-10 15:55:21.849752] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:16.399 [2024-06-10 15:55:21.849780] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:16:16.399 BaseBdev1 00:16:16.399 15:55:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:16.399 15:55:21 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:16:16.658 BaseBdev2_malloc 00:16:16.658 15:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:16:16.915 true 00:16:16.915 15:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:16:17.174 [2024-06-10 15:55:22.602433] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:16:17.174 [2024-06-10 15:55:22.602472] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev 
opened 00:16:17.174 [2024-06-10 15:55:22.602491] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1452b50 00:16:17.174 [2024-06-10 15:55:22.602502] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:17.174 [2024-06-10 15:55:22.604090] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:17.174 [2024-06-10 15:55:22.604117] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:16:17.174 BaseBdev2 00:16:17.174 15:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:16:17.174 15:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:16:17.434 BaseBdev3_malloc 00:16:17.434 15:55:22 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:16:17.693 true 00:16:17.693 15:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:16:17.953 [2024-06-10 15:55:23.357027] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:16:17.953 [2024-06-10 15:55:23.357067] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:16:17.953 [2024-06-10 15:55:23.357087] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1453780 00:16:17.953 [2024-06-10 15:55:23.357096] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:16:17.953 [2024-06-10 15:55:23.358632] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:16:17.953 [2024-06-10 15:55:23.358658] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:16:17.953 BaseBdev3 00:16:17.953 15:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3' -n raid_bdev1 -s 00:16:18.212 [2024-06-10 15:55:23.597679] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:18.212 [2024-06-10 15:55:23.598987] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:18.212 [2024-06-10 15:55:23.599058] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:18.212 [2024-06-10 15:55:23.599270] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x14565a0 00:16:18.212 [2024-06-10 15:55:23.599280] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:16:18.212 [2024-06-10 15:55:23.599473] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x144fb20 00:16:18.212 [2024-06-10 15:55:23.599631] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x14565a0 00:16:18.212 [2024-06-10 15:55:23.599640] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x14565a0 00:16:18.212 [2024-06-10 15:55:23.599743] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:18.212 15:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:16:18.212 15:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:18.212 15:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:18.212 15:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:18.212 15:55:23 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:18.212 15:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:18.212 15:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:18.212 15:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:18.212 15:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:18.212 15:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:18.212 15:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:18.212 15:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:18.471 15:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:18.471 "name": "raid_bdev1", 00:16:18.471 "uuid": "e492a91e-b7f0-4cc5-ab6b-93aaa8dba8b1", 00:16:18.471 "strip_size_kb": 0, 00:16:18.471 "state": "online", 00:16:18.471 "raid_level": "raid1", 00:16:18.471 "superblock": true, 00:16:18.471 "num_base_bdevs": 3, 00:16:18.471 "num_base_bdevs_discovered": 3, 00:16:18.471 "num_base_bdevs_operational": 3, 00:16:18.471 "base_bdevs_list": [ 00:16:18.471 { 00:16:18.471 "name": "BaseBdev1", 00:16:18.471 "uuid": "2c0f6d82-fa8e-5712-9433-35df0dcbdd60", 00:16:18.471 "is_configured": true, 00:16:18.471 "data_offset": 2048, 00:16:18.471 "data_size": 63488 00:16:18.471 }, 00:16:18.471 { 00:16:18.471 "name": "BaseBdev2", 00:16:18.471 "uuid": "bb78cc22-713d-5f82-8027-5cc1cc30ad26", 00:16:18.471 "is_configured": true, 00:16:18.471 "data_offset": 2048, 00:16:18.471 "data_size": 63488 00:16:18.471 }, 00:16:18.471 { 00:16:18.471 "name": "BaseBdev3", 00:16:18.471 "uuid": 
"6f18d4ef-5ef6-590c-b8df-23daadd65de5", 00:16:18.471 "is_configured": true, 00:16:18.471 "data_offset": 2048, 00:16:18.471 "data_size": 63488 00:16:18.471 } 00:16:18.471 ] 00:16:18.471 }' 00:16:18.471 15:55:23 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:18.471 15:55:23 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:19.059 15:55:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:16:19.059 15:55:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:16:19.059 [2024-06-10 15:55:24.544466] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x144f4c0 00:16:20.002 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:16:20.260 [2024-06-10 15:55:25.705349] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:16:20.260 [2024-06-10 15:55:25.705396] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:20.260 [2024-06-10 15:55:25.705593] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x144f4c0 00:16:20.260 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:16:20.260 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:16:20.260 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:16:20.260 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=2 00:16:20.261 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 2 00:16:20.261 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:16:20.261 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:20.261 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:16:20.261 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:16:20.261 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:16:20.261 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:20.261 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:20.261 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:20.261 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:20.261 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:20.261 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:16:20.519 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:20.519 "name": "raid_bdev1", 00:16:20.519 "uuid": "e492a91e-b7f0-4cc5-ab6b-93aaa8dba8b1", 00:16:20.519 "strip_size_kb": 0, 00:16:20.519 "state": "online", 00:16:20.519 "raid_level": "raid1", 00:16:20.519 "superblock": true, 00:16:20.519 "num_base_bdevs": 3, 00:16:20.519 "num_base_bdevs_discovered": 2, 00:16:20.519 "num_base_bdevs_operational": 2, 00:16:20.519 "base_bdevs_list": [ 00:16:20.519 { 00:16:20.519 "name": null, 00:16:20.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:20.519 "is_configured": false, 00:16:20.519 "data_offset": 2048, 00:16:20.519 
"data_size": 63488 00:16:20.519 }, 00:16:20.519 { 00:16:20.519 "name": "BaseBdev2", 00:16:20.519 "uuid": "bb78cc22-713d-5f82-8027-5cc1cc30ad26", 00:16:20.519 "is_configured": true, 00:16:20.519 "data_offset": 2048, 00:16:20.519 "data_size": 63488 00:16:20.519 }, 00:16:20.519 { 00:16:20.519 "name": "BaseBdev3", 00:16:20.519 "uuid": "6f18d4ef-5ef6-590c-b8df-23daadd65de5", 00:16:20.519 "is_configured": true, 00:16:20.519 "data_offset": 2048, 00:16:20.519 "data_size": 63488 00:16:20.519 } 00:16:20.519 ] 00:16:20.519 }' 00:16:20.519 15:55:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:20.519 15:55:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:21.086 15:55:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:16:21.345 [2024-06-10 15:55:26.683368] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:16:21.345 [2024-06-10 15:55:26.683406] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:21.345 [2024-06-10 15:55:26.686764] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:21.345 [2024-06-10 15:55:26.686795] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:21.345 [2024-06-10 15:55:26.686872] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:21.345 [2024-06-10 15:55:26.686881] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x14565a0 name raid_bdev1, state offline 00:16:21.345 0 00:16:21.345 15:55:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2705314 00:16:21.345 15:55:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 2705314 ']' 00:16:21.345 15:55:26 bdev_raid.raid_write_error_test -- 
common/autotest_common.sh@953 -- # kill -0 2705314 00:16:21.345 15:55:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:16:21.345 15:55:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:16:21.345 15:55:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2705314 00:16:21.345 15:55:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:16:21.345 15:55:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:16:21.345 15:55:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2705314' 00:16:21.345 killing process with pid 2705314 00:16:21.345 15:55:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 2705314 00:16:21.345 [2024-06-10 15:55:26.745393] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:21.345 15:55:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 2705314 00:16:21.345 [2024-06-10 15:55:26.764374] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:21.604 15:55:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.r3V3sJS6XN 00:16:21.604 15:55:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:16:21.604 15:55:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:16:21.604 15:55:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:16:21.604 15:55:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:16:21.604 15:55:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:21.604 15:55:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:16:21.604 15:55:26 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:16:21.604 00:16:21.604 real 0m6.893s 00:16:21.604 user 0m11.189s 00:16:21.604 sys 0m0.932s 00:16:21.604 15:55:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:16:21.604 15:55:26 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:16:21.604 ************************************ 00:16:21.604 END TEST raid_write_error_test 00:16:21.604 ************************************ 00:16:21.604 15:55:27 bdev_raid -- bdev/bdev_raid.sh@865 -- # for n in {2..4} 00:16:21.604 15:55:27 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:16:21.604 15:55:27 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid0 4 false 00:16:21.604 15:55:27 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:16:21.604 15:55:27 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:16:21.604 15:55:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:21.604 ************************************ 00:16:21.604 START TEST raid_state_function_test 00:16:21.604 ************************************ 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 4 false 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:21.604 15:55:27 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2706553 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2706553' 00:16:21.604 Process raid pid: 2706553 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2706553 /var/tmp/spdk-raid.sock 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 2706553 ']' 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:21.604 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:16:21.604 15:55:27 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:21.604 [2024-06-10 15:55:27.107386] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:16:21.604 [2024-06-10 15:55:27.107443] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:21.864 [2024-06-10 15:55:27.205694] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:21.864 [2024-06-10 15:55:27.299841] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:16:21.864 [2024-06-10 15:55:27.353927] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:21.864 [2024-06-10 15:55:27.353952] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:22.798 15:55:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:16:22.798 15:55:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:16:22.798 15:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:22.798 [2024-06-10 15:55:28.284366] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:22.798 [2024-06-10 15:55:28.284405] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:22.798 [2024-06-10 15:55:28.284414] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:22.798 [2024-06-10 15:55:28.284422] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev2 doesn't exist now 00:16:22.798 [2024-06-10 15:55:28.284429] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:22.798 [2024-06-10 15:55:28.284438] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:22.798 [2024-06-10 15:55:28.284445] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:22.798 [2024-06-10 15:55:28.284452] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:22.798 15:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:22.798 15:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:22.798 15:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:22.798 15:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:22.798 15:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:22.798 15:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:22.798 15:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:22.798 15:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:22.798 15:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:22.798 15:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:23.056 15:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:23.056 15:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:23.056 15:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:23.056 "name": "Existed_Raid", 00:16:23.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:23.056 "strip_size_kb": 64, 00:16:23.056 "state": "configuring", 00:16:23.056 "raid_level": "raid0", 00:16:23.056 "superblock": false, 00:16:23.056 "num_base_bdevs": 4, 00:16:23.056 "num_base_bdevs_discovered": 0, 00:16:23.056 "num_base_bdevs_operational": 4, 00:16:23.056 "base_bdevs_list": [ 00:16:23.056 { 00:16:23.056 "name": "BaseBdev1", 00:16:23.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:23.056 "is_configured": false, 00:16:23.056 "data_offset": 0, 00:16:23.056 "data_size": 0 00:16:23.056 }, 00:16:23.056 { 00:16:23.056 "name": "BaseBdev2", 00:16:23.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:23.056 "is_configured": false, 00:16:23.056 "data_offset": 0, 00:16:23.056 "data_size": 0 00:16:23.056 }, 00:16:23.056 { 00:16:23.056 "name": "BaseBdev3", 00:16:23.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:23.056 "is_configured": false, 00:16:23.056 "data_offset": 0, 00:16:23.056 "data_size": 0 00:16:23.056 }, 00:16:23.056 { 00:16:23.056 "name": "BaseBdev4", 00:16:23.056 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:23.056 "is_configured": false, 00:16:23.056 "data_offset": 0, 00:16:23.056 "data_size": 0 00:16:23.056 } 00:16:23.056 ] 00:16:23.056 }' 00:16:23.056 15:55:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:23.056 15:55:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:23.990 15:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:23.990 [2024-06-10 15:55:29.427449] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 
00:16:23.990 [2024-06-10 15:55:29.427479] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12ae140 name Existed_Raid, state configuring 00:16:23.990 15:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:24.248 [2024-06-10 15:55:29.599928] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:24.248 [2024-06-10 15:55:29.599952] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:24.248 [2024-06-10 15:55:29.599967] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:24.248 [2024-06-10 15:55:29.599976] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:24.248 [2024-06-10 15:55:29.599983] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:24.248 [2024-06-10 15:55:29.599991] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:24.248 [2024-06-10 15:55:29.599998] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:24.248 [2024-06-10 15:55:29.600006] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:24.248 15:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:24.507 [2024-06-10 15:55:29.781934] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:24.507 BaseBdev1 00:16:24.507 15:55:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:24.507 15:55:29 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:16:24.507 15:55:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:24.507 15:55:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:24.507 15:55:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:24.507 15:55:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:24.507 15:55:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:24.765 15:55:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:24.765 [ 00:16:24.765 { 00:16:24.765 "name": "BaseBdev1", 00:16:24.765 "aliases": [ 00:16:24.765 "d01e45a6-7dca-47a9-9d7c-f472f36bbd5e" 00:16:24.765 ], 00:16:24.765 "product_name": "Malloc disk", 00:16:24.765 "block_size": 512, 00:16:24.765 "num_blocks": 65536, 00:16:24.765 "uuid": "d01e45a6-7dca-47a9-9d7c-f472f36bbd5e", 00:16:24.765 "assigned_rate_limits": { 00:16:24.765 "rw_ios_per_sec": 0, 00:16:24.765 "rw_mbytes_per_sec": 0, 00:16:24.765 "r_mbytes_per_sec": 0, 00:16:24.765 "w_mbytes_per_sec": 0 00:16:24.765 }, 00:16:24.765 "claimed": true, 00:16:24.765 "claim_type": "exclusive_write", 00:16:24.765 "zoned": false, 00:16:24.765 "supported_io_types": { 00:16:24.765 "read": true, 00:16:24.765 "write": true, 00:16:24.765 "unmap": true, 00:16:24.765 "write_zeroes": true, 00:16:24.765 "flush": true, 00:16:24.765 "reset": true, 00:16:24.765 "compare": false, 00:16:24.765 "compare_and_write": false, 00:16:24.765 "abort": true, 00:16:24.765 "nvme_admin": false, 00:16:24.765 "nvme_io": false 00:16:24.765 }, 00:16:24.765 
"memory_domains": [ 00:16:24.765 { 00:16:24.765 "dma_device_id": "system", 00:16:24.765 "dma_device_type": 1 00:16:24.765 }, 00:16:24.765 { 00:16:24.765 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:24.765 "dma_device_type": 2 00:16:24.765 } 00:16:24.765 ], 00:16:24.765 "driver_specific": {} 00:16:24.765 } 00:16:24.765 ] 00:16:24.765 15:55:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:24.765 15:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:24.765 15:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:24.765 15:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:24.765 15:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:24.765 15:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:24.765 15:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:24.765 15:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:24.765 15:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:24.765 15:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:24.765 15:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:24.765 15:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:24.765 15:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:25.023 15:55:30 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:25.023 "name": "Existed_Raid", 00:16:25.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:25.023 "strip_size_kb": 64, 00:16:25.023 "state": "configuring", 00:16:25.023 "raid_level": "raid0", 00:16:25.023 "superblock": false, 00:16:25.023 "num_base_bdevs": 4, 00:16:25.023 "num_base_bdevs_discovered": 1, 00:16:25.023 "num_base_bdevs_operational": 4, 00:16:25.023 "base_bdevs_list": [ 00:16:25.023 { 00:16:25.023 "name": "BaseBdev1", 00:16:25.023 "uuid": "d01e45a6-7dca-47a9-9d7c-f472f36bbd5e", 00:16:25.023 "is_configured": true, 00:16:25.023 "data_offset": 0, 00:16:25.023 "data_size": 65536 00:16:25.023 }, 00:16:25.023 { 00:16:25.023 "name": "BaseBdev2", 00:16:25.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:25.023 "is_configured": false, 00:16:25.023 "data_offset": 0, 00:16:25.023 "data_size": 0 00:16:25.023 }, 00:16:25.023 { 00:16:25.023 "name": "BaseBdev3", 00:16:25.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:25.023 "is_configured": false, 00:16:25.023 "data_offset": 0, 00:16:25.023 "data_size": 0 00:16:25.023 }, 00:16:25.023 { 00:16:25.023 "name": "BaseBdev4", 00:16:25.023 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:25.023 "is_configured": false, 00:16:25.023 "data_offset": 0, 00:16:25.023 "data_size": 0 00:16:25.023 } 00:16:25.023 ] 00:16:25.023 }' 00:16:25.023 15:55:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:25.023 15:55:30 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:25.957 15:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:25.957 [2024-06-10 15:55:31.254071] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:25.957 [2024-06-10 15:55:31.254108] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 
0x12ad9b0 name Existed_Raid, state configuring 00:16:25.957 15:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:26.215 [2024-06-10 15:55:31.510785] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:26.215 [2024-06-10 15:55:31.512297] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:26.215 [2024-06-10 15:55:31.512327] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:26.215 [2024-06-10 15:55:31.512340] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:26.215 [2024-06-10 15:55:31.512349] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:26.215 [2024-06-10 15:55:31.512356] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:26.215 [2024-06-10 15:55:31.512365] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:26.215 15:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:26.215 15:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:26.215 15:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:26.215 15:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:26.215 15:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:26.215 15:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:26.215 15:55:31 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:26.215 15:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:26.215 15:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:26.215 15:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:26.215 15:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:26.215 15:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:26.215 15:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:26.215 15:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:26.215 15:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:26.215 "name": "Existed_Raid", 00:16:26.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:26.215 "strip_size_kb": 64, 00:16:26.215 "state": "configuring", 00:16:26.215 "raid_level": "raid0", 00:16:26.215 "superblock": false, 00:16:26.215 "num_base_bdevs": 4, 00:16:26.215 "num_base_bdevs_discovered": 1, 00:16:26.215 "num_base_bdevs_operational": 4, 00:16:26.215 "base_bdevs_list": [ 00:16:26.215 { 00:16:26.215 "name": "BaseBdev1", 00:16:26.215 "uuid": "d01e45a6-7dca-47a9-9d7c-f472f36bbd5e", 00:16:26.215 "is_configured": true, 00:16:26.215 "data_offset": 0, 00:16:26.215 "data_size": 65536 00:16:26.215 }, 00:16:26.215 { 00:16:26.215 "name": "BaseBdev2", 00:16:26.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:26.215 "is_configured": false, 00:16:26.216 "data_offset": 0, 00:16:26.216 "data_size": 0 00:16:26.216 }, 00:16:26.216 { 00:16:26.216 "name": "BaseBdev3", 00:16:26.216 "uuid": "00000000-0000-0000-0000-000000000000", 
00:16:26.216 "is_configured": false, 00:16:26.216 "data_offset": 0, 00:16:26.216 "data_size": 0 00:16:26.216 }, 00:16:26.216 { 00:16:26.216 "name": "BaseBdev4", 00:16:26.216 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:26.216 "is_configured": false, 00:16:26.216 "data_offset": 0, 00:16:26.216 "data_size": 0 00:16:26.216 } 00:16:26.216 ] 00:16:26.216 }' 00:16:26.216 15:55:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:26.216 15:55:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:27.150 15:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:27.150 [2024-06-10 15:55:32.480641] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:27.150 BaseBdev2 00:16:27.150 15:55:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:16:27.150 15:55:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:16:27.150 15:55:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:27.150 15:55:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:27.150 15:55:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:27.150 15:55:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:27.150 15:55:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:27.408 15:55:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:27.666 [ 00:16:27.666 { 00:16:27.666 "name": "BaseBdev2", 00:16:27.666 "aliases": [ 00:16:27.666 "39a92035-be8a-4dc3-9040-47fb01da5164" 00:16:27.666 ], 00:16:27.666 "product_name": "Malloc disk", 00:16:27.666 "block_size": 512, 00:16:27.666 "num_blocks": 65536, 00:16:27.666 "uuid": "39a92035-be8a-4dc3-9040-47fb01da5164", 00:16:27.666 "assigned_rate_limits": { 00:16:27.666 "rw_ios_per_sec": 0, 00:16:27.666 "rw_mbytes_per_sec": 0, 00:16:27.666 "r_mbytes_per_sec": 0, 00:16:27.666 "w_mbytes_per_sec": 0 00:16:27.666 }, 00:16:27.666 "claimed": true, 00:16:27.666 "claim_type": "exclusive_write", 00:16:27.666 "zoned": false, 00:16:27.666 "supported_io_types": { 00:16:27.666 "read": true, 00:16:27.666 "write": true, 00:16:27.666 "unmap": true, 00:16:27.666 "write_zeroes": true, 00:16:27.666 "flush": true, 00:16:27.666 "reset": true, 00:16:27.666 "compare": false, 00:16:27.666 "compare_and_write": false, 00:16:27.666 "abort": true, 00:16:27.666 "nvme_admin": false, 00:16:27.666 "nvme_io": false 00:16:27.666 }, 00:16:27.666 "memory_domains": [ 00:16:27.666 { 00:16:27.666 "dma_device_id": "system", 00:16:27.667 "dma_device_type": 1 00:16:27.667 }, 00:16:27.667 { 00:16:27.667 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:27.667 "dma_device_type": 2 00:16:27.667 } 00:16:27.667 ], 00:16:27.667 "driver_specific": {} 00:16:27.667 } 00:16:27.667 ] 00:16:27.667 15:55:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:27.667 15:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:27.667 15:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:27.667 15:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:27.667 15:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:27.667 15:55:33 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:27.667 15:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:27.667 15:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:27.667 15:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:27.667 15:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:27.667 15:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:27.667 15:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:27.667 15:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:27.667 15:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:27.667 15:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:27.925 15:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:27.925 "name": "Existed_Raid", 00:16:27.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:27.925 "strip_size_kb": 64, 00:16:27.925 "state": "configuring", 00:16:27.925 "raid_level": "raid0", 00:16:27.925 "superblock": false, 00:16:27.925 "num_base_bdevs": 4, 00:16:27.925 "num_base_bdevs_discovered": 2, 00:16:27.925 "num_base_bdevs_operational": 4, 00:16:27.925 "base_bdevs_list": [ 00:16:27.925 { 00:16:27.925 "name": "BaseBdev1", 00:16:27.925 "uuid": "d01e45a6-7dca-47a9-9d7c-f472f36bbd5e", 00:16:27.925 "is_configured": true, 00:16:27.925 "data_offset": 0, 00:16:27.925 "data_size": 65536 00:16:27.925 }, 00:16:27.925 { 00:16:27.925 "name": "BaseBdev2", 00:16:27.925 
"uuid": "39a92035-be8a-4dc3-9040-47fb01da5164", 00:16:27.925 "is_configured": true, 00:16:27.925 "data_offset": 0, 00:16:27.925 "data_size": 65536 00:16:27.925 }, 00:16:27.925 { 00:16:27.925 "name": "BaseBdev3", 00:16:27.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:27.925 "is_configured": false, 00:16:27.925 "data_offset": 0, 00:16:27.925 "data_size": 0 00:16:27.925 }, 00:16:27.925 { 00:16:27.925 "name": "BaseBdev4", 00:16:27.925 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:27.925 "is_configured": false, 00:16:27.925 "data_offset": 0, 00:16:27.925 "data_size": 0 00:16:27.925 } 00:16:27.925 ] 00:16:27.925 }' 00:16:27.925 15:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:27.925 15:55:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:28.492 15:55:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:28.492 [2024-06-10 15:55:33.991965] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:28.492 BaseBdev3 00:16:28.750 15:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:16:28.750 15:55:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:16:28.750 15:55:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:28.750 15:55:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:28.750 15:55:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:28.750 15:55:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:28.750 15:55:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:29.009 15:55:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:29.009 [ 00:16:29.009 { 00:16:29.009 "name": "BaseBdev3", 00:16:29.009 "aliases": [ 00:16:29.009 "6b7cef49-2fa8-4641-8560-14c78da52053" 00:16:29.009 ], 00:16:29.009 "product_name": "Malloc disk", 00:16:29.009 "block_size": 512, 00:16:29.009 "num_blocks": 65536, 00:16:29.009 "uuid": "6b7cef49-2fa8-4641-8560-14c78da52053", 00:16:29.009 "assigned_rate_limits": { 00:16:29.009 "rw_ios_per_sec": 0, 00:16:29.009 "rw_mbytes_per_sec": 0, 00:16:29.009 "r_mbytes_per_sec": 0, 00:16:29.009 "w_mbytes_per_sec": 0 00:16:29.009 }, 00:16:29.009 "claimed": true, 00:16:29.009 "claim_type": "exclusive_write", 00:16:29.009 "zoned": false, 00:16:29.009 "supported_io_types": { 00:16:29.009 "read": true, 00:16:29.009 "write": true, 00:16:29.009 "unmap": true, 00:16:29.009 "write_zeroes": true, 00:16:29.009 "flush": true, 00:16:29.009 "reset": true, 00:16:29.009 "compare": false, 00:16:29.009 "compare_and_write": false, 00:16:29.009 "abort": true, 00:16:29.009 "nvme_admin": false, 00:16:29.009 "nvme_io": false 00:16:29.009 }, 00:16:29.009 "memory_domains": [ 00:16:29.009 { 00:16:29.009 "dma_device_id": "system", 00:16:29.009 "dma_device_type": 1 00:16:29.009 }, 00:16:29.009 { 00:16:29.009 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:29.009 "dma_device_type": 2 00:16:29.009 } 00:16:29.009 ], 00:16:29.009 "driver_specific": {} 00:16:29.009 } 00:16:29.009 ] 00:16:29.009 15:55:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:29.009 15:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:29.009 15:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < 
num_base_bdevs )) 00:16:29.009 15:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:29.009 15:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:29.009 15:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:29.009 15:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:29.009 15:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:29.009 15:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:29.009 15:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:29.009 15:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:29.009 15:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:29.009 15:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:29.009 15:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:29.009 15:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:29.268 15:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:29.268 "name": "Existed_Raid", 00:16:29.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.268 "strip_size_kb": 64, 00:16:29.268 "state": "configuring", 00:16:29.268 "raid_level": "raid0", 00:16:29.268 "superblock": false, 00:16:29.268 "num_base_bdevs": 4, 00:16:29.268 "num_base_bdevs_discovered": 3, 00:16:29.268 "num_base_bdevs_operational": 4, 00:16:29.268 
"base_bdevs_list": [ 00:16:29.268 { 00:16:29.268 "name": "BaseBdev1", 00:16:29.268 "uuid": "d01e45a6-7dca-47a9-9d7c-f472f36bbd5e", 00:16:29.268 "is_configured": true, 00:16:29.268 "data_offset": 0, 00:16:29.268 "data_size": 65536 00:16:29.268 }, 00:16:29.268 { 00:16:29.268 "name": "BaseBdev2", 00:16:29.268 "uuid": "39a92035-be8a-4dc3-9040-47fb01da5164", 00:16:29.268 "is_configured": true, 00:16:29.268 "data_offset": 0, 00:16:29.268 "data_size": 65536 00:16:29.268 }, 00:16:29.268 { 00:16:29.268 "name": "BaseBdev3", 00:16:29.268 "uuid": "6b7cef49-2fa8-4641-8560-14c78da52053", 00:16:29.268 "is_configured": true, 00:16:29.268 "data_offset": 0, 00:16:29.268 "data_size": 65536 00:16:29.268 }, 00:16:29.268 { 00:16:29.268 "name": "BaseBdev4", 00:16:29.268 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:29.268 "is_configured": false, 00:16:29.268 "data_offset": 0, 00:16:29.268 "data_size": 0 00:16:29.268 } 00:16:29.268 ] 00:16:29.268 }' 00:16:29.268 15:55:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:29.268 15:55:34 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:29.831 15:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:30.089 [2024-06-10 15:55:35.435055] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:30.089 [2024-06-10 15:55:35.435085] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12aeac0 00:16:30.089 [2024-06-10 15:55:35.435091] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:16:30.089 [2024-06-10 15:55:35.435286] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1453ba0 00:16:30.089 [2024-06-10 15:55:35.435413] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12aeac0 00:16:30.089 
[2024-06-10 15:55:35.435422] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x12aeac0 00:16:30.089 [2024-06-10 15:55:35.435585] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:30.089 BaseBdev4 00:16:30.089 15:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:16:30.089 15:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:16:30.089 15:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:30.089 15:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:30.089 15:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:30.089 15:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:30.089 15:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:30.347 15:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:30.605 [ 00:16:30.605 { 00:16:30.605 "name": "BaseBdev4", 00:16:30.605 "aliases": [ 00:16:30.605 "e08a10b0-5c9c-440f-a793-b8d20f6e0f30" 00:16:30.605 ], 00:16:30.605 "product_name": "Malloc disk", 00:16:30.605 "block_size": 512, 00:16:30.605 "num_blocks": 65536, 00:16:30.605 "uuid": "e08a10b0-5c9c-440f-a793-b8d20f6e0f30", 00:16:30.605 "assigned_rate_limits": { 00:16:30.605 "rw_ios_per_sec": 0, 00:16:30.605 "rw_mbytes_per_sec": 0, 00:16:30.605 "r_mbytes_per_sec": 0, 00:16:30.605 "w_mbytes_per_sec": 0 00:16:30.605 }, 00:16:30.605 "claimed": true, 00:16:30.605 "claim_type": "exclusive_write", 00:16:30.605 "zoned": 
false, 00:16:30.605 "supported_io_types": { 00:16:30.605 "read": true, 00:16:30.605 "write": true, 00:16:30.605 "unmap": true, 00:16:30.605 "write_zeroes": true, 00:16:30.605 "flush": true, 00:16:30.605 "reset": true, 00:16:30.605 "compare": false, 00:16:30.605 "compare_and_write": false, 00:16:30.606 "abort": true, 00:16:30.606 "nvme_admin": false, 00:16:30.606 "nvme_io": false 00:16:30.606 }, 00:16:30.606 "memory_domains": [ 00:16:30.606 { 00:16:30.606 "dma_device_id": "system", 00:16:30.606 "dma_device_type": 1 00:16:30.606 }, 00:16:30.606 { 00:16:30.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:30.606 "dma_device_type": 2 00:16:30.606 } 00:16:30.606 ], 00:16:30.606 "driver_specific": {} 00:16:30.606 } 00:16:30.606 ] 00:16:30.606 15:55:35 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:30.606 15:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:16:30.606 15:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:30.606 15:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:16:30.606 15:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:30.606 15:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:30.606 15:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:30.606 15:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:30.606 15:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:30.606 15:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:30.606 15:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:30.606 
15:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:30.606 15:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:30.606 15:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:30.606 15:55:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:30.864 15:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:30.864 "name": "Existed_Raid", 00:16:30.864 "uuid": "6d354d6a-1b99-4b47-a051-307577d7eaad", 00:16:30.864 "strip_size_kb": 64, 00:16:30.864 "state": "online", 00:16:30.864 "raid_level": "raid0", 00:16:30.864 "superblock": false, 00:16:30.864 "num_base_bdevs": 4, 00:16:30.864 "num_base_bdevs_discovered": 4, 00:16:30.864 "num_base_bdevs_operational": 4, 00:16:30.864 "base_bdevs_list": [ 00:16:30.864 { 00:16:30.864 "name": "BaseBdev1", 00:16:30.864 "uuid": "d01e45a6-7dca-47a9-9d7c-f472f36bbd5e", 00:16:30.864 "is_configured": true, 00:16:30.864 "data_offset": 0, 00:16:30.864 "data_size": 65536 00:16:30.864 }, 00:16:30.864 { 00:16:30.864 "name": "BaseBdev2", 00:16:30.864 "uuid": "39a92035-be8a-4dc3-9040-47fb01da5164", 00:16:30.864 "is_configured": true, 00:16:30.865 "data_offset": 0, 00:16:30.865 "data_size": 65536 00:16:30.865 }, 00:16:30.865 { 00:16:30.865 "name": "BaseBdev3", 00:16:30.865 "uuid": "6b7cef49-2fa8-4641-8560-14c78da52053", 00:16:30.865 "is_configured": true, 00:16:30.865 "data_offset": 0, 00:16:30.865 "data_size": 65536 00:16:30.865 }, 00:16:30.865 { 00:16:30.865 "name": "BaseBdev4", 00:16:30.865 "uuid": "e08a10b0-5c9c-440f-a793-b8d20f6e0f30", 00:16:30.865 "is_configured": true, 00:16:30.865 "data_offset": 0, 00:16:30.865 "data_size": 65536 00:16:30.865 } 00:16:30.865 ] 00:16:30.865 }' 00:16:30.865 15:55:36 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:30.865 15:55:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:31.430 15:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:16:31.430 15:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:31.430 15:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:31.430 15:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:31.430 15:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:31.430 15:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:31.430 15:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:31.430 15:55:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:31.688 [2024-06-10 15:55:37.071779] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:31.688 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:31.688 "name": "Existed_Raid", 00:16:31.688 "aliases": [ 00:16:31.688 "6d354d6a-1b99-4b47-a051-307577d7eaad" 00:16:31.688 ], 00:16:31.688 "product_name": "Raid Volume", 00:16:31.688 "block_size": 512, 00:16:31.688 "num_blocks": 262144, 00:16:31.688 "uuid": "6d354d6a-1b99-4b47-a051-307577d7eaad", 00:16:31.688 "assigned_rate_limits": { 00:16:31.688 "rw_ios_per_sec": 0, 00:16:31.688 "rw_mbytes_per_sec": 0, 00:16:31.688 "r_mbytes_per_sec": 0, 00:16:31.688 "w_mbytes_per_sec": 0 00:16:31.688 }, 00:16:31.688 "claimed": false, 00:16:31.688 "zoned": false, 00:16:31.688 "supported_io_types": { 00:16:31.688 
"read": true, 00:16:31.688 "write": true, 00:16:31.688 "unmap": true, 00:16:31.688 "write_zeroes": true, 00:16:31.688 "flush": true, 00:16:31.688 "reset": true, 00:16:31.688 "compare": false, 00:16:31.688 "compare_and_write": false, 00:16:31.688 "abort": false, 00:16:31.688 "nvme_admin": false, 00:16:31.688 "nvme_io": false 00:16:31.688 }, 00:16:31.688 "memory_domains": [ 00:16:31.688 { 00:16:31.688 "dma_device_id": "system", 00:16:31.688 "dma_device_type": 1 00:16:31.688 }, 00:16:31.688 { 00:16:31.688 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:31.688 "dma_device_type": 2 00:16:31.688 }, 00:16:31.688 { 00:16:31.688 "dma_device_id": "system", 00:16:31.688 "dma_device_type": 1 00:16:31.688 }, 00:16:31.688 { 00:16:31.688 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:31.688 "dma_device_type": 2 00:16:31.688 }, 00:16:31.688 { 00:16:31.688 "dma_device_id": "system", 00:16:31.688 "dma_device_type": 1 00:16:31.688 }, 00:16:31.688 { 00:16:31.688 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:31.688 "dma_device_type": 2 00:16:31.688 }, 00:16:31.688 { 00:16:31.688 "dma_device_id": "system", 00:16:31.688 "dma_device_type": 1 00:16:31.688 }, 00:16:31.688 { 00:16:31.688 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:31.688 "dma_device_type": 2 00:16:31.688 } 00:16:31.688 ], 00:16:31.688 "driver_specific": { 00:16:31.688 "raid": { 00:16:31.688 "uuid": "6d354d6a-1b99-4b47-a051-307577d7eaad", 00:16:31.688 "strip_size_kb": 64, 00:16:31.688 "state": "online", 00:16:31.688 "raid_level": "raid0", 00:16:31.688 "superblock": false, 00:16:31.688 "num_base_bdevs": 4, 00:16:31.688 "num_base_bdevs_discovered": 4, 00:16:31.688 "num_base_bdevs_operational": 4, 00:16:31.688 "base_bdevs_list": [ 00:16:31.688 { 00:16:31.688 "name": "BaseBdev1", 00:16:31.688 "uuid": "d01e45a6-7dca-47a9-9d7c-f472f36bbd5e", 00:16:31.688 "is_configured": true, 00:16:31.688 "data_offset": 0, 00:16:31.688 "data_size": 65536 00:16:31.688 }, 00:16:31.688 { 00:16:31.688 "name": "BaseBdev2", 00:16:31.688 "uuid": 
"39a92035-be8a-4dc3-9040-47fb01da5164", 00:16:31.688 "is_configured": true, 00:16:31.688 "data_offset": 0, 00:16:31.688 "data_size": 65536 00:16:31.688 }, 00:16:31.688 { 00:16:31.688 "name": "BaseBdev3", 00:16:31.688 "uuid": "6b7cef49-2fa8-4641-8560-14c78da52053", 00:16:31.688 "is_configured": true, 00:16:31.688 "data_offset": 0, 00:16:31.688 "data_size": 65536 00:16:31.688 }, 00:16:31.688 { 00:16:31.688 "name": "BaseBdev4", 00:16:31.688 "uuid": "e08a10b0-5c9c-440f-a793-b8d20f6e0f30", 00:16:31.688 "is_configured": true, 00:16:31.688 "data_offset": 0, 00:16:31.688 "data_size": 65536 00:16:31.688 } 00:16:31.688 ] 00:16:31.688 } 00:16:31.688 } 00:16:31.688 }' 00:16:31.688 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:31.688 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:16:31.688 BaseBdev2 00:16:31.688 BaseBdev3 00:16:31.688 BaseBdev4' 00:16:31.688 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:31.688 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:16:31.688 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:31.946 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:31.946 "name": "BaseBdev1", 00:16:31.946 "aliases": [ 00:16:31.946 "d01e45a6-7dca-47a9-9d7c-f472f36bbd5e" 00:16:31.946 ], 00:16:31.946 "product_name": "Malloc disk", 00:16:31.946 "block_size": 512, 00:16:31.946 "num_blocks": 65536, 00:16:31.946 "uuid": "d01e45a6-7dca-47a9-9d7c-f472f36bbd5e", 00:16:31.946 "assigned_rate_limits": { 00:16:31.946 "rw_ios_per_sec": 0, 00:16:31.946 "rw_mbytes_per_sec": 0, 00:16:31.946 "r_mbytes_per_sec": 0, 
00:16:31.946 "w_mbytes_per_sec": 0 00:16:31.946 }, 00:16:31.946 "claimed": true, 00:16:31.946 "claim_type": "exclusive_write", 00:16:31.946 "zoned": false, 00:16:31.946 "supported_io_types": { 00:16:31.946 "read": true, 00:16:31.946 "write": true, 00:16:31.946 "unmap": true, 00:16:31.946 "write_zeroes": true, 00:16:31.946 "flush": true, 00:16:31.946 "reset": true, 00:16:31.946 "compare": false, 00:16:31.946 "compare_and_write": false, 00:16:31.946 "abort": true, 00:16:31.946 "nvme_admin": false, 00:16:31.946 "nvme_io": false 00:16:31.946 }, 00:16:31.946 "memory_domains": [ 00:16:31.946 { 00:16:31.946 "dma_device_id": "system", 00:16:31.946 "dma_device_type": 1 00:16:31.946 }, 00:16:31.946 { 00:16:31.946 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:31.946 "dma_device_type": 2 00:16:31.946 } 00:16:31.946 ], 00:16:31.946 "driver_specific": {} 00:16:31.946 }' 00:16:31.946 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:31.946 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:32.203 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:32.203 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:32.203 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:32.203 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:32.203 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:32.203 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:32.203 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:32.203 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:32.460 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:16:32.461 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:32.461 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:32.461 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:32.461 15:55:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:32.718 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:32.718 "name": "BaseBdev2", 00:16:32.718 "aliases": [ 00:16:32.718 "39a92035-be8a-4dc3-9040-47fb01da5164" 00:16:32.718 ], 00:16:32.718 "product_name": "Malloc disk", 00:16:32.718 "block_size": 512, 00:16:32.718 "num_blocks": 65536, 00:16:32.718 "uuid": "39a92035-be8a-4dc3-9040-47fb01da5164", 00:16:32.718 "assigned_rate_limits": { 00:16:32.718 "rw_ios_per_sec": 0, 00:16:32.718 "rw_mbytes_per_sec": 0, 00:16:32.718 "r_mbytes_per_sec": 0, 00:16:32.718 "w_mbytes_per_sec": 0 00:16:32.718 }, 00:16:32.718 "claimed": true, 00:16:32.718 "claim_type": "exclusive_write", 00:16:32.718 "zoned": false, 00:16:32.718 "supported_io_types": { 00:16:32.718 "read": true, 00:16:32.718 "write": true, 00:16:32.718 "unmap": true, 00:16:32.718 "write_zeroes": true, 00:16:32.718 "flush": true, 00:16:32.718 "reset": true, 00:16:32.718 "compare": false, 00:16:32.718 "compare_and_write": false, 00:16:32.718 "abort": true, 00:16:32.718 "nvme_admin": false, 00:16:32.718 "nvme_io": false 00:16:32.718 }, 00:16:32.718 "memory_domains": [ 00:16:32.718 { 00:16:32.718 "dma_device_id": "system", 00:16:32.718 "dma_device_type": 1 00:16:32.718 }, 00:16:32.718 { 00:16:32.718 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:32.718 "dma_device_type": 2 00:16:32.718 } 00:16:32.718 ], 00:16:32.718 "driver_specific": {} 00:16:32.718 }' 00:16:32.718 15:55:38 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:32.718 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:32.718 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:32.718 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:32.718 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:32.718 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:32.718 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:32.977 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:32.977 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:32.977 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:32.977 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:32.977 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:32.977 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:32.977 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:32.977 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:33.235 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:33.235 "name": "BaseBdev3", 00:16:33.235 "aliases": [ 00:16:33.235 "6b7cef49-2fa8-4641-8560-14c78da52053" 00:16:33.235 ], 00:16:33.235 "product_name": "Malloc disk", 00:16:33.235 "block_size": 512, 00:16:33.235 "num_blocks": 65536, 00:16:33.235 "uuid": 
"6b7cef49-2fa8-4641-8560-14c78da52053", 00:16:33.235 "assigned_rate_limits": { 00:16:33.235 "rw_ios_per_sec": 0, 00:16:33.235 "rw_mbytes_per_sec": 0, 00:16:33.235 "r_mbytes_per_sec": 0, 00:16:33.235 "w_mbytes_per_sec": 0 00:16:33.235 }, 00:16:33.235 "claimed": true, 00:16:33.235 "claim_type": "exclusive_write", 00:16:33.235 "zoned": false, 00:16:33.235 "supported_io_types": { 00:16:33.235 "read": true, 00:16:33.235 "write": true, 00:16:33.235 "unmap": true, 00:16:33.235 "write_zeroes": true, 00:16:33.235 "flush": true, 00:16:33.235 "reset": true, 00:16:33.235 "compare": false, 00:16:33.235 "compare_and_write": false, 00:16:33.235 "abort": true, 00:16:33.235 "nvme_admin": false, 00:16:33.235 "nvme_io": false 00:16:33.235 }, 00:16:33.235 "memory_domains": [ 00:16:33.235 { 00:16:33.235 "dma_device_id": "system", 00:16:33.235 "dma_device_type": 1 00:16:33.235 }, 00:16:33.235 { 00:16:33.235 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:33.235 "dma_device_type": 2 00:16:33.235 } 00:16:33.235 ], 00:16:33.235 "driver_specific": {} 00:16:33.235 }' 00:16:33.235 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.235 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.235 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:33.235 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.495 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.495 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:33.495 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:33.495 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:33.495 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:33.495 
15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:33.495 15:55:38 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:33.820 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:33.820 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:33.820 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:33.820 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:33.820 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:33.820 "name": "BaseBdev4", 00:16:33.820 "aliases": [ 00:16:33.820 "e08a10b0-5c9c-440f-a793-b8d20f6e0f30" 00:16:33.820 ], 00:16:33.820 "product_name": "Malloc disk", 00:16:33.820 "block_size": 512, 00:16:33.820 "num_blocks": 65536, 00:16:33.820 "uuid": "e08a10b0-5c9c-440f-a793-b8d20f6e0f30", 00:16:33.820 "assigned_rate_limits": { 00:16:33.820 "rw_ios_per_sec": 0, 00:16:33.820 "rw_mbytes_per_sec": 0, 00:16:33.820 "r_mbytes_per_sec": 0, 00:16:33.820 "w_mbytes_per_sec": 0 00:16:33.820 }, 00:16:33.820 "claimed": true, 00:16:33.820 "claim_type": "exclusive_write", 00:16:33.820 "zoned": false, 00:16:33.820 "supported_io_types": { 00:16:33.820 "read": true, 00:16:33.820 "write": true, 00:16:33.820 "unmap": true, 00:16:33.820 "write_zeroes": true, 00:16:33.820 "flush": true, 00:16:33.820 "reset": true, 00:16:33.820 "compare": false, 00:16:33.820 "compare_and_write": false, 00:16:33.820 "abort": true, 00:16:33.820 "nvme_admin": false, 00:16:33.820 "nvme_io": false 00:16:33.820 }, 00:16:33.820 "memory_domains": [ 00:16:33.820 { 00:16:33.820 "dma_device_id": "system", 00:16:33.820 "dma_device_type": 1 00:16:33.820 }, 00:16:33.820 { 00:16:33.820 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:16:33.820 "dma_device_type": 2 00:16:33.820 } 00:16:33.820 ], 00:16:33.820 "driver_specific": {} 00:16:33.820 }' 00:16:33.820 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.820 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:33.820 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:33.820 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:33.820 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:34.079 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:34.079 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:34.079 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:34.079 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:34.079 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:34.079 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:34.079 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:34.079 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:34.337 [2024-06-10 15:55:39.770740] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:34.337 [2024-06-10 15:55:39.770766] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:34.337 [2024-06-10 15:55:39.770811] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:34.337 15:55:39 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@275 -- # local expected_state 00:16:34.337 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:16:34.337 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:16:34.337 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:16:34.337 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:16:34.337 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:16:34.337 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:34.337 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:16:34.337 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:34.337 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:34.337 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:16:34.337 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:34.337 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:34.337 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:34.337 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:34.337 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:34.337 15:55:39 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:34.595 15:55:40 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:34.595 "name": "Existed_Raid", 00:16:34.595 "uuid": "6d354d6a-1b99-4b47-a051-307577d7eaad", 00:16:34.595 "strip_size_kb": 64, 00:16:34.595 "state": "offline", 00:16:34.595 "raid_level": "raid0", 00:16:34.595 "superblock": false, 00:16:34.595 "num_base_bdevs": 4, 00:16:34.595 "num_base_bdevs_discovered": 3, 00:16:34.595 "num_base_bdevs_operational": 3, 00:16:34.595 "base_bdevs_list": [ 00:16:34.595 { 00:16:34.595 "name": null, 00:16:34.595 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:34.595 "is_configured": false, 00:16:34.595 "data_offset": 0, 00:16:34.595 "data_size": 65536 00:16:34.595 }, 00:16:34.595 { 00:16:34.595 "name": "BaseBdev2", 00:16:34.595 "uuid": "39a92035-be8a-4dc3-9040-47fb01da5164", 00:16:34.595 "is_configured": true, 00:16:34.595 "data_offset": 0, 00:16:34.595 "data_size": 65536 00:16:34.595 }, 00:16:34.595 { 00:16:34.595 "name": "BaseBdev3", 00:16:34.595 "uuid": "6b7cef49-2fa8-4641-8560-14c78da52053", 00:16:34.595 "is_configured": true, 00:16:34.595 "data_offset": 0, 00:16:34.595 "data_size": 65536 00:16:34.595 }, 00:16:34.595 { 00:16:34.595 "name": "BaseBdev4", 00:16:34.595 "uuid": "e08a10b0-5c9c-440f-a793-b8d20f6e0f30", 00:16:34.595 "is_configured": true, 00:16:34.595 "data_offset": 0, 00:16:34.595 "data_size": 65536 00:16:34.595 } 00:16:34.595 ] 00:16:34.595 }' 00:16:34.595 15:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:34.595 15:55:40 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:35.160 15:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:16:35.160 15:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:35.160 15:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:16:35.160 15:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:35.418 15:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:35.418 15:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:35.418 15:55:40 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:16:35.677 [2024-06-10 15:55:41.123579] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:35.677 15:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:35.677 15:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:35.677 15:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:35.677 15:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:35.935 15:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:35.935 15:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:35.935 15:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:16:36.193 [2024-06-10 15:55:41.647680] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:36.193 15:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:36.193 15:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:36.193 15:55:41 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:36.193 15:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:16:36.451 15:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:16:36.451 15:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:16:36.451 15:55:41 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:16:36.710 [2024-06-10 15:55:42.167567] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:16:36.710 [2024-06-10 15:55:42.167607] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12aeac0 name Existed_Raid, state offline 00:16:36.710 15:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:16:36.710 15:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:16:36.710 15:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:36.710 15:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:16:36.967 15:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:16:36.967 15:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:16:36.967 15:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:16:36.967 15:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:16:36.967 15:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 
-- # (( i < num_base_bdevs )) 00:16:36.967 15:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:16:37.225 BaseBdev2 00:16:37.225 15:55:42 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:16:37.226 15:55:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:16:37.226 15:55:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:37.226 15:55:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:37.226 15:55:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:37.226 15:55:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:37.226 15:55:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:37.483 15:55:42 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:16:37.741 [ 00:16:37.741 { 00:16:37.741 "name": "BaseBdev2", 00:16:37.741 "aliases": [ 00:16:37.741 "2c1a39ad-57f5-442f-b91e-5301ca4c226e" 00:16:37.741 ], 00:16:37.741 "product_name": "Malloc disk", 00:16:37.741 "block_size": 512, 00:16:37.741 "num_blocks": 65536, 00:16:37.741 "uuid": "2c1a39ad-57f5-442f-b91e-5301ca4c226e", 00:16:37.741 "assigned_rate_limits": { 00:16:37.741 "rw_ios_per_sec": 0, 00:16:37.741 "rw_mbytes_per_sec": 0, 00:16:37.741 "r_mbytes_per_sec": 0, 00:16:37.741 "w_mbytes_per_sec": 0 00:16:37.741 }, 00:16:37.741 "claimed": false, 00:16:37.741 "zoned": false, 00:16:37.741 "supported_io_types": { 
00:16:37.741 "read": true, 00:16:37.741 "write": true, 00:16:37.741 "unmap": true, 00:16:37.741 "write_zeroes": true, 00:16:37.741 "flush": true, 00:16:37.741 "reset": true, 00:16:37.741 "compare": false, 00:16:37.742 "compare_and_write": false, 00:16:37.742 "abort": true, 00:16:37.742 "nvme_admin": false, 00:16:37.742 "nvme_io": false 00:16:37.742 }, 00:16:37.742 "memory_domains": [ 00:16:37.742 { 00:16:37.742 "dma_device_id": "system", 00:16:37.742 "dma_device_type": 1 00:16:37.742 }, 00:16:37.742 { 00:16:37.742 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:37.742 "dma_device_type": 2 00:16:37.742 } 00:16:37.742 ], 00:16:37.742 "driver_specific": {} 00:16:37.742 } 00:16:37.742 ] 00:16:37.742 15:55:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:37.742 15:55:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:37.742 15:55:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:37.742 15:55:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:16:38.000 BaseBdev3 00:16:38.000 15:55:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:16:38.000 15:55:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:16:38.000 15:55:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:38.000 15:55:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:38.000 15:55:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:38.000 15:55:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:38.000 15:55:43 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:38.257 15:55:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:16:38.515 [ 00:16:38.516 { 00:16:38.516 "name": "BaseBdev3", 00:16:38.516 "aliases": [ 00:16:38.516 "ce47a01d-9ce3-4354-949f-c7b2f827c59f" 00:16:38.516 ], 00:16:38.516 "product_name": "Malloc disk", 00:16:38.516 "block_size": 512, 00:16:38.516 "num_blocks": 65536, 00:16:38.516 "uuid": "ce47a01d-9ce3-4354-949f-c7b2f827c59f", 00:16:38.516 "assigned_rate_limits": { 00:16:38.516 "rw_ios_per_sec": 0, 00:16:38.516 "rw_mbytes_per_sec": 0, 00:16:38.516 "r_mbytes_per_sec": 0, 00:16:38.516 "w_mbytes_per_sec": 0 00:16:38.516 }, 00:16:38.516 "claimed": false, 00:16:38.516 "zoned": false, 00:16:38.516 "supported_io_types": { 00:16:38.516 "read": true, 00:16:38.516 "write": true, 00:16:38.516 "unmap": true, 00:16:38.516 "write_zeroes": true, 00:16:38.516 "flush": true, 00:16:38.516 "reset": true, 00:16:38.516 "compare": false, 00:16:38.516 "compare_and_write": false, 00:16:38.516 "abort": true, 00:16:38.516 "nvme_admin": false, 00:16:38.516 "nvme_io": false 00:16:38.516 }, 00:16:38.516 "memory_domains": [ 00:16:38.516 { 00:16:38.516 "dma_device_id": "system", 00:16:38.516 "dma_device_type": 1 00:16:38.516 }, 00:16:38.516 { 00:16:38.516 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:38.516 "dma_device_type": 2 00:16:38.516 } 00:16:38.516 ], 00:16:38.516 "driver_specific": {} 00:16:38.516 } 00:16:38.516 ] 00:16:38.516 15:55:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:38.516 15:55:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:38.516 15:55:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 
00:16:38.516 15:55:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:16:38.774 BaseBdev4 00:16:38.774 15:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:16:38.774 15:55:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:16:38.774 15:55:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:38.774 15:55:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:38.774 15:55:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:38.774 15:55:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:38.774 15:55:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:39.032 15:55:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:16:39.290 [ 00:16:39.290 { 00:16:39.290 "name": "BaseBdev4", 00:16:39.290 "aliases": [ 00:16:39.290 "268bb22b-db16-4e05-8618-9ca9676444ed" 00:16:39.290 ], 00:16:39.290 "product_name": "Malloc disk", 00:16:39.290 "block_size": 512, 00:16:39.290 "num_blocks": 65536, 00:16:39.290 "uuid": "268bb22b-db16-4e05-8618-9ca9676444ed", 00:16:39.290 "assigned_rate_limits": { 00:16:39.290 "rw_ios_per_sec": 0, 00:16:39.290 "rw_mbytes_per_sec": 0, 00:16:39.290 "r_mbytes_per_sec": 0, 00:16:39.290 "w_mbytes_per_sec": 0 00:16:39.290 }, 00:16:39.290 "claimed": false, 00:16:39.290 "zoned": false, 00:16:39.290 "supported_io_types": { 00:16:39.290 "read": true, 00:16:39.290 
"write": true, 00:16:39.290 "unmap": true, 00:16:39.290 "write_zeroes": true, 00:16:39.290 "flush": true, 00:16:39.290 "reset": true, 00:16:39.290 "compare": false, 00:16:39.290 "compare_and_write": false, 00:16:39.290 "abort": true, 00:16:39.290 "nvme_admin": false, 00:16:39.290 "nvme_io": false 00:16:39.290 }, 00:16:39.290 "memory_domains": [ 00:16:39.290 { 00:16:39.290 "dma_device_id": "system", 00:16:39.290 "dma_device_type": 1 00:16:39.290 }, 00:16:39.290 { 00:16:39.290 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:39.290 "dma_device_type": 2 00:16:39.290 } 00:16:39.290 ], 00:16:39.290 "driver_specific": {} 00:16:39.290 } 00:16:39.290 ] 00:16:39.290 15:55:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:39.290 15:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:16:39.290 15:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:16:39.290 15:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:39.548 [2024-06-10 15:55:44.966622] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:39.548 [2024-06-10 15:55:44.966659] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:39.548 [2024-06-10 15:55:44.966676] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:39.548 [2024-06-10 15:55:44.968091] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:39.548 [2024-06-10 15:55:44.968133] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:16:39.548 15:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid 
configuring raid0 64 4 00:16:39.548 15:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:39.548 15:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:39.548 15:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:39.548 15:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:39.548 15:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:39.548 15:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:39.548 15:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:39.548 15:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:39.548 15:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:39.548 15:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:39.548 15:55:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:39.806 15:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:39.806 "name": "Existed_Raid", 00:16:39.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:39.806 "strip_size_kb": 64, 00:16:39.806 "state": "configuring", 00:16:39.806 "raid_level": "raid0", 00:16:39.806 "superblock": false, 00:16:39.806 "num_base_bdevs": 4, 00:16:39.806 "num_base_bdevs_discovered": 3, 00:16:39.806 "num_base_bdevs_operational": 4, 00:16:39.806 "base_bdevs_list": [ 00:16:39.806 { 00:16:39.806 "name": "BaseBdev1", 00:16:39.806 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:39.806 
"is_configured": false, 00:16:39.806 "data_offset": 0, 00:16:39.806 "data_size": 0 00:16:39.806 }, 00:16:39.806 { 00:16:39.806 "name": "BaseBdev2", 00:16:39.806 "uuid": "2c1a39ad-57f5-442f-b91e-5301ca4c226e", 00:16:39.806 "is_configured": true, 00:16:39.806 "data_offset": 0, 00:16:39.806 "data_size": 65536 00:16:39.806 }, 00:16:39.806 { 00:16:39.806 "name": "BaseBdev3", 00:16:39.806 "uuid": "ce47a01d-9ce3-4354-949f-c7b2f827c59f", 00:16:39.806 "is_configured": true, 00:16:39.806 "data_offset": 0, 00:16:39.806 "data_size": 65536 00:16:39.806 }, 00:16:39.806 { 00:16:39.806 "name": "BaseBdev4", 00:16:39.806 "uuid": "268bb22b-db16-4e05-8618-9ca9676444ed", 00:16:39.806 "is_configured": true, 00:16:39.806 "data_offset": 0, 00:16:39.807 "data_size": 65536 00:16:39.807 } 00:16:39.807 ] 00:16:39.807 }' 00:16:39.807 15:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:39.807 15:55:45 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:40.372 15:55:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:16:40.630 [2024-06-10 15:55:46.085673] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:16:40.630 15:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:40.630 15:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:40.630 15:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:40.630 15:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:40.630 15:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:40.630 15:55:46 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:40.630 15:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:40.630 15:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:40.630 15:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:40.630 15:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:40.630 15:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:40.630 15:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:40.888 15:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:40.888 "name": "Existed_Raid", 00:16:40.888 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:40.888 "strip_size_kb": 64, 00:16:40.888 "state": "configuring", 00:16:40.888 "raid_level": "raid0", 00:16:40.888 "superblock": false, 00:16:40.888 "num_base_bdevs": 4, 00:16:40.888 "num_base_bdevs_discovered": 2, 00:16:40.888 "num_base_bdevs_operational": 4, 00:16:40.888 "base_bdevs_list": [ 00:16:40.888 { 00:16:40.888 "name": "BaseBdev1", 00:16:40.888 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:40.888 "is_configured": false, 00:16:40.888 "data_offset": 0, 00:16:40.888 "data_size": 0 00:16:40.888 }, 00:16:40.888 { 00:16:40.888 "name": null, 00:16:40.888 "uuid": "2c1a39ad-57f5-442f-b91e-5301ca4c226e", 00:16:40.888 "is_configured": false, 00:16:40.888 "data_offset": 0, 00:16:40.888 "data_size": 65536 00:16:40.888 }, 00:16:40.888 { 00:16:40.888 "name": "BaseBdev3", 00:16:40.888 "uuid": "ce47a01d-9ce3-4354-949f-c7b2f827c59f", 00:16:40.888 "is_configured": true, 00:16:40.888 "data_offset": 0, 00:16:40.888 "data_size": 65536 00:16:40.888 }, 
00:16:40.888 { 00:16:40.888 "name": "BaseBdev4", 00:16:40.888 "uuid": "268bb22b-db16-4e05-8618-9ca9676444ed", 00:16:40.888 "is_configured": true, 00:16:40.888 "data_offset": 0, 00:16:40.888 "data_size": 65536 00:16:40.888 } 00:16:40.888 ] 00:16:40.888 }' 00:16:40.888 15:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:40.888 15:55:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:41.823 15:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:41.823 15:55:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:41.823 15:55:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:16:41.823 15:55:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:42.082 [2024-06-10 15:55:47.480558] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:42.082 BaseBdev1 00:16:42.082 15:55:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:16:42.082 15:55:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:16:42.082 15:55:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:42.082 15:55:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:42.082 15:55:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:42.082 15:55:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:42.082 15:55:47 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:42.340 15:55:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:42.599 [ 00:16:42.599 { 00:16:42.599 "name": "BaseBdev1", 00:16:42.599 "aliases": [ 00:16:42.599 "7f9bc28c-e4ac-4434-9456-cf898ec622f9" 00:16:42.599 ], 00:16:42.599 "product_name": "Malloc disk", 00:16:42.599 "block_size": 512, 00:16:42.599 "num_blocks": 65536, 00:16:42.599 "uuid": "7f9bc28c-e4ac-4434-9456-cf898ec622f9", 00:16:42.599 "assigned_rate_limits": { 00:16:42.599 "rw_ios_per_sec": 0, 00:16:42.599 "rw_mbytes_per_sec": 0, 00:16:42.599 "r_mbytes_per_sec": 0, 00:16:42.599 "w_mbytes_per_sec": 0 00:16:42.599 }, 00:16:42.599 "claimed": true, 00:16:42.599 "claim_type": "exclusive_write", 00:16:42.599 "zoned": false, 00:16:42.599 "supported_io_types": { 00:16:42.599 "read": true, 00:16:42.599 "write": true, 00:16:42.599 "unmap": true, 00:16:42.599 "write_zeroes": true, 00:16:42.599 "flush": true, 00:16:42.599 "reset": true, 00:16:42.599 "compare": false, 00:16:42.599 "compare_and_write": false, 00:16:42.599 "abort": true, 00:16:42.599 "nvme_admin": false, 00:16:42.599 "nvme_io": false 00:16:42.599 }, 00:16:42.599 "memory_domains": [ 00:16:42.599 { 00:16:42.599 "dma_device_id": "system", 00:16:42.599 "dma_device_type": 1 00:16:42.599 }, 00:16:42.599 { 00:16:42.599 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:42.599 "dma_device_type": 2 00:16:42.599 } 00:16:42.599 ], 00:16:42.599 "driver_specific": {} 00:16:42.599 } 00:16:42.599 ] 00:16:42.599 15:55:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:42.599 15:55:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:42.599 15:55:47 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:42.599 15:55:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:42.599 15:55:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:42.599 15:55:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:42.599 15:55:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:42.599 15:55:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:42.599 15:55:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:42.599 15:55:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:42.599 15:55:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:42.599 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:42.599 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:42.858 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:42.858 "name": "Existed_Raid", 00:16:42.858 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:42.858 "strip_size_kb": 64, 00:16:42.858 "state": "configuring", 00:16:42.858 "raid_level": "raid0", 00:16:42.858 "superblock": false, 00:16:42.858 "num_base_bdevs": 4, 00:16:42.858 "num_base_bdevs_discovered": 3, 00:16:42.858 "num_base_bdevs_operational": 4, 00:16:42.858 "base_bdevs_list": [ 00:16:42.858 { 00:16:42.858 "name": "BaseBdev1", 00:16:42.858 "uuid": "7f9bc28c-e4ac-4434-9456-cf898ec622f9", 00:16:42.858 "is_configured": true, 00:16:42.858 
"data_offset": 0, 00:16:42.858 "data_size": 65536 00:16:42.858 }, 00:16:42.858 { 00:16:42.858 "name": null, 00:16:42.858 "uuid": "2c1a39ad-57f5-442f-b91e-5301ca4c226e", 00:16:42.858 "is_configured": false, 00:16:42.858 "data_offset": 0, 00:16:42.858 "data_size": 65536 00:16:42.858 }, 00:16:42.858 { 00:16:42.858 "name": "BaseBdev3", 00:16:42.858 "uuid": "ce47a01d-9ce3-4354-949f-c7b2f827c59f", 00:16:42.858 "is_configured": true, 00:16:42.858 "data_offset": 0, 00:16:42.858 "data_size": 65536 00:16:42.858 }, 00:16:42.858 { 00:16:42.858 "name": "BaseBdev4", 00:16:42.858 "uuid": "268bb22b-db16-4e05-8618-9ca9676444ed", 00:16:42.858 "is_configured": true, 00:16:42.858 "data_offset": 0, 00:16:42.858 "data_size": 65536 00:16:42.858 } 00:16:42.858 ] 00:16:42.858 }' 00:16:42.858 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:42.858 15:55:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:43.425 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:43.425 15:55:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.683 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:16:43.683 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:16:43.942 [2024-06-10 15:55:49.349606] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:16:43.942 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:43.942 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:16:43.942 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:43.942 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:43.942 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:43.942 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:43.942 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:43.942 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:43.942 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:43.942 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:43.942 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:43.942 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:44.202 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:44.202 "name": "Existed_Raid", 00:16:44.202 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:44.202 "strip_size_kb": 64, 00:16:44.202 "state": "configuring", 00:16:44.202 "raid_level": "raid0", 00:16:44.202 "superblock": false, 00:16:44.202 "num_base_bdevs": 4, 00:16:44.202 "num_base_bdevs_discovered": 2, 00:16:44.202 "num_base_bdevs_operational": 4, 00:16:44.202 "base_bdevs_list": [ 00:16:44.202 { 00:16:44.202 "name": "BaseBdev1", 00:16:44.202 "uuid": "7f9bc28c-e4ac-4434-9456-cf898ec622f9", 00:16:44.202 "is_configured": true, 00:16:44.202 "data_offset": 0, 00:16:44.202 "data_size": 65536 00:16:44.202 }, 00:16:44.202 { 00:16:44.202 "name": null, 
00:16:44.202 "uuid": "2c1a39ad-57f5-442f-b91e-5301ca4c226e", 00:16:44.202 "is_configured": false, 00:16:44.202 "data_offset": 0, 00:16:44.202 "data_size": 65536 00:16:44.202 }, 00:16:44.202 { 00:16:44.202 "name": null, 00:16:44.202 "uuid": "ce47a01d-9ce3-4354-949f-c7b2f827c59f", 00:16:44.202 "is_configured": false, 00:16:44.202 "data_offset": 0, 00:16:44.202 "data_size": 65536 00:16:44.202 }, 00:16:44.202 { 00:16:44.202 "name": "BaseBdev4", 00:16:44.202 "uuid": "268bb22b-db16-4e05-8618-9ca9676444ed", 00:16:44.202 "is_configured": true, 00:16:44.202 "data_offset": 0, 00:16:44.202 "data_size": 65536 00:16:44.202 } 00:16:44.202 ] 00:16:44.202 }' 00:16:44.202 15:55:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:44.202 15:55:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:44.770 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:44.770 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.028 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:16:45.028 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:16:45.286 [2024-06-10 15:55:50.749358] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:16:45.286 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:45.286 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:45.286 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- 
# local expected_state=configuring 00:16:45.286 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:45.286 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:45.286 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:45.286 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:45.286 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:45.286 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:45.286 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:45.286 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:45.286 15:55:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:45.545 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:45.545 "name": "Existed_Raid", 00:16:45.545 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:45.545 "strip_size_kb": 64, 00:16:45.545 "state": "configuring", 00:16:45.545 "raid_level": "raid0", 00:16:45.545 "superblock": false, 00:16:45.545 "num_base_bdevs": 4, 00:16:45.545 "num_base_bdevs_discovered": 3, 00:16:45.545 "num_base_bdevs_operational": 4, 00:16:45.545 "base_bdevs_list": [ 00:16:45.545 { 00:16:45.545 "name": "BaseBdev1", 00:16:45.545 "uuid": "7f9bc28c-e4ac-4434-9456-cf898ec622f9", 00:16:45.545 "is_configured": true, 00:16:45.545 "data_offset": 0, 00:16:45.545 "data_size": 65536 00:16:45.545 }, 00:16:45.545 { 00:16:45.545 "name": null, 00:16:45.545 "uuid": "2c1a39ad-57f5-442f-b91e-5301ca4c226e", 00:16:45.545 
"is_configured": false, 00:16:45.545 "data_offset": 0, 00:16:45.545 "data_size": 65536 00:16:45.545 }, 00:16:45.545 { 00:16:45.545 "name": "BaseBdev3", 00:16:45.545 "uuid": "ce47a01d-9ce3-4354-949f-c7b2f827c59f", 00:16:45.545 "is_configured": true, 00:16:45.545 "data_offset": 0, 00:16:45.545 "data_size": 65536 00:16:45.545 }, 00:16:45.545 { 00:16:45.545 "name": "BaseBdev4", 00:16:45.545 "uuid": "268bb22b-db16-4e05-8618-9ca9676444ed", 00:16:45.545 "is_configured": true, 00:16:45.545 "data_offset": 0, 00:16:45.545 "data_size": 65536 00:16:45.545 } 00:16:45.545 ] 00:16:45.545 }' 00:16:45.545 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:45.545 15:55:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:46.481 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.481 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:16:46.481 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:16:46.481 15:55:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:16:46.740 [2024-06-10 15:55:52.044861] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:16:46.740 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:46.740 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:46.740 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:46.740 15:55:52 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:46.740 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:46.740 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:46.740 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:46.740 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:46.740 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:46.740 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:46.740 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:46.740 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:46.998 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:46.998 "name": "Existed_Raid", 00:16:46.998 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:46.998 "strip_size_kb": 64, 00:16:46.998 "state": "configuring", 00:16:46.998 "raid_level": "raid0", 00:16:46.999 "superblock": false, 00:16:46.999 "num_base_bdevs": 4, 00:16:46.999 "num_base_bdevs_discovered": 2, 00:16:46.999 "num_base_bdevs_operational": 4, 00:16:46.999 "base_bdevs_list": [ 00:16:46.999 { 00:16:46.999 "name": null, 00:16:46.999 "uuid": "7f9bc28c-e4ac-4434-9456-cf898ec622f9", 00:16:46.999 "is_configured": false, 00:16:46.999 "data_offset": 0, 00:16:46.999 "data_size": 65536 00:16:46.999 }, 00:16:46.999 { 00:16:46.999 "name": null, 00:16:46.999 "uuid": "2c1a39ad-57f5-442f-b91e-5301ca4c226e", 00:16:46.999 "is_configured": false, 00:16:46.999 "data_offset": 0, 00:16:46.999 "data_size": 65536 00:16:46.999 }, 00:16:46.999 
{ 00:16:46.999 "name": "BaseBdev3", 00:16:46.999 "uuid": "ce47a01d-9ce3-4354-949f-c7b2f827c59f", 00:16:46.999 "is_configured": true, 00:16:46.999 "data_offset": 0, 00:16:46.999 "data_size": 65536 00:16:46.999 }, 00:16:46.999 { 00:16:46.999 "name": "BaseBdev4", 00:16:46.999 "uuid": "268bb22b-db16-4e05-8618-9ca9676444ed", 00:16:46.999 "is_configured": true, 00:16:46.999 "data_offset": 0, 00:16:46.999 "data_size": 65536 00:16:46.999 } 00:16:46.999 ] 00:16:46.999 }' 00:16:46.999 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:46.999 15:55:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:47.565 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:47.565 15:55:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:16:47.824 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:16:47.824 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:16:48.117 [2024-06-10 15:55:53.451157] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:16:48.117 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:48.118 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:48.118 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:48.118 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:48.118 15:55:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:48.118 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:48.118 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:48.118 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:48.118 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:48.118 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:48.118 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:48.118 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:48.376 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:48.376 "name": "Existed_Raid", 00:16:48.376 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:48.376 "strip_size_kb": 64, 00:16:48.376 "state": "configuring", 00:16:48.376 "raid_level": "raid0", 00:16:48.376 "superblock": false, 00:16:48.376 "num_base_bdevs": 4, 00:16:48.376 "num_base_bdevs_discovered": 3, 00:16:48.376 "num_base_bdevs_operational": 4, 00:16:48.376 "base_bdevs_list": [ 00:16:48.376 { 00:16:48.376 "name": null, 00:16:48.376 "uuid": "7f9bc28c-e4ac-4434-9456-cf898ec622f9", 00:16:48.376 "is_configured": false, 00:16:48.376 "data_offset": 0, 00:16:48.376 "data_size": 65536 00:16:48.376 }, 00:16:48.376 { 00:16:48.376 "name": "BaseBdev2", 00:16:48.376 "uuid": "2c1a39ad-57f5-442f-b91e-5301ca4c226e", 00:16:48.376 "is_configured": true, 00:16:48.376 "data_offset": 0, 00:16:48.376 "data_size": 65536 00:16:48.376 }, 00:16:48.376 { 00:16:48.376 "name": "BaseBdev3", 00:16:48.376 "uuid": 
"ce47a01d-9ce3-4354-949f-c7b2f827c59f", 00:16:48.376 "is_configured": true, 00:16:48.376 "data_offset": 0, 00:16:48.376 "data_size": 65536 00:16:48.376 }, 00:16:48.376 { 00:16:48.376 "name": "BaseBdev4", 00:16:48.376 "uuid": "268bb22b-db16-4e05-8618-9ca9676444ed", 00:16:48.376 "is_configured": true, 00:16:48.376 "data_offset": 0, 00:16:48.376 "data_size": 65536 00:16:48.376 } 00:16:48.376 ] 00:16:48.376 }' 00:16:48.376 15:55:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:48.376 15:55:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:48.942 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:16:48.942 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.199 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:16:49.199 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:49.199 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:16:49.457 15:55:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 7f9bc28c-e4ac-4434-9456-cf898ec622f9 00:16:49.715 [2024-06-10 15:55:55.098849] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:16:49.715 [2024-06-10 15:55:55.098882] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1454dd0 00:16:49.715 [2024-06-10 15:55:55.098889] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, 
blocklen 512 00:16:49.715 [2024-06-10 15:55:55.099094] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x14593e0 00:16:49.715 [2024-06-10 15:55:55.099216] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1454dd0 00:16:49.715 [2024-06-10 15:55:55.099224] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1454dd0 00:16:49.715 [2024-06-10 15:55:55.099380] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:16:49.715 NewBaseBdev 00:16:49.715 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:16:49.715 15:55:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:16:49.715 15:55:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:49.715 15:55:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:16:49.715 15:55:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:49.715 15:55:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:49.715 15:55:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:49.973 15:55:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:16:50.230 [ 00:16:50.230 { 00:16:50.230 "name": "NewBaseBdev", 00:16:50.230 "aliases": [ 00:16:50.230 "7f9bc28c-e4ac-4434-9456-cf898ec622f9" 00:16:50.230 ], 00:16:50.230 "product_name": "Malloc disk", 00:16:50.230 "block_size": 512, 00:16:50.230 "num_blocks": 65536, 00:16:50.230 "uuid": "7f9bc28c-e4ac-4434-9456-cf898ec622f9", 00:16:50.230 
"assigned_rate_limits": { 00:16:50.230 "rw_ios_per_sec": 0, 00:16:50.230 "rw_mbytes_per_sec": 0, 00:16:50.230 "r_mbytes_per_sec": 0, 00:16:50.230 "w_mbytes_per_sec": 0 00:16:50.230 }, 00:16:50.230 "claimed": true, 00:16:50.230 "claim_type": "exclusive_write", 00:16:50.230 "zoned": false, 00:16:50.230 "supported_io_types": { 00:16:50.230 "read": true, 00:16:50.230 "write": true, 00:16:50.230 "unmap": true, 00:16:50.230 "write_zeroes": true, 00:16:50.230 "flush": true, 00:16:50.230 "reset": true, 00:16:50.230 "compare": false, 00:16:50.230 "compare_and_write": false, 00:16:50.230 "abort": true, 00:16:50.230 "nvme_admin": false, 00:16:50.230 "nvme_io": false 00:16:50.230 }, 00:16:50.230 "memory_domains": [ 00:16:50.230 { 00:16:50.230 "dma_device_id": "system", 00:16:50.230 "dma_device_type": 1 00:16:50.230 }, 00:16:50.230 { 00:16:50.230 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:50.230 "dma_device_type": 2 00:16:50.230 } 00:16:50.230 ], 00:16:50.230 "driver_specific": {} 00:16:50.230 } 00:16:50.230 ] 00:16:50.230 15:55:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:16:50.230 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:16:50.230 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:50.230 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:16:50.230 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:50.230 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:50.230 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:50.230 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:50.230 15:55:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:50.230 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:50.230 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:50.230 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:50.230 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:50.488 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:50.488 "name": "Existed_Raid", 00:16:50.488 "uuid": "0072864b-27d7-43e4-90b0-f04d4a28879b", 00:16:50.488 "strip_size_kb": 64, 00:16:50.488 "state": "online", 00:16:50.488 "raid_level": "raid0", 00:16:50.488 "superblock": false, 00:16:50.488 "num_base_bdevs": 4, 00:16:50.488 "num_base_bdevs_discovered": 4, 00:16:50.488 "num_base_bdevs_operational": 4, 00:16:50.488 "base_bdevs_list": [ 00:16:50.488 { 00:16:50.488 "name": "NewBaseBdev", 00:16:50.488 "uuid": "7f9bc28c-e4ac-4434-9456-cf898ec622f9", 00:16:50.488 "is_configured": true, 00:16:50.488 "data_offset": 0, 00:16:50.488 "data_size": 65536 00:16:50.488 }, 00:16:50.488 { 00:16:50.488 "name": "BaseBdev2", 00:16:50.488 "uuid": "2c1a39ad-57f5-442f-b91e-5301ca4c226e", 00:16:50.488 "is_configured": true, 00:16:50.488 "data_offset": 0, 00:16:50.488 "data_size": 65536 00:16:50.488 }, 00:16:50.488 { 00:16:50.488 "name": "BaseBdev3", 00:16:50.488 "uuid": "ce47a01d-9ce3-4354-949f-c7b2f827c59f", 00:16:50.488 "is_configured": true, 00:16:50.488 "data_offset": 0, 00:16:50.488 "data_size": 65536 00:16:50.488 }, 00:16:50.488 { 00:16:50.488 "name": "BaseBdev4", 00:16:50.488 "uuid": "268bb22b-db16-4e05-8618-9ca9676444ed", 00:16:50.488 "is_configured": true, 00:16:50.488 "data_offset": 0, 
00:16:50.488 "data_size": 65536 00:16:50.488 } 00:16:50.488 ] 00:16:50.488 }' 00:16:50.488 15:55:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:50.488 15:55:55 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:51.053 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:16:51.053 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:16:51.053 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:16:51.053 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:16:51.053 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:16:51.053 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:16:51.053 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:16:51.053 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:16:51.311 [2024-06-10 15:55:56.651322] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:16:51.311 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:16:51.311 "name": "Existed_Raid", 00:16:51.311 "aliases": [ 00:16:51.311 "0072864b-27d7-43e4-90b0-f04d4a28879b" 00:16:51.311 ], 00:16:51.311 "product_name": "Raid Volume", 00:16:51.311 "block_size": 512, 00:16:51.311 "num_blocks": 262144, 00:16:51.311 "uuid": "0072864b-27d7-43e4-90b0-f04d4a28879b", 00:16:51.311 "assigned_rate_limits": { 00:16:51.311 "rw_ios_per_sec": 0, 00:16:51.311 "rw_mbytes_per_sec": 0, 00:16:51.311 "r_mbytes_per_sec": 0, 00:16:51.311 "w_mbytes_per_sec": 0 00:16:51.311 }, 00:16:51.311 
"claimed": false, 00:16:51.311 "zoned": false, 00:16:51.311 "supported_io_types": { 00:16:51.311 "read": true, 00:16:51.311 "write": true, 00:16:51.311 "unmap": true, 00:16:51.311 "write_zeroes": true, 00:16:51.311 "flush": true, 00:16:51.311 "reset": true, 00:16:51.311 "compare": false, 00:16:51.311 "compare_and_write": false, 00:16:51.311 "abort": false, 00:16:51.311 "nvme_admin": false, 00:16:51.311 "nvme_io": false 00:16:51.311 }, 00:16:51.311 "memory_domains": [ 00:16:51.311 { 00:16:51.311 "dma_device_id": "system", 00:16:51.311 "dma_device_type": 1 00:16:51.311 }, 00:16:51.311 { 00:16:51.311 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.311 "dma_device_type": 2 00:16:51.311 }, 00:16:51.311 { 00:16:51.311 "dma_device_id": "system", 00:16:51.311 "dma_device_type": 1 00:16:51.311 }, 00:16:51.311 { 00:16:51.311 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.311 "dma_device_type": 2 00:16:51.311 }, 00:16:51.311 { 00:16:51.311 "dma_device_id": "system", 00:16:51.311 "dma_device_type": 1 00:16:51.311 }, 00:16:51.311 { 00:16:51.311 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.311 "dma_device_type": 2 00:16:51.311 }, 00:16:51.311 { 00:16:51.311 "dma_device_id": "system", 00:16:51.311 "dma_device_type": 1 00:16:51.311 }, 00:16:51.311 { 00:16:51.311 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.311 "dma_device_type": 2 00:16:51.311 } 00:16:51.311 ], 00:16:51.311 "driver_specific": { 00:16:51.311 "raid": { 00:16:51.311 "uuid": "0072864b-27d7-43e4-90b0-f04d4a28879b", 00:16:51.311 "strip_size_kb": 64, 00:16:51.311 "state": "online", 00:16:51.311 "raid_level": "raid0", 00:16:51.311 "superblock": false, 00:16:51.311 "num_base_bdevs": 4, 00:16:51.311 "num_base_bdevs_discovered": 4, 00:16:51.311 "num_base_bdevs_operational": 4, 00:16:51.311 "base_bdevs_list": [ 00:16:51.311 { 00:16:51.311 "name": "NewBaseBdev", 00:16:51.311 "uuid": "7f9bc28c-e4ac-4434-9456-cf898ec622f9", 00:16:51.311 "is_configured": true, 00:16:51.311 "data_offset": 0, 00:16:51.311 
"data_size": 65536 00:16:51.311 }, 00:16:51.311 { 00:16:51.311 "name": "BaseBdev2", 00:16:51.311 "uuid": "2c1a39ad-57f5-442f-b91e-5301ca4c226e", 00:16:51.311 "is_configured": true, 00:16:51.311 "data_offset": 0, 00:16:51.311 "data_size": 65536 00:16:51.311 }, 00:16:51.311 { 00:16:51.311 "name": "BaseBdev3", 00:16:51.311 "uuid": "ce47a01d-9ce3-4354-949f-c7b2f827c59f", 00:16:51.311 "is_configured": true, 00:16:51.311 "data_offset": 0, 00:16:51.311 "data_size": 65536 00:16:51.311 }, 00:16:51.311 { 00:16:51.311 "name": "BaseBdev4", 00:16:51.311 "uuid": "268bb22b-db16-4e05-8618-9ca9676444ed", 00:16:51.311 "is_configured": true, 00:16:51.311 "data_offset": 0, 00:16:51.311 "data_size": 65536 00:16:51.311 } 00:16:51.311 ] 00:16:51.311 } 00:16:51.311 } 00:16:51.311 }' 00:16:51.311 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:16:51.311 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:16:51.311 BaseBdev2 00:16:51.311 BaseBdev3 00:16:51.311 BaseBdev4' 00:16:51.311 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:51.311 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:16:51.311 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:51.568 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:51.568 "name": "NewBaseBdev", 00:16:51.568 "aliases": [ 00:16:51.569 "7f9bc28c-e4ac-4434-9456-cf898ec622f9" 00:16:51.569 ], 00:16:51.569 "product_name": "Malloc disk", 00:16:51.569 "block_size": 512, 00:16:51.569 "num_blocks": 65536, 00:16:51.569 "uuid": "7f9bc28c-e4ac-4434-9456-cf898ec622f9", 00:16:51.569 "assigned_rate_limits": { 
00:16:51.569 "rw_ios_per_sec": 0, 00:16:51.569 "rw_mbytes_per_sec": 0, 00:16:51.569 "r_mbytes_per_sec": 0, 00:16:51.569 "w_mbytes_per_sec": 0 00:16:51.569 }, 00:16:51.569 "claimed": true, 00:16:51.569 "claim_type": "exclusive_write", 00:16:51.569 "zoned": false, 00:16:51.569 "supported_io_types": { 00:16:51.569 "read": true, 00:16:51.569 "write": true, 00:16:51.569 "unmap": true, 00:16:51.569 "write_zeroes": true, 00:16:51.569 "flush": true, 00:16:51.569 "reset": true, 00:16:51.569 "compare": false, 00:16:51.569 "compare_and_write": false, 00:16:51.569 "abort": true, 00:16:51.569 "nvme_admin": false, 00:16:51.569 "nvme_io": false 00:16:51.569 }, 00:16:51.569 "memory_domains": [ 00:16:51.569 { 00:16:51.569 "dma_device_id": "system", 00:16:51.569 "dma_device_type": 1 00:16:51.569 }, 00:16:51.569 { 00:16:51.569 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:51.569 "dma_device_type": 2 00:16:51.569 } 00:16:51.569 ], 00:16:51.569 "driver_specific": {} 00:16:51.569 }' 00:16:51.569 15:55:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.569 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:51.569 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:51.569 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:51.826 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:51.826 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:51.826 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:51.826 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:51.826 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:51.826 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:16:51.826 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:51.826 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:51.826 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:52.084 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:16:52.084 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:52.342 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:52.342 "name": "BaseBdev2", 00:16:52.342 "aliases": [ 00:16:52.342 "2c1a39ad-57f5-442f-b91e-5301ca4c226e" 00:16:52.342 ], 00:16:52.342 "product_name": "Malloc disk", 00:16:52.342 "block_size": 512, 00:16:52.342 "num_blocks": 65536, 00:16:52.342 "uuid": "2c1a39ad-57f5-442f-b91e-5301ca4c226e", 00:16:52.342 "assigned_rate_limits": { 00:16:52.342 "rw_ios_per_sec": 0, 00:16:52.342 "rw_mbytes_per_sec": 0, 00:16:52.342 "r_mbytes_per_sec": 0, 00:16:52.342 "w_mbytes_per_sec": 0 00:16:52.342 }, 00:16:52.342 "claimed": true, 00:16:52.342 "claim_type": "exclusive_write", 00:16:52.342 "zoned": false, 00:16:52.342 "supported_io_types": { 00:16:52.342 "read": true, 00:16:52.342 "write": true, 00:16:52.342 "unmap": true, 00:16:52.342 "write_zeroes": true, 00:16:52.342 "flush": true, 00:16:52.342 "reset": true, 00:16:52.342 "compare": false, 00:16:52.342 "compare_and_write": false, 00:16:52.342 "abort": true, 00:16:52.342 "nvme_admin": false, 00:16:52.342 "nvme_io": false 00:16:52.342 }, 00:16:52.342 "memory_domains": [ 00:16:52.342 { 00:16:52.342 "dma_device_id": "system", 00:16:52.342 "dma_device_type": 1 00:16:52.342 }, 00:16:52.342 { 00:16:52.342 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.342 "dma_device_type": 2 00:16:52.342 } 00:16:52.342 
], 00:16:52.342 "driver_specific": {} 00:16:52.342 }' 00:16:52.342 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.342 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.342 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:52.342 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.342 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.342 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:52.342 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:52.342 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:52.601 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:52.601 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.601 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:52.601 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:52.601 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:52.601 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:16:52.601 15:55:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:52.859 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:52.859 "name": "BaseBdev3", 00:16:52.859 "aliases": [ 00:16:52.859 "ce47a01d-9ce3-4354-949f-c7b2f827c59f" 00:16:52.859 ], 00:16:52.859 "product_name": "Malloc disk", 00:16:52.859 
"block_size": 512, 00:16:52.859 "num_blocks": 65536, 00:16:52.859 "uuid": "ce47a01d-9ce3-4354-949f-c7b2f827c59f", 00:16:52.859 "assigned_rate_limits": { 00:16:52.859 "rw_ios_per_sec": 0, 00:16:52.859 "rw_mbytes_per_sec": 0, 00:16:52.859 "r_mbytes_per_sec": 0, 00:16:52.859 "w_mbytes_per_sec": 0 00:16:52.859 }, 00:16:52.859 "claimed": true, 00:16:52.859 "claim_type": "exclusive_write", 00:16:52.859 "zoned": false, 00:16:52.859 "supported_io_types": { 00:16:52.859 "read": true, 00:16:52.859 "write": true, 00:16:52.859 "unmap": true, 00:16:52.859 "write_zeroes": true, 00:16:52.859 "flush": true, 00:16:52.859 "reset": true, 00:16:52.859 "compare": false, 00:16:52.859 "compare_and_write": false, 00:16:52.859 "abort": true, 00:16:52.859 "nvme_admin": false, 00:16:52.859 "nvme_io": false 00:16:52.859 }, 00:16:52.859 "memory_domains": [ 00:16:52.860 { 00:16:52.860 "dma_device_id": "system", 00:16:52.860 "dma_device_type": 1 00:16:52.860 }, 00:16:52.860 { 00:16:52.860 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:52.860 "dma_device_type": 2 00:16:52.860 } 00:16:52.860 ], 00:16:52.860 "driver_specific": {} 00:16:52.860 }' 00:16:52.860 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.860 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:52.860 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:52.860 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:52.860 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.118 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:53.118 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.118 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.118 15:55:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:53.118 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.118 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.118 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:53.118 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:16:53.118 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:16:53.118 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:16:53.376 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:16:53.376 "name": "BaseBdev4", 00:16:53.376 "aliases": [ 00:16:53.377 "268bb22b-db16-4e05-8618-9ca9676444ed" 00:16:53.377 ], 00:16:53.377 "product_name": "Malloc disk", 00:16:53.377 "block_size": 512, 00:16:53.377 "num_blocks": 65536, 00:16:53.377 "uuid": "268bb22b-db16-4e05-8618-9ca9676444ed", 00:16:53.377 "assigned_rate_limits": { 00:16:53.377 "rw_ios_per_sec": 0, 00:16:53.377 "rw_mbytes_per_sec": 0, 00:16:53.377 "r_mbytes_per_sec": 0, 00:16:53.377 "w_mbytes_per_sec": 0 00:16:53.377 }, 00:16:53.377 "claimed": true, 00:16:53.377 "claim_type": "exclusive_write", 00:16:53.377 "zoned": false, 00:16:53.377 "supported_io_types": { 00:16:53.377 "read": true, 00:16:53.377 "write": true, 00:16:53.377 "unmap": true, 00:16:53.377 "write_zeroes": true, 00:16:53.377 "flush": true, 00:16:53.377 "reset": true, 00:16:53.377 "compare": false, 00:16:53.377 "compare_and_write": false, 00:16:53.377 "abort": true, 00:16:53.377 "nvme_admin": false, 00:16:53.377 "nvme_io": false 00:16:53.377 }, 00:16:53.377 "memory_domains": [ 00:16:53.377 { 00:16:53.377 "dma_device_id": "system", 
00:16:53.377 "dma_device_type": 1 00:16:53.377 }, 00:16:53.377 { 00:16:53.377 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:53.377 "dma_device_type": 2 00:16:53.377 } 00:16:53.377 ], 00:16:53.377 "driver_specific": {} 00:16:53.377 }' 00:16:53.377 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.635 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:16:53.635 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:16:53.635 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.635 15:55:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:16:53.635 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:16:53.635 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.635 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:16:53.635 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:16:53.635 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.893 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:16:53.893 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:16:53.893 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:54.151 [2024-06-10 15:55:59.458570] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:54.151 [2024-06-10 15:55:59.458594] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:16:54.151 [2024-06-10 15:55:59.458644] bdev_raid.c: 
474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:16:54.151 [2024-06-10 15:55:59.458705] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:16:54.151 [2024-06-10 15:55:59.458720] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1454dd0 name Existed_Raid, state offline 00:16:54.151 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2706553 00:16:54.151 15:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 2706553 ']' 00:16:54.151 15:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 2706553 00:16:54.151 15:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:16:54.151 15:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:16:54.151 15:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2706553 00:16:54.151 15:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:16:54.151 15:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:16:54.151 15:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2706553' 00:16:54.151 killing process with pid 2706553 00:16:54.151 15:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 2706553 00:16:54.152 [2024-06-10 15:55:59.523163] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:16:54.152 15:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 2706553 00:16:54.152 [2024-06-10 15:55:59.557235] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:16:54.410 15:55:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:16:54.410 
00:16:54.410 real 0m32.709s 00:16:54.410 user 1m1.376s 00:16:54.410 sys 0m4.547s 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:16:54.411 ************************************ 00:16:54.411 END TEST raid_state_function_test 00:16:54.411 ************************************ 00:16:54.411 15:55:59 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid0 4 true 00:16:54.411 15:55:59 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:16:54.411 15:55:59 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:16:54.411 15:55:59 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:16:54.411 ************************************ 00:16:54.411 START TEST raid_state_function_test_sb 00:16:54.411 ************************************ 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid0 4 true 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid0 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:54.411 
15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' raid0 '!=' raid1 ']' 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # 
strip_size=64 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2712558 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2712558' 00:16:54.411 Process raid pid: 2712558 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2712558 /var/tmp/spdk-raid.sock 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 2712558 ']' 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:16:54.411 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:16:54.411 15:55:59 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:54.411 [2024-06-10 15:55:59.885189] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:16:54.411 [2024-06-10 15:55:59.885242] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:16:54.670 [2024-06-10 15:55:59.985836] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:16:54.670 [2024-06-10 15:56:00.089661] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:16:54.670 [2024-06-10 15:56:00.149143] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:54.670 [2024-06-10 15:56:00.149173] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:16:55.605 15:56:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:16:55.606 15:56:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:16:55.606 15:56:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:55.606 [2024-06-10 15:56:01.073050] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:55.606 [2024-06-10 15:56:01.073090] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:55.606 [2024-06-10 15:56:01.073103] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:55.606 [2024-06-10 15:56:01.073112] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:55.606 [2024-06-10 15:56:01.073120] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:55.606 [2024-06-10 15:56:01.073128] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:55.606 
[2024-06-10 15:56:01.073135] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:55.606 [2024-06-10 15:56:01.073143] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:55.606 15:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:55.606 15:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:55.606 15:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:55.606 15:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:55.606 15:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:55.606 15:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:55.606 15:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:55.606 15:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:55.606 15:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:55.606 15:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:55.606 15:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:55.606 15:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:55.864 15:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:55.864 "name": "Existed_Raid", 00:16:55.864 "uuid": "1d82c227-ad30-4bdb-87bc-0bf728585d05", 00:16:55.864 
"strip_size_kb": 64, 00:16:55.864 "state": "configuring", 00:16:55.864 "raid_level": "raid0", 00:16:55.864 "superblock": true, 00:16:55.864 "num_base_bdevs": 4, 00:16:55.864 "num_base_bdevs_discovered": 0, 00:16:55.864 "num_base_bdevs_operational": 4, 00:16:55.864 "base_bdevs_list": [ 00:16:55.864 { 00:16:55.864 "name": "BaseBdev1", 00:16:55.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:55.864 "is_configured": false, 00:16:55.864 "data_offset": 0, 00:16:55.864 "data_size": 0 00:16:55.864 }, 00:16:55.864 { 00:16:55.864 "name": "BaseBdev2", 00:16:55.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:55.864 "is_configured": false, 00:16:55.864 "data_offset": 0, 00:16:55.864 "data_size": 0 00:16:55.864 }, 00:16:55.864 { 00:16:55.864 "name": "BaseBdev3", 00:16:55.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:55.864 "is_configured": false, 00:16:55.864 "data_offset": 0, 00:16:55.864 "data_size": 0 00:16:55.864 }, 00:16:55.864 { 00:16:55.864 "name": "BaseBdev4", 00:16:55.864 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:55.864 "is_configured": false, 00:16:55.864 "data_offset": 0, 00:16:55.864 "data_size": 0 00:16:55.864 } 00:16:55.864 ] 00:16:55.864 }' 00:16:55.864 15:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:55.864 15:56:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:56.813 15:56:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:56.813 [2024-06-10 15:56:02.199887] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:56.813 [2024-06-10 15:56:02.199914] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x863140 name Existed_Raid, state configuring 00:16:56.813 15:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:57.076 [2024-06-10 15:56:02.456596] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:16:57.076 [2024-06-10 15:56:02.456627] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:16:57.076 [2024-06-10 15:56:02.456635] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:57.076 [2024-06-10 15:56:02.456643] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:57.076 [2024-06-10 15:56:02.456650] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:57.076 [2024-06-10 15:56:02.456658] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:57.076 [2024-06-10 15:56:02.456665] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:57.076 [2024-06-10 15:56:02.456673] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:57.076 15:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:16:57.335 [2024-06-10 15:56:02.718753] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:57.335 BaseBdev1 00:16:57.335 15:56:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:16:57.335 15:56:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:16:57.335 15:56:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:16:57.335 15:56:02 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:16:57.335 15:56:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:16:57.335 15:56:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:16:57.335 15:56:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:16:57.593 15:56:02 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:16:57.852 [ 00:16:57.852 { 00:16:57.852 "name": "BaseBdev1", 00:16:57.852 "aliases": [ 00:16:57.852 "d32203b9-b9de-4a6b-93dc-44d275d87e4f" 00:16:57.852 ], 00:16:57.852 "product_name": "Malloc disk", 00:16:57.852 "block_size": 512, 00:16:57.852 "num_blocks": 65536, 00:16:57.852 "uuid": "d32203b9-b9de-4a6b-93dc-44d275d87e4f", 00:16:57.852 "assigned_rate_limits": { 00:16:57.852 "rw_ios_per_sec": 0, 00:16:57.852 "rw_mbytes_per_sec": 0, 00:16:57.852 "r_mbytes_per_sec": 0, 00:16:57.852 "w_mbytes_per_sec": 0 00:16:57.852 }, 00:16:57.852 "claimed": true, 00:16:57.852 "claim_type": "exclusive_write", 00:16:57.852 "zoned": false, 00:16:57.852 "supported_io_types": { 00:16:57.852 "read": true, 00:16:57.852 "write": true, 00:16:57.852 "unmap": true, 00:16:57.852 "write_zeroes": true, 00:16:57.852 "flush": true, 00:16:57.852 "reset": true, 00:16:57.852 "compare": false, 00:16:57.852 "compare_and_write": false, 00:16:57.852 "abort": true, 00:16:57.852 "nvme_admin": false, 00:16:57.852 "nvme_io": false 00:16:57.852 }, 00:16:57.852 "memory_domains": [ 00:16:57.852 { 00:16:57.852 "dma_device_id": "system", 00:16:57.852 "dma_device_type": 1 00:16:57.852 }, 00:16:57.852 { 00:16:57.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:16:57.852 
"dma_device_type": 2 00:16:57.852 } 00:16:57.852 ], 00:16:57.852 "driver_specific": {} 00:16:57.852 } 00:16:57.852 ] 00:16:57.852 15:56:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:16:57.852 15:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:57.852 15:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:57.852 15:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:57.852 15:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:57.852 15:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:57.852 15:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:57.852 15:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:57.852 15:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:57.852 15:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:57.852 15:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:57.852 15:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:57.852 15:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:58.110 15:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:58.110 "name": "Existed_Raid", 00:16:58.110 "uuid": "8b03f81b-932d-47be-b120-c7def8b709cb", 00:16:58.110 "strip_size_kb": 64, 
00:16:58.110 "state": "configuring", 00:16:58.110 "raid_level": "raid0", 00:16:58.110 "superblock": true, 00:16:58.110 "num_base_bdevs": 4, 00:16:58.110 "num_base_bdevs_discovered": 1, 00:16:58.110 "num_base_bdevs_operational": 4, 00:16:58.110 "base_bdevs_list": [ 00:16:58.110 { 00:16:58.110 "name": "BaseBdev1", 00:16:58.111 "uuid": "d32203b9-b9de-4a6b-93dc-44d275d87e4f", 00:16:58.111 "is_configured": true, 00:16:58.111 "data_offset": 2048, 00:16:58.111 "data_size": 63488 00:16:58.111 }, 00:16:58.111 { 00:16:58.111 "name": "BaseBdev2", 00:16:58.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.111 "is_configured": false, 00:16:58.111 "data_offset": 0, 00:16:58.111 "data_size": 0 00:16:58.111 }, 00:16:58.111 { 00:16:58.111 "name": "BaseBdev3", 00:16:58.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.111 "is_configured": false, 00:16:58.111 "data_offset": 0, 00:16:58.111 "data_size": 0 00:16:58.111 }, 00:16:58.111 { 00:16:58.111 "name": "BaseBdev4", 00:16:58.111 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:58.111 "is_configured": false, 00:16:58.111 "data_offset": 0, 00:16:58.111 "data_size": 0 00:16:58.111 } 00:16:58.111 ] 00:16:58.111 }' 00:16:58.111 15:56:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:58.111 15:56:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:16:58.677 15:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:16:58.936 [2024-06-10 15:56:04.359150] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:16:58.936 [2024-06-10 15:56:04.359187] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8629b0 name Existed_Raid, state configuring 00:16:58.936 15:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:16:59.194 [2024-06-10 15:56:04.611869] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:16:59.194 [2024-06-10 15:56:04.613388] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:16:59.194 [2024-06-10 15:56:04.613419] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:16:59.194 [2024-06-10 15:56:04.613428] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:16:59.194 [2024-06-10 15:56:04.613436] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:16:59.194 [2024-06-10 15:56:04.613443] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:16:59.194 [2024-06-10 15:56:04.613452] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:16:59.194 15:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:16:59.194 15:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:16:59.194 15:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:16:59.194 15:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:16:59.194 15:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:16:59.194 15:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:16:59.194 15:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:16:59.194 15:56:04 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:16:59.194 15:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:16:59.194 15:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:16:59.194 15:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:16:59.194 15:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:16:59.194 15:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:16:59.194 15:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:16:59.453 15:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:16:59.453 "name": "Existed_Raid", 00:16:59.453 "uuid": "a208290a-4949-445d-87e0-ea101a16fb15", 00:16:59.453 "strip_size_kb": 64, 00:16:59.453 "state": "configuring", 00:16:59.453 "raid_level": "raid0", 00:16:59.453 "superblock": true, 00:16:59.454 "num_base_bdevs": 4, 00:16:59.454 "num_base_bdevs_discovered": 1, 00:16:59.454 "num_base_bdevs_operational": 4, 00:16:59.454 "base_bdevs_list": [ 00:16:59.454 { 00:16:59.454 "name": "BaseBdev1", 00:16:59.454 "uuid": "d32203b9-b9de-4a6b-93dc-44d275d87e4f", 00:16:59.454 "is_configured": true, 00:16:59.454 "data_offset": 2048, 00:16:59.454 "data_size": 63488 00:16:59.454 }, 00:16:59.454 { 00:16:59.454 "name": "BaseBdev2", 00:16:59.454 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:59.454 "is_configured": false, 00:16:59.454 "data_offset": 0, 00:16:59.454 "data_size": 0 00:16:59.454 }, 00:16:59.454 { 00:16:59.454 "name": "BaseBdev3", 00:16:59.454 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:59.454 "is_configured": false, 00:16:59.454 "data_offset": 0, 00:16:59.454 
"data_size": 0 00:16:59.454 }, 00:16:59.454 { 00:16:59.454 "name": "BaseBdev4", 00:16:59.454 "uuid": "00000000-0000-0000-0000-000000000000", 00:16:59.454 "is_configured": false, 00:16:59.454 "data_offset": 0, 00:16:59.454 "data_size": 0 00:16:59.454 } 00:16:59.454 ] 00:16:59.454 }' 00:16:59.454 15:56:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:16:59.454 15:56:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:00.021 15:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:00.280 [2024-06-10 15:56:05.766108] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:00.280 BaseBdev2 00:17:00.280 15:56:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:17:00.280 15:56:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:17:00.280 15:56:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:00.280 15:56:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:00.280 15:56:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:00.280 15:56:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:00.280 15:56:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:00.538 15:56:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:17:00.797 [ 
00:17:00.797 { 00:17:00.797 "name": "BaseBdev2", 00:17:00.797 "aliases": [ 00:17:00.797 "90d3007f-7b3d-4d17-b29c-b05212ca1ba3" 00:17:00.797 ], 00:17:00.797 "product_name": "Malloc disk", 00:17:00.797 "block_size": 512, 00:17:00.797 "num_blocks": 65536, 00:17:00.797 "uuid": "90d3007f-7b3d-4d17-b29c-b05212ca1ba3", 00:17:00.797 "assigned_rate_limits": { 00:17:00.797 "rw_ios_per_sec": 0, 00:17:00.797 "rw_mbytes_per_sec": 0, 00:17:00.797 "r_mbytes_per_sec": 0, 00:17:00.797 "w_mbytes_per_sec": 0 00:17:00.797 }, 00:17:00.797 "claimed": true, 00:17:00.797 "claim_type": "exclusive_write", 00:17:00.797 "zoned": false, 00:17:00.797 "supported_io_types": { 00:17:00.797 "read": true, 00:17:00.797 "write": true, 00:17:00.797 "unmap": true, 00:17:00.797 "write_zeroes": true, 00:17:00.797 "flush": true, 00:17:00.797 "reset": true, 00:17:00.797 "compare": false, 00:17:00.797 "compare_and_write": false, 00:17:00.797 "abort": true, 00:17:00.797 "nvme_admin": false, 00:17:00.797 "nvme_io": false 00:17:00.797 }, 00:17:00.797 "memory_domains": [ 00:17:00.797 { 00:17:00.797 "dma_device_id": "system", 00:17:00.797 "dma_device_type": 1 00:17:00.797 }, 00:17:00.797 { 00:17:00.797 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:00.797 "dma_device_type": 2 00:17:00.797 } 00:17:00.797 ], 00:17:00.797 "driver_specific": {} 00:17:00.797 } 00:17:00.797 ] 00:17:01.056 15:56:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:01.056 15:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:01.056 15:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:01.056 15:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:01.057 15:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:01.057 15:56:06 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:01.057 15:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:01.057 15:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:01.057 15:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:01.057 15:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:01.057 15:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:01.057 15:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:01.057 15:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:01.057 15:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:01.057 15:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:01.316 15:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:01.316 "name": "Existed_Raid", 00:17:01.316 "uuid": "a208290a-4949-445d-87e0-ea101a16fb15", 00:17:01.316 "strip_size_kb": 64, 00:17:01.316 "state": "configuring", 00:17:01.316 "raid_level": "raid0", 00:17:01.316 "superblock": true, 00:17:01.316 "num_base_bdevs": 4, 00:17:01.316 "num_base_bdevs_discovered": 2, 00:17:01.316 "num_base_bdevs_operational": 4, 00:17:01.316 "base_bdevs_list": [ 00:17:01.316 { 00:17:01.316 "name": "BaseBdev1", 00:17:01.316 "uuid": "d32203b9-b9de-4a6b-93dc-44d275d87e4f", 00:17:01.316 "is_configured": true, 00:17:01.316 "data_offset": 2048, 00:17:01.316 "data_size": 63488 00:17:01.316 }, 00:17:01.316 { 00:17:01.316 "name": "BaseBdev2", 00:17:01.316 "uuid": 
"90d3007f-7b3d-4d17-b29c-b05212ca1ba3", 00:17:01.316 "is_configured": true, 00:17:01.316 "data_offset": 2048, 00:17:01.316 "data_size": 63488 00:17:01.316 }, 00:17:01.316 { 00:17:01.316 "name": "BaseBdev3", 00:17:01.316 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:01.316 "is_configured": false, 00:17:01.316 "data_offset": 0, 00:17:01.316 "data_size": 0 00:17:01.316 }, 00:17:01.316 { 00:17:01.316 "name": "BaseBdev4", 00:17:01.316 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:01.316 "is_configured": false, 00:17:01.316 "data_offset": 0, 00:17:01.316 "data_size": 0 00:17:01.316 } 00:17:01.316 ] 00:17:01.316 }' 00:17:01.316 15:56:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:01.316 15:56:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:01.882 15:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:02.141 [2024-06-10 15:56:07.409766] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:02.142 BaseBdev3 00:17:02.142 15:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:17:02.142 15:56:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:17:02.142 15:56:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:02.142 15:56:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:02.142 15:56:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:02.142 15:56:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:02.142 15:56:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:02.459 15:56:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:02.459 [ 00:17:02.459 { 00:17:02.459 "name": "BaseBdev3", 00:17:02.459 "aliases": [ 00:17:02.459 "ad6de44d-2079-43bb-b30e-d24d73563fa0" 00:17:02.459 ], 00:17:02.459 "product_name": "Malloc disk", 00:17:02.459 "block_size": 512, 00:17:02.459 "num_blocks": 65536, 00:17:02.459 "uuid": "ad6de44d-2079-43bb-b30e-d24d73563fa0", 00:17:02.459 "assigned_rate_limits": { 00:17:02.459 "rw_ios_per_sec": 0, 00:17:02.459 "rw_mbytes_per_sec": 0, 00:17:02.459 "r_mbytes_per_sec": 0, 00:17:02.459 "w_mbytes_per_sec": 0 00:17:02.459 }, 00:17:02.459 "claimed": true, 00:17:02.459 "claim_type": "exclusive_write", 00:17:02.459 "zoned": false, 00:17:02.459 "supported_io_types": { 00:17:02.459 "read": true, 00:17:02.459 "write": true, 00:17:02.459 "unmap": true, 00:17:02.459 "write_zeroes": true, 00:17:02.459 "flush": true, 00:17:02.459 "reset": true, 00:17:02.459 "compare": false, 00:17:02.459 "compare_and_write": false, 00:17:02.459 "abort": true, 00:17:02.459 "nvme_admin": false, 00:17:02.459 "nvme_io": false 00:17:02.459 }, 00:17:02.459 "memory_domains": [ 00:17:02.459 { 00:17:02.459 "dma_device_id": "system", 00:17:02.459 "dma_device_type": 1 00:17:02.459 }, 00:17:02.459 { 00:17:02.459 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:02.459 "dma_device_type": 2 00:17:02.459 } 00:17:02.459 ], 00:17:02.459 "driver_specific": {} 00:17:02.459 } 00:17:02.459 ] 00:17:02.459 15:56:07 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:02.459 15:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:02.459 15:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < 
num_base_bdevs )) 00:17:02.460 15:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:02.460 15:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:02.460 15:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:02.460 15:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:02.460 15:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:02.460 15:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:02.460 15:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:02.460 15:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:02.460 15:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:02.460 15:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:02.460 15:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:02.460 15:56:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:02.727 15:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:02.727 "name": "Existed_Raid", 00:17:02.727 "uuid": "a208290a-4949-445d-87e0-ea101a16fb15", 00:17:02.727 "strip_size_kb": 64, 00:17:02.727 "state": "configuring", 00:17:02.727 "raid_level": "raid0", 00:17:02.727 "superblock": true, 00:17:02.727 "num_base_bdevs": 4, 00:17:02.727 "num_base_bdevs_discovered": 3, 00:17:02.727 
"num_base_bdevs_operational": 4, 00:17:02.727 "base_bdevs_list": [ 00:17:02.727 { 00:17:02.728 "name": "BaseBdev1", 00:17:02.728 "uuid": "d32203b9-b9de-4a6b-93dc-44d275d87e4f", 00:17:02.728 "is_configured": true, 00:17:02.728 "data_offset": 2048, 00:17:02.728 "data_size": 63488 00:17:02.728 }, 00:17:02.728 { 00:17:02.728 "name": "BaseBdev2", 00:17:02.728 "uuid": "90d3007f-7b3d-4d17-b29c-b05212ca1ba3", 00:17:02.728 "is_configured": true, 00:17:02.728 "data_offset": 2048, 00:17:02.728 "data_size": 63488 00:17:02.728 }, 00:17:02.728 { 00:17:02.728 "name": "BaseBdev3", 00:17:02.728 "uuid": "ad6de44d-2079-43bb-b30e-d24d73563fa0", 00:17:02.728 "is_configured": true, 00:17:02.728 "data_offset": 2048, 00:17:02.728 "data_size": 63488 00:17:02.728 }, 00:17:02.728 { 00:17:02.728 "name": "BaseBdev4", 00:17:02.728 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:02.728 "is_configured": false, 00:17:02.728 "data_offset": 0, 00:17:02.728 "data_size": 0 00:17:02.728 } 00:17:02.728 ] 00:17:02.728 }' 00:17:02.728 15:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:02.728 15:56:08 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:03.663 15:56:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:03.663 [2024-06-10 15:56:09.053351] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:03.663 [2024-06-10 15:56:09.053510] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x863ac0 00:17:03.663 [2024-06-10 15:56:09.053523] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:03.663 [2024-06-10 15:56:09.053708] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa08ba0 00:17:03.663 [2024-06-10 15:56:09.053836] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x863ac0 00:17:03.663 [2024-06-10 15:56:09.053845] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x863ac0 00:17:03.663 [2024-06-10 15:56:09.053938] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:03.663 BaseBdev4 00:17:03.663 15:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:17:03.663 15:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:17:03.663 15:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:03.663 15:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:03.663 15:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:03.663 15:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:03.663 15:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:03.922 15:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:17:04.180 [ 00:17:04.180 { 00:17:04.180 "name": "BaseBdev4", 00:17:04.180 "aliases": [ 00:17:04.180 "fda7d920-3bae-46e1-bac8-0acdf0ec4b35" 00:17:04.180 ], 00:17:04.180 "product_name": "Malloc disk", 00:17:04.180 "block_size": 512, 00:17:04.180 "num_blocks": 65536, 00:17:04.181 "uuid": "fda7d920-3bae-46e1-bac8-0acdf0ec4b35", 00:17:04.181 "assigned_rate_limits": { 00:17:04.181 "rw_ios_per_sec": 0, 00:17:04.181 "rw_mbytes_per_sec": 0, 00:17:04.181 "r_mbytes_per_sec": 0, 00:17:04.181 "w_mbytes_per_sec": 
0 00:17:04.181 }, 00:17:04.181 "claimed": true, 00:17:04.181 "claim_type": "exclusive_write", 00:17:04.181 "zoned": false, 00:17:04.181 "supported_io_types": { 00:17:04.181 "read": true, 00:17:04.181 "write": true, 00:17:04.181 "unmap": true, 00:17:04.181 "write_zeroes": true, 00:17:04.181 "flush": true, 00:17:04.181 "reset": true, 00:17:04.181 "compare": false, 00:17:04.181 "compare_and_write": false, 00:17:04.181 "abort": true, 00:17:04.181 "nvme_admin": false, 00:17:04.181 "nvme_io": false 00:17:04.181 }, 00:17:04.181 "memory_domains": [ 00:17:04.181 { 00:17:04.181 "dma_device_id": "system", 00:17:04.181 "dma_device_type": 1 00:17:04.181 }, 00:17:04.181 { 00:17:04.181 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:04.181 "dma_device_type": 2 00:17:04.181 } 00:17:04.181 ], 00:17:04.181 "driver_specific": {} 00:17:04.181 } 00:17:04.181 ] 00:17:04.181 15:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:04.181 15:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:17:04.181 15:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:17:04.181 15:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:17:04.181 15:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:04.181 15:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:04.181 15:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:04.181 15:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:04.181 15:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:04.181 15:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # 
local raid_bdev_info 00:17:04.181 15:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:04.181 15:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:04.181 15:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:04.181 15:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:04.181 15:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:04.439 15:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:04.439 "name": "Existed_Raid", 00:17:04.439 "uuid": "a208290a-4949-445d-87e0-ea101a16fb15", 00:17:04.439 "strip_size_kb": 64, 00:17:04.439 "state": "online", 00:17:04.439 "raid_level": "raid0", 00:17:04.439 "superblock": true, 00:17:04.439 "num_base_bdevs": 4, 00:17:04.439 "num_base_bdevs_discovered": 4, 00:17:04.439 "num_base_bdevs_operational": 4, 00:17:04.439 "base_bdevs_list": [ 00:17:04.439 { 00:17:04.439 "name": "BaseBdev1", 00:17:04.439 "uuid": "d32203b9-b9de-4a6b-93dc-44d275d87e4f", 00:17:04.439 "is_configured": true, 00:17:04.439 "data_offset": 2048, 00:17:04.439 "data_size": 63488 00:17:04.439 }, 00:17:04.439 { 00:17:04.439 "name": "BaseBdev2", 00:17:04.439 "uuid": "90d3007f-7b3d-4d17-b29c-b05212ca1ba3", 00:17:04.439 "is_configured": true, 00:17:04.439 "data_offset": 2048, 00:17:04.439 "data_size": 63488 00:17:04.439 }, 00:17:04.439 { 00:17:04.439 "name": "BaseBdev3", 00:17:04.439 "uuid": "ad6de44d-2079-43bb-b30e-d24d73563fa0", 00:17:04.439 "is_configured": true, 00:17:04.439 "data_offset": 2048, 00:17:04.439 "data_size": 63488 00:17:04.439 }, 00:17:04.439 { 00:17:04.439 "name": "BaseBdev4", 00:17:04.439 "uuid": "fda7d920-3bae-46e1-bac8-0acdf0ec4b35", 
00:17:04.439 "is_configured": true, 00:17:04.439 "data_offset": 2048, 00:17:04.439 "data_size": 63488 00:17:04.439 } 00:17:04.439 ] 00:17:04.439 }' 00:17:04.439 15:56:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:04.439 15:56:09 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:05.006 15:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:17:05.006 15:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:05.006 15:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:05.006 15:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:05.006 15:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:05.006 15:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:05.006 15:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:05.006 15:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:05.263 [2024-06-10 15:56:10.698102] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:05.264 15:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:05.264 "name": "Existed_Raid", 00:17:05.264 "aliases": [ 00:17:05.264 "a208290a-4949-445d-87e0-ea101a16fb15" 00:17:05.264 ], 00:17:05.264 "product_name": "Raid Volume", 00:17:05.264 "block_size": 512, 00:17:05.264 "num_blocks": 253952, 00:17:05.264 "uuid": "a208290a-4949-445d-87e0-ea101a16fb15", 00:17:05.264 "assigned_rate_limits": { 00:17:05.264 "rw_ios_per_sec": 0, 00:17:05.264 "rw_mbytes_per_sec": 0, 
00:17:05.264 "r_mbytes_per_sec": 0, 00:17:05.264 "w_mbytes_per_sec": 0 00:17:05.264 }, 00:17:05.264 "claimed": false, 00:17:05.264 "zoned": false, 00:17:05.264 "supported_io_types": { 00:17:05.264 "read": true, 00:17:05.264 "write": true, 00:17:05.264 "unmap": true, 00:17:05.264 "write_zeroes": true, 00:17:05.264 "flush": true, 00:17:05.264 "reset": true, 00:17:05.264 "compare": false, 00:17:05.264 "compare_and_write": false, 00:17:05.264 "abort": false, 00:17:05.264 "nvme_admin": false, 00:17:05.264 "nvme_io": false 00:17:05.264 }, 00:17:05.264 "memory_domains": [ 00:17:05.264 { 00:17:05.264 "dma_device_id": "system", 00:17:05.264 "dma_device_type": 1 00:17:05.264 }, 00:17:05.264 { 00:17:05.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.264 "dma_device_type": 2 00:17:05.264 }, 00:17:05.264 { 00:17:05.264 "dma_device_id": "system", 00:17:05.264 "dma_device_type": 1 00:17:05.264 }, 00:17:05.264 { 00:17:05.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.264 "dma_device_type": 2 00:17:05.264 }, 00:17:05.264 { 00:17:05.264 "dma_device_id": "system", 00:17:05.264 "dma_device_type": 1 00:17:05.264 }, 00:17:05.264 { 00:17:05.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.264 "dma_device_type": 2 00:17:05.264 }, 00:17:05.264 { 00:17:05.264 "dma_device_id": "system", 00:17:05.264 "dma_device_type": 1 00:17:05.264 }, 00:17:05.264 { 00:17:05.264 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.264 "dma_device_type": 2 00:17:05.264 } 00:17:05.264 ], 00:17:05.264 "driver_specific": { 00:17:05.264 "raid": { 00:17:05.264 "uuid": "a208290a-4949-445d-87e0-ea101a16fb15", 00:17:05.264 "strip_size_kb": 64, 00:17:05.264 "state": "online", 00:17:05.264 "raid_level": "raid0", 00:17:05.264 "superblock": true, 00:17:05.264 "num_base_bdevs": 4, 00:17:05.264 "num_base_bdevs_discovered": 4, 00:17:05.264 "num_base_bdevs_operational": 4, 00:17:05.264 "base_bdevs_list": [ 00:17:05.264 { 00:17:05.264 "name": "BaseBdev1", 00:17:05.264 "uuid": 
"d32203b9-b9de-4a6b-93dc-44d275d87e4f", 00:17:05.264 "is_configured": true, 00:17:05.264 "data_offset": 2048, 00:17:05.264 "data_size": 63488 00:17:05.264 }, 00:17:05.264 { 00:17:05.264 "name": "BaseBdev2", 00:17:05.264 "uuid": "90d3007f-7b3d-4d17-b29c-b05212ca1ba3", 00:17:05.264 "is_configured": true, 00:17:05.264 "data_offset": 2048, 00:17:05.264 "data_size": 63488 00:17:05.264 }, 00:17:05.264 { 00:17:05.264 "name": "BaseBdev3", 00:17:05.264 "uuid": "ad6de44d-2079-43bb-b30e-d24d73563fa0", 00:17:05.264 "is_configured": true, 00:17:05.264 "data_offset": 2048, 00:17:05.264 "data_size": 63488 00:17:05.264 }, 00:17:05.264 { 00:17:05.264 "name": "BaseBdev4", 00:17:05.264 "uuid": "fda7d920-3bae-46e1-bac8-0acdf0ec4b35", 00:17:05.264 "is_configured": true, 00:17:05.264 "data_offset": 2048, 00:17:05.264 "data_size": 63488 00:17:05.264 } 00:17:05.264 ] 00:17:05.264 } 00:17:05.264 } 00:17:05.264 }' 00:17:05.264 15:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:05.264 15:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:17:05.264 BaseBdev2 00:17:05.264 BaseBdev3 00:17:05.264 BaseBdev4' 00:17:05.264 15:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:05.521 15:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:17:05.521 15:56:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:05.778 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:05.778 "name": "BaseBdev1", 00:17:05.778 "aliases": [ 00:17:05.778 "d32203b9-b9de-4a6b-93dc-44d275d87e4f" 00:17:05.778 ], 00:17:05.778 "product_name": "Malloc disk", 00:17:05.778 "block_size": 
512, 00:17:05.778 "num_blocks": 65536, 00:17:05.778 "uuid": "d32203b9-b9de-4a6b-93dc-44d275d87e4f", 00:17:05.778 "assigned_rate_limits": { 00:17:05.778 "rw_ios_per_sec": 0, 00:17:05.778 "rw_mbytes_per_sec": 0, 00:17:05.778 "r_mbytes_per_sec": 0, 00:17:05.778 "w_mbytes_per_sec": 0 00:17:05.778 }, 00:17:05.778 "claimed": true, 00:17:05.778 "claim_type": "exclusive_write", 00:17:05.778 "zoned": false, 00:17:05.778 "supported_io_types": { 00:17:05.778 "read": true, 00:17:05.778 "write": true, 00:17:05.778 "unmap": true, 00:17:05.778 "write_zeroes": true, 00:17:05.778 "flush": true, 00:17:05.778 "reset": true, 00:17:05.778 "compare": false, 00:17:05.778 "compare_and_write": false, 00:17:05.778 "abort": true, 00:17:05.778 "nvme_admin": false, 00:17:05.778 "nvme_io": false 00:17:05.778 }, 00:17:05.778 "memory_domains": [ 00:17:05.778 { 00:17:05.778 "dma_device_id": "system", 00:17:05.778 "dma_device_type": 1 00:17:05.778 }, 00:17:05.778 { 00:17:05.778 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:05.778 "dma_device_type": 2 00:17:05.778 } 00:17:05.778 ], 00:17:05.778 "driver_specific": {} 00:17:05.778 }' 00:17:05.778 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:05.778 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:05.778 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:05.778 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:05.778 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:05.778 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:05.778 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:05.778 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:05.778 15:56:11 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:05.778 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:06.035 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:06.035 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:06.035 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:06.035 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:17:06.035 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:06.293 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:06.293 "name": "BaseBdev2", 00:17:06.293 "aliases": [ 00:17:06.293 "90d3007f-7b3d-4d17-b29c-b05212ca1ba3" 00:17:06.293 ], 00:17:06.293 "product_name": "Malloc disk", 00:17:06.293 "block_size": 512, 00:17:06.293 "num_blocks": 65536, 00:17:06.293 "uuid": "90d3007f-7b3d-4d17-b29c-b05212ca1ba3", 00:17:06.293 "assigned_rate_limits": { 00:17:06.293 "rw_ios_per_sec": 0, 00:17:06.293 "rw_mbytes_per_sec": 0, 00:17:06.293 "r_mbytes_per_sec": 0, 00:17:06.293 "w_mbytes_per_sec": 0 00:17:06.293 }, 00:17:06.293 "claimed": true, 00:17:06.293 "claim_type": "exclusive_write", 00:17:06.293 "zoned": false, 00:17:06.293 "supported_io_types": { 00:17:06.293 "read": true, 00:17:06.293 "write": true, 00:17:06.293 "unmap": true, 00:17:06.293 "write_zeroes": true, 00:17:06.293 "flush": true, 00:17:06.293 "reset": true, 00:17:06.293 "compare": false, 00:17:06.293 "compare_and_write": false, 00:17:06.293 "abort": true, 00:17:06.293 "nvme_admin": false, 00:17:06.293 "nvme_io": false 00:17:06.293 }, 00:17:06.293 "memory_domains": [ 00:17:06.293 { 00:17:06.293 
"dma_device_id": "system", 00:17:06.293 "dma_device_type": 1 00:17:06.293 }, 00:17:06.293 { 00:17:06.293 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.293 "dma_device_type": 2 00:17:06.293 } 00:17:06.293 ], 00:17:06.293 "driver_specific": {} 00:17:06.293 }' 00:17:06.293 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.293 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.293 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:06.293 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.293 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.293 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:06.293 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.293 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.293 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:06.293 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:06.550 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:06.550 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:06.550 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:06.550 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:06.550 15:56:11 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:06.550 15:56:12 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:06.550 "name": "BaseBdev3", 00:17:06.550 "aliases": [ 00:17:06.550 "ad6de44d-2079-43bb-b30e-d24d73563fa0" 00:17:06.550 ], 00:17:06.550 "product_name": "Malloc disk", 00:17:06.550 "block_size": 512, 00:17:06.550 "num_blocks": 65536, 00:17:06.550 "uuid": "ad6de44d-2079-43bb-b30e-d24d73563fa0", 00:17:06.550 "assigned_rate_limits": { 00:17:06.550 "rw_ios_per_sec": 0, 00:17:06.550 "rw_mbytes_per_sec": 0, 00:17:06.550 "r_mbytes_per_sec": 0, 00:17:06.550 "w_mbytes_per_sec": 0 00:17:06.550 }, 00:17:06.550 "claimed": true, 00:17:06.550 "claim_type": "exclusive_write", 00:17:06.550 "zoned": false, 00:17:06.550 "supported_io_types": { 00:17:06.550 "read": true, 00:17:06.550 "write": true, 00:17:06.550 "unmap": true, 00:17:06.550 "write_zeroes": true, 00:17:06.550 "flush": true, 00:17:06.550 "reset": true, 00:17:06.550 "compare": false, 00:17:06.550 "compare_and_write": false, 00:17:06.550 "abort": true, 00:17:06.550 "nvme_admin": false, 00:17:06.550 "nvme_io": false 00:17:06.550 }, 00:17:06.550 "memory_domains": [ 00:17:06.550 { 00:17:06.550 "dma_device_id": "system", 00:17:06.550 "dma_device_type": 1 00:17:06.550 }, 00:17:06.550 { 00:17:06.550 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:06.550 "dma_device_type": 2 00:17:06.550 } 00:17:06.550 ], 00:17:06.550 "driver_specific": {} 00:17:06.550 }' 00:17:06.550 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.815 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:06.815 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:06.815 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.815 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:06.815 15:56:12 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:06.815 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.815 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:06.815 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:06.815 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:06.815 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.074 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:07.074 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:07.074 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:07.074 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:07.334 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:07.334 "name": "BaseBdev4", 00:17:07.334 "aliases": [ 00:17:07.334 "fda7d920-3bae-46e1-bac8-0acdf0ec4b35" 00:17:07.334 ], 00:17:07.334 "product_name": "Malloc disk", 00:17:07.334 "block_size": 512, 00:17:07.334 "num_blocks": 65536, 00:17:07.334 "uuid": "fda7d920-3bae-46e1-bac8-0acdf0ec4b35", 00:17:07.334 "assigned_rate_limits": { 00:17:07.334 "rw_ios_per_sec": 0, 00:17:07.334 "rw_mbytes_per_sec": 0, 00:17:07.334 "r_mbytes_per_sec": 0, 00:17:07.334 "w_mbytes_per_sec": 0 00:17:07.334 }, 00:17:07.334 "claimed": true, 00:17:07.334 "claim_type": "exclusive_write", 00:17:07.334 "zoned": false, 00:17:07.334 "supported_io_types": { 00:17:07.334 "read": true, 00:17:07.334 "write": true, 00:17:07.334 "unmap": true, 00:17:07.334 "write_zeroes": true, 00:17:07.334 "flush": 
true, 00:17:07.334 "reset": true, 00:17:07.334 "compare": false, 00:17:07.334 "compare_and_write": false, 00:17:07.334 "abort": true, 00:17:07.334 "nvme_admin": false, 00:17:07.334 "nvme_io": false 00:17:07.334 }, 00:17:07.334 "memory_domains": [ 00:17:07.334 { 00:17:07.334 "dma_device_id": "system", 00:17:07.334 "dma_device_type": 1 00:17:07.334 }, 00:17:07.334 { 00:17:07.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:07.334 "dma_device_type": 2 00:17:07.334 } 00:17:07.334 ], 00:17:07.334 "driver_specific": {} 00:17:07.334 }' 00:17:07.334 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.334 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:07.334 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:07.334 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:07.334 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:07.334 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:07.334 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.334 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:07.593 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:07.593 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.593 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:07.593 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:07.593 15:56:12 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev1 00:17:07.852 [2024-06-10 15:56:13.180557] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:07.852 [2024-06-10 15:56:13.180583] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:07.852 [2024-06-10 15:56:13.180630] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:07.852 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:17:07.852 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid0 00:17:07.852 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:07.852 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:17:07.852 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:17:07.852 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline raid0 64 3 00:17:07.852 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:07.852 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:17:07.852 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:07.852 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:07.852 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:17:07.852 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:07.852 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:07.852 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:17:07.852 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:07.852 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:07.852 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:08.110 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:08.110 "name": "Existed_Raid", 00:17:08.110 "uuid": "a208290a-4949-445d-87e0-ea101a16fb15", 00:17:08.110 "strip_size_kb": 64, 00:17:08.110 "state": "offline", 00:17:08.110 "raid_level": "raid0", 00:17:08.110 "superblock": true, 00:17:08.110 "num_base_bdevs": 4, 00:17:08.110 "num_base_bdevs_discovered": 3, 00:17:08.110 "num_base_bdevs_operational": 3, 00:17:08.110 "base_bdevs_list": [ 00:17:08.110 { 00:17:08.110 "name": null, 00:17:08.110 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:08.110 "is_configured": false, 00:17:08.110 "data_offset": 2048, 00:17:08.110 "data_size": 63488 00:17:08.110 }, 00:17:08.110 { 00:17:08.110 "name": "BaseBdev2", 00:17:08.110 "uuid": "90d3007f-7b3d-4d17-b29c-b05212ca1ba3", 00:17:08.111 "is_configured": true, 00:17:08.111 "data_offset": 2048, 00:17:08.111 "data_size": 63488 00:17:08.111 }, 00:17:08.111 { 00:17:08.111 "name": "BaseBdev3", 00:17:08.111 "uuid": "ad6de44d-2079-43bb-b30e-d24d73563fa0", 00:17:08.111 "is_configured": true, 00:17:08.111 "data_offset": 2048, 00:17:08.111 "data_size": 63488 00:17:08.111 }, 00:17:08.111 { 00:17:08.111 "name": "BaseBdev4", 00:17:08.111 "uuid": "fda7d920-3bae-46e1-bac8-0acdf0ec4b35", 00:17:08.111 "is_configured": true, 00:17:08.111 "data_offset": 2048, 00:17:08.111 "data_size": 63488 00:17:08.111 } 00:17:08.111 ] 00:17:08.111 }' 00:17:08.111 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:08.111 
15:56:13 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:08.677 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:17:08.677 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:08.677 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.677 15:56:13 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:08.677 15:56:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:08.677 15:56:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:08.677 15:56:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:17:08.934 [2024-06-10 15:56:14.336870] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:08.934 15:56:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:08.934 15:56:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:08.934 15:56:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:08.935 15:56:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:09.193 15:56:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:09.193 15:56:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:09.193 15:56:14 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:17:09.452 [2024-06-10 15:56:14.760412] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:09.452 15:56:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:09.452 15:56:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:09.452 15:56:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.452 15:56:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:17:09.452 15:56:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:17:09.452 15:56:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:17:09.452 15:56:14 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:17:09.711 [2024-06-10 15:56:15.183888] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:17:09.711 [2024-06-10 15:56:15.183930] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x863ac0 name Existed_Raid, state offline 00:17:09.711 15:56:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:17:09.711 15:56:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:17:09.711 15:56:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:17:09.711 15:56:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:09.969 15:56:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:17:09.969 15:56:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:17:09.969 15:56:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:17:09.969 15:56:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:17:09.969 15:56:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:09.969 15:56:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:17:10.226 BaseBdev2 00:17:10.226 15:56:15 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:17:10.226 15:56:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:17:10.226 15:56:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:10.226 15:56:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:10.226 15:56:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:10.226 15:56:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:10.226 15:56:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:10.484 15:56:15 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 
2000 00:17:10.742 [ 00:17:10.742 { 00:17:10.742 "name": "BaseBdev2", 00:17:10.742 "aliases": [ 00:17:10.742 "19c7cfbc-df3a-4319-b6dd-c07945240625" 00:17:10.742 ], 00:17:10.742 "product_name": "Malloc disk", 00:17:10.742 "block_size": 512, 00:17:10.742 "num_blocks": 65536, 00:17:10.742 "uuid": "19c7cfbc-df3a-4319-b6dd-c07945240625", 00:17:10.742 "assigned_rate_limits": { 00:17:10.742 "rw_ios_per_sec": 0, 00:17:10.742 "rw_mbytes_per_sec": 0, 00:17:10.742 "r_mbytes_per_sec": 0, 00:17:10.742 "w_mbytes_per_sec": 0 00:17:10.742 }, 00:17:10.742 "claimed": false, 00:17:10.742 "zoned": false, 00:17:10.742 "supported_io_types": { 00:17:10.742 "read": true, 00:17:10.742 "write": true, 00:17:10.742 "unmap": true, 00:17:10.742 "write_zeroes": true, 00:17:10.742 "flush": true, 00:17:10.742 "reset": true, 00:17:10.742 "compare": false, 00:17:10.742 "compare_and_write": false, 00:17:10.742 "abort": true, 00:17:10.742 "nvme_admin": false, 00:17:10.742 "nvme_io": false 00:17:10.742 }, 00:17:10.742 "memory_domains": [ 00:17:10.742 { 00:17:10.742 "dma_device_id": "system", 00:17:10.742 "dma_device_type": 1 00:17:10.742 }, 00:17:10.742 { 00:17:10.742 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:10.742 "dma_device_type": 2 00:17:10.742 } 00:17:10.742 ], 00:17:10.742 "driver_specific": {} 00:17:10.742 } 00:17:10.742 ] 00:17:10.742 15:56:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:10.742 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:10.742 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:10.742 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:17:11.001 BaseBdev3 00:17:11.001 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 
00:17:11.001 15:56:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:17:11.001 15:56:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:11.001 15:56:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:11.001 15:56:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:11.001 15:56:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:11.001 15:56:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:11.001 15:56:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:17:11.259 [ 00:17:11.259 { 00:17:11.259 "name": "BaseBdev3", 00:17:11.259 "aliases": [ 00:17:11.259 "20876688-0a04-45c3-ab92-d94a460eef47" 00:17:11.259 ], 00:17:11.259 "product_name": "Malloc disk", 00:17:11.259 "block_size": 512, 00:17:11.259 "num_blocks": 65536, 00:17:11.259 "uuid": "20876688-0a04-45c3-ab92-d94a460eef47", 00:17:11.259 "assigned_rate_limits": { 00:17:11.259 "rw_ios_per_sec": 0, 00:17:11.259 "rw_mbytes_per_sec": 0, 00:17:11.259 "r_mbytes_per_sec": 0, 00:17:11.259 "w_mbytes_per_sec": 0 00:17:11.259 }, 00:17:11.259 "claimed": false, 00:17:11.259 "zoned": false, 00:17:11.259 "supported_io_types": { 00:17:11.259 "read": true, 00:17:11.259 "write": true, 00:17:11.259 "unmap": true, 00:17:11.259 "write_zeroes": true, 00:17:11.259 "flush": true, 00:17:11.259 "reset": true, 00:17:11.259 "compare": false, 00:17:11.259 "compare_and_write": false, 00:17:11.259 "abort": true, 00:17:11.259 "nvme_admin": false, 00:17:11.259 "nvme_io": false 00:17:11.259 }, 00:17:11.259 
"memory_domains": [ 00:17:11.259 { 00:17:11.259 "dma_device_id": "system", 00:17:11.259 "dma_device_type": 1 00:17:11.259 }, 00:17:11.259 { 00:17:11.259 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:11.259 "dma_device_type": 2 00:17:11.259 } 00:17:11.259 ], 00:17:11.259 "driver_specific": {} 00:17:11.259 } 00:17:11.259 ] 00:17:11.259 15:56:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:11.259 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:11.259 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:11.259 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:17:11.517 BaseBdev4 00:17:11.517 15:56:16 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:17:11.517 15:56:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:17:11.517 15:56:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:11.517 15:56:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:11.517 15:56:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:11.517 15:56:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:11.517 15:56:16 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:11.775 15:56:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev4 -t 2000 00:17:11.775 [ 00:17:11.775 { 00:17:11.775 "name": "BaseBdev4", 00:17:11.775 "aliases": [ 00:17:11.775 "34e6807a-5660-4fb1-8574-c4356c5e97a2" 00:17:11.775 ], 00:17:11.775 "product_name": "Malloc disk", 00:17:11.775 "block_size": 512, 00:17:11.775 "num_blocks": 65536, 00:17:11.775 "uuid": "34e6807a-5660-4fb1-8574-c4356c5e97a2", 00:17:11.775 "assigned_rate_limits": { 00:17:11.775 "rw_ios_per_sec": 0, 00:17:11.775 "rw_mbytes_per_sec": 0, 00:17:11.775 "r_mbytes_per_sec": 0, 00:17:11.775 "w_mbytes_per_sec": 0 00:17:11.775 }, 00:17:11.775 "claimed": false, 00:17:11.775 "zoned": false, 00:17:11.775 "supported_io_types": { 00:17:11.775 "read": true, 00:17:11.775 "write": true, 00:17:11.775 "unmap": true, 00:17:11.775 "write_zeroes": true, 00:17:11.775 "flush": true, 00:17:11.775 "reset": true, 00:17:11.775 "compare": false, 00:17:11.775 "compare_and_write": false, 00:17:11.775 "abort": true, 00:17:11.775 "nvme_admin": false, 00:17:11.775 "nvme_io": false 00:17:11.775 }, 00:17:11.775 "memory_domains": [ 00:17:11.775 { 00:17:11.775 "dma_device_id": "system", 00:17:11.775 "dma_device_type": 1 00:17:11.775 }, 00:17:11.775 { 00:17:11.775 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:11.775 "dma_device_type": 2 00:17:11.775 } 00:17:11.775 ], 00:17:11.775 "driver_specific": {} 00:17:11.775 } 00:17:11.775 ] 00:17:12.033 15:56:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:12.034 15:56:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:17:12.034 15:56:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:17:12.034 15:56:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:12.034 [2024-06-10 15:56:17.521903] bdev.c:8114:bdev_open_ext: 
*NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:12.034 [2024-06-10 15:56:17.521939] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:12.034 [2024-06-10 15:56:17.521965] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:12.034 [2024-06-10 15:56:17.523359] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:12.034 [2024-06-10 15:56:17.523401] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:12.034 15:56:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:12.034 15:56:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:12.034 15:56:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:12.034 15:56:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:12.034 15:56:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:12.034 15:56:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:12.034 15:56:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:12.034 15:56:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:12.034 15:56:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:12.034 15:56:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:12.292 15:56:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:12.292 15:56:17 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:12.292 15:56:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:12.292 "name": "Existed_Raid", 00:17:12.292 "uuid": "7c5c6684-c27a-4498-9049-04e768617af7", 00:17:12.292 "strip_size_kb": 64, 00:17:12.292 "state": "configuring", 00:17:12.292 "raid_level": "raid0", 00:17:12.292 "superblock": true, 00:17:12.292 "num_base_bdevs": 4, 00:17:12.292 "num_base_bdevs_discovered": 3, 00:17:12.292 "num_base_bdevs_operational": 4, 00:17:12.292 "base_bdevs_list": [ 00:17:12.292 { 00:17:12.292 "name": "BaseBdev1", 00:17:12.292 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:12.292 "is_configured": false, 00:17:12.292 "data_offset": 0, 00:17:12.292 "data_size": 0 00:17:12.292 }, 00:17:12.292 { 00:17:12.292 "name": "BaseBdev2", 00:17:12.292 "uuid": "19c7cfbc-df3a-4319-b6dd-c07945240625", 00:17:12.292 "is_configured": true, 00:17:12.292 "data_offset": 2048, 00:17:12.292 "data_size": 63488 00:17:12.292 }, 00:17:12.292 { 00:17:12.292 "name": "BaseBdev3", 00:17:12.292 "uuid": "20876688-0a04-45c3-ab92-d94a460eef47", 00:17:12.292 "is_configured": true, 00:17:12.292 "data_offset": 2048, 00:17:12.292 "data_size": 63488 00:17:12.292 }, 00:17:12.292 { 00:17:12.292 "name": "BaseBdev4", 00:17:12.292 "uuid": "34e6807a-5660-4fb1-8574-c4356c5e97a2", 00:17:12.292 "is_configured": true, 00:17:12.292 "data_offset": 2048, 00:17:12.292 "data_size": 63488 00:17:12.292 } 00:17:12.292 ] 00:17:12.292 }' 00:17:12.292 15:56:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:12.292 15:56:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:13.226 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:17:13.226 
[2024-06-10 15:56:18.620908] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:17:13.226 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:13.226 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:13.226 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:13.226 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:13.226 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:13.226 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:13.226 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:13.226 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:13.226 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:13.226 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:13.226 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:13.226 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:13.484 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:13.484 "name": "Existed_Raid", 00:17:13.484 "uuid": "7c5c6684-c27a-4498-9049-04e768617af7", 00:17:13.484 "strip_size_kb": 64, 00:17:13.484 "state": "configuring", 00:17:13.484 "raid_level": "raid0", 00:17:13.484 "superblock": true, 00:17:13.484 "num_base_bdevs": 
4, 00:17:13.484 "num_base_bdevs_discovered": 2, 00:17:13.484 "num_base_bdevs_operational": 4, 00:17:13.484 "base_bdevs_list": [ 00:17:13.484 { 00:17:13.484 "name": "BaseBdev1", 00:17:13.484 "uuid": "00000000-0000-0000-0000-000000000000", 00:17:13.484 "is_configured": false, 00:17:13.484 "data_offset": 0, 00:17:13.484 "data_size": 0 00:17:13.484 }, 00:17:13.484 { 00:17:13.484 "name": null, 00:17:13.484 "uuid": "19c7cfbc-df3a-4319-b6dd-c07945240625", 00:17:13.484 "is_configured": false, 00:17:13.484 "data_offset": 2048, 00:17:13.484 "data_size": 63488 00:17:13.484 }, 00:17:13.484 { 00:17:13.484 "name": "BaseBdev3", 00:17:13.484 "uuid": "20876688-0a04-45c3-ab92-d94a460eef47", 00:17:13.484 "is_configured": true, 00:17:13.484 "data_offset": 2048, 00:17:13.484 "data_size": 63488 00:17:13.484 }, 00:17:13.484 { 00:17:13.484 "name": "BaseBdev4", 00:17:13.484 "uuid": "34e6807a-5660-4fb1-8574-c4356c5e97a2", 00:17:13.484 "is_configured": true, 00:17:13.484 "data_offset": 2048, 00:17:13.484 "data_size": 63488 00:17:13.484 } 00:17:13.484 ] 00:17:13.484 }' 00:17:13.484 15:56:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:13.484 15:56:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:14.050 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:14.050 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:14.308 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:17:14.308 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:17:14.566 [2024-06-10 15:56:19.831460] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:14.566 BaseBdev1 00:17:14.566 15:56:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:17:14.566 15:56:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:17:14.566 15:56:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:14.566 15:56:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:14.566 15:56:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:14.566 15:56:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:14.566 15:56:19 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:14.824 15:56:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:17:15.082 [ 00:17:15.082 { 00:17:15.082 "name": "BaseBdev1", 00:17:15.082 "aliases": [ 00:17:15.082 "b952dc92-4b45-48ee-b725-289d894b1526" 00:17:15.082 ], 00:17:15.082 "product_name": "Malloc disk", 00:17:15.082 "block_size": 512, 00:17:15.082 "num_blocks": 65536, 00:17:15.082 "uuid": "b952dc92-4b45-48ee-b725-289d894b1526", 00:17:15.082 "assigned_rate_limits": { 00:17:15.082 "rw_ios_per_sec": 0, 00:17:15.082 "rw_mbytes_per_sec": 0, 00:17:15.082 "r_mbytes_per_sec": 0, 00:17:15.082 "w_mbytes_per_sec": 0 00:17:15.082 }, 00:17:15.082 "claimed": true, 00:17:15.082 "claim_type": "exclusive_write", 00:17:15.082 "zoned": false, 00:17:15.082 "supported_io_types": { 00:17:15.082 "read": true, 00:17:15.082 "write": true, 00:17:15.082 "unmap": true, 00:17:15.082 
"write_zeroes": true, 00:17:15.082 "flush": true, 00:17:15.082 "reset": true, 00:17:15.082 "compare": false, 00:17:15.082 "compare_and_write": false, 00:17:15.082 "abort": true, 00:17:15.082 "nvme_admin": false, 00:17:15.082 "nvme_io": false 00:17:15.082 }, 00:17:15.082 "memory_domains": [ 00:17:15.082 { 00:17:15.082 "dma_device_id": "system", 00:17:15.082 "dma_device_type": 1 00:17:15.082 }, 00:17:15.082 { 00:17:15.082 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:15.082 "dma_device_type": 2 00:17:15.082 } 00:17:15.082 ], 00:17:15.082 "driver_specific": {} 00:17:15.082 } 00:17:15.082 ] 00:17:15.082 15:56:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:15.082 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:15.082 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:15.082 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:15.082 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:15.082 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:15.082 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:15.082 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:15.082 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:15.082 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:15.082 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:15.082 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.082 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:15.340 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:15.340 "name": "Existed_Raid", 00:17:15.340 "uuid": "7c5c6684-c27a-4498-9049-04e768617af7", 00:17:15.340 "strip_size_kb": 64, 00:17:15.340 "state": "configuring", 00:17:15.340 "raid_level": "raid0", 00:17:15.340 "superblock": true, 00:17:15.340 "num_base_bdevs": 4, 00:17:15.340 "num_base_bdevs_discovered": 3, 00:17:15.340 "num_base_bdevs_operational": 4, 00:17:15.340 "base_bdevs_list": [ 00:17:15.340 { 00:17:15.340 "name": "BaseBdev1", 00:17:15.340 "uuid": "b952dc92-4b45-48ee-b725-289d894b1526", 00:17:15.340 "is_configured": true, 00:17:15.340 "data_offset": 2048, 00:17:15.340 "data_size": 63488 00:17:15.340 }, 00:17:15.340 { 00:17:15.340 "name": null, 00:17:15.340 "uuid": "19c7cfbc-df3a-4319-b6dd-c07945240625", 00:17:15.340 "is_configured": false, 00:17:15.340 "data_offset": 2048, 00:17:15.340 "data_size": 63488 00:17:15.340 }, 00:17:15.340 { 00:17:15.340 "name": "BaseBdev3", 00:17:15.340 "uuid": "20876688-0a04-45c3-ab92-d94a460eef47", 00:17:15.340 "is_configured": true, 00:17:15.340 "data_offset": 2048, 00:17:15.340 "data_size": 63488 00:17:15.340 }, 00:17:15.340 { 00:17:15.340 "name": "BaseBdev4", 00:17:15.340 "uuid": "34e6807a-5660-4fb1-8574-c4356c5e97a2", 00:17:15.340 "is_configured": true, 00:17:15.340 "data_offset": 2048, 00:17:15.340 "data_size": 63488 00:17:15.340 } 00:17:15.340 ] 00:17:15.340 }' 00:17:15.340 15:56:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:15.340 15:56:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:15.906 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:15.906 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:16.201 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:17:16.201 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:17:16.461 [2024-06-10 15:56:21.756698] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:17:16.461 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:16.461 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:16.461 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:16.461 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:16.461 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:16.461 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:16.461 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:16.461 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:16.461 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:16.461 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:16.461 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:16.461 15:56:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:16.719 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:16.719 "name": "Existed_Raid", 00:17:16.719 "uuid": "7c5c6684-c27a-4498-9049-04e768617af7", 00:17:16.719 "strip_size_kb": 64, 00:17:16.719 "state": "configuring", 00:17:16.719 "raid_level": "raid0", 00:17:16.719 "superblock": true, 00:17:16.719 "num_base_bdevs": 4, 00:17:16.719 "num_base_bdevs_discovered": 2, 00:17:16.719 "num_base_bdevs_operational": 4, 00:17:16.719 "base_bdevs_list": [ 00:17:16.719 { 00:17:16.719 "name": "BaseBdev1", 00:17:16.719 "uuid": "b952dc92-4b45-48ee-b725-289d894b1526", 00:17:16.719 "is_configured": true, 00:17:16.719 "data_offset": 2048, 00:17:16.719 "data_size": 63488 00:17:16.719 }, 00:17:16.719 { 00:17:16.719 "name": null, 00:17:16.719 "uuid": "19c7cfbc-df3a-4319-b6dd-c07945240625", 00:17:16.719 "is_configured": false, 00:17:16.719 "data_offset": 2048, 00:17:16.719 "data_size": 63488 00:17:16.719 }, 00:17:16.719 { 00:17:16.719 "name": null, 00:17:16.719 "uuid": "20876688-0a04-45c3-ab92-d94a460eef47", 00:17:16.719 "is_configured": false, 00:17:16.719 "data_offset": 2048, 00:17:16.719 "data_size": 63488 00:17:16.719 }, 00:17:16.719 { 00:17:16.719 "name": "BaseBdev4", 00:17:16.719 "uuid": "34e6807a-5660-4fb1-8574-c4356c5e97a2", 00:17:16.719 "is_configured": true, 00:17:16.719 "data_offset": 2048, 00:17:16.719 "data_size": 63488 00:17:16.719 } 00:17:16.719 ] 00:17:16.719 }' 00:17:16.719 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:16.719 15:56:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:17.285 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.285 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:17.543 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:17:17.543 15:56:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:17:17.800 [2024-06-10 15:56:23.148444] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:17.800 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:17.800 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:17.800 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:17.800 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:17.800 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:17.800 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:17.800 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:17.800 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:17.800 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:17.800 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:17.801 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:17.801 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:18.058 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:18.058 "name": "Existed_Raid", 00:17:18.058 "uuid": "7c5c6684-c27a-4498-9049-04e768617af7", 00:17:18.058 "strip_size_kb": 64, 00:17:18.058 "state": "configuring", 00:17:18.058 "raid_level": "raid0", 00:17:18.058 "superblock": true, 00:17:18.058 "num_base_bdevs": 4, 00:17:18.058 "num_base_bdevs_discovered": 3, 00:17:18.058 "num_base_bdevs_operational": 4, 00:17:18.058 "base_bdevs_list": [ 00:17:18.058 { 00:17:18.058 "name": "BaseBdev1", 00:17:18.058 "uuid": "b952dc92-4b45-48ee-b725-289d894b1526", 00:17:18.058 "is_configured": true, 00:17:18.058 "data_offset": 2048, 00:17:18.058 "data_size": 63488 00:17:18.058 }, 00:17:18.058 { 00:17:18.058 "name": null, 00:17:18.058 "uuid": "19c7cfbc-df3a-4319-b6dd-c07945240625", 00:17:18.058 "is_configured": false, 00:17:18.058 "data_offset": 2048, 00:17:18.058 "data_size": 63488 00:17:18.058 }, 00:17:18.058 { 00:17:18.058 "name": "BaseBdev3", 00:17:18.058 "uuid": "20876688-0a04-45c3-ab92-d94a460eef47", 00:17:18.058 "is_configured": true, 00:17:18.058 "data_offset": 2048, 00:17:18.058 "data_size": 63488 00:17:18.058 }, 00:17:18.058 { 00:17:18.058 "name": "BaseBdev4", 00:17:18.058 "uuid": "34e6807a-5660-4fb1-8574-c4356c5e97a2", 00:17:18.058 "is_configured": true, 00:17:18.058 "data_offset": 2048, 00:17:18.058 "data_size": 63488 00:17:18.058 } 00:17:18.059 ] 00:17:18.059 }' 00:17:18.059 15:56:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:18.059 15:56:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:18.624 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:18.624 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:17:18.883 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:17:18.883 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:17:19.142 [2024-06-10 15:56:24.459953] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:17:19.142 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:19.142 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:19.142 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:19.142 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:19.142 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:19.142 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:19.142 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:19.142 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:19.142 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:19.142 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:19.142 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.142 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:19.401 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:19.401 "name": "Existed_Raid", 00:17:19.401 "uuid": "7c5c6684-c27a-4498-9049-04e768617af7", 00:17:19.401 "strip_size_kb": 64, 00:17:19.401 "state": "configuring", 00:17:19.401 "raid_level": "raid0", 00:17:19.401 "superblock": true, 00:17:19.401 "num_base_bdevs": 4, 00:17:19.401 "num_base_bdevs_discovered": 2, 00:17:19.401 "num_base_bdevs_operational": 4, 00:17:19.401 "base_bdevs_list": [ 00:17:19.401 { 00:17:19.401 "name": null, 00:17:19.401 "uuid": "b952dc92-4b45-48ee-b725-289d894b1526", 00:17:19.401 "is_configured": false, 00:17:19.401 "data_offset": 2048, 00:17:19.401 "data_size": 63488 00:17:19.401 }, 00:17:19.401 { 00:17:19.401 "name": null, 00:17:19.401 "uuid": "19c7cfbc-df3a-4319-b6dd-c07945240625", 00:17:19.401 "is_configured": false, 00:17:19.401 "data_offset": 2048, 00:17:19.401 "data_size": 63488 00:17:19.401 }, 00:17:19.401 { 00:17:19.401 "name": "BaseBdev3", 00:17:19.401 "uuid": "20876688-0a04-45c3-ab92-d94a460eef47", 00:17:19.401 "is_configured": true, 00:17:19.401 "data_offset": 2048, 00:17:19.401 "data_size": 63488 00:17:19.401 }, 00:17:19.401 { 00:17:19.401 "name": "BaseBdev4", 00:17:19.401 "uuid": "34e6807a-5660-4fb1-8574-c4356c5e97a2", 00:17:19.401 "is_configured": true, 00:17:19.401 "data_offset": 2048, 00:17:19.401 "data_size": 63488 00:17:19.401 } 00:17:19.401 ] 00:17:19.401 }' 00:17:19.401 15:56:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:19.401 15:56:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:19.969 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:19.969 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:17:20.228 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:17:20.228 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:17:20.487 [2024-06-10 15:56:25.870204] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:20.487 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid0 64 4 00:17:20.487 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:20.487 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:20.487 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:20.487 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:20.487 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:20.487 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:20.487 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:20.487 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:20.487 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:20.487 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:20.487 15:56:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:20.746 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:20.746 "name": "Existed_Raid", 00:17:20.746 "uuid": "7c5c6684-c27a-4498-9049-04e768617af7", 00:17:20.746 "strip_size_kb": 64, 00:17:20.746 "state": "configuring", 00:17:20.746 "raid_level": "raid0", 00:17:20.746 "superblock": true, 00:17:20.746 "num_base_bdevs": 4, 00:17:20.746 "num_base_bdevs_discovered": 3, 00:17:20.746 "num_base_bdevs_operational": 4, 00:17:20.746 "base_bdevs_list": [ 00:17:20.746 { 00:17:20.746 "name": null, 00:17:20.746 "uuid": "b952dc92-4b45-48ee-b725-289d894b1526", 00:17:20.746 "is_configured": false, 00:17:20.746 "data_offset": 2048, 00:17:20.746 "data_size": 63488 00:17:20.746 }, 00:17:20.746 { 00:17:20.746 "name": "BaseBdev2", 00:17:20.746 "uuid": "19c7cfbc-df3a-4319-b6dd-c07945240625", 00:17:20.746 "is_configured": true, 00:17:20.746 "data_offset": 2048, 00:17:20.746 "data_size": 63488 00:17:20.746 }, 00:17:20.746 { 00:17:20.746 "name": "BaseBdev3", 00:17:20.746 "uuid": "20876688-0a04-45c3-ab92-d94a460eef47", 00:17:20.746 "is_configured": true, 00:17:20.746 "data_offset": 2048, 00:17:20.746 "data_size": 63488 00:17:20.746 }, 00:17:20.746 { 00:17:20.746 "name": "BaseBdev4", 00:17:20.746 "uuid": "34e6807a-5660-4fb1-8574-c4356c5e97a2", 00:17:20.746 "is_configured": true, 00:17:20.746 "data_offset": 2048, 00:17:20.746 "data_size": 63488 00:17:20.746 } 00:17:20.746 ] 00:17:20.747 }' 00:17:20.747 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:20.747 15:56:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:21.313 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.313 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:17:21.571 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:17:21.572 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:17:21.572 15:56:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:21.831 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u b952dc92-4b45-48ee-b725-289d894b1526 00:17:22.090 [2024-06-10 15:56:27.494002] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:17:22.090 [2024-06-10 15:56:27.494160] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa093a0 00:17:22.090 [2024-06-10 15:56:27.494171] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:22.090 [2024-06-10 15:56:27.494353] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x8649c0 00:17:22.090 [2024-06-10 15:56:27.494471] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa093a0 00:17:22.090 [2024-06-10 15:56:27.494479] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xa093a0 00:17:22.090 [2024-06-10 15:56:27.494569] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:22.090 NewBaseBdev 00:17:22.090 15:56:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:17:22.090 15:56:27 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:17:22.090 15:56:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:17:22.090 15:56:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:17:22.090 15:56:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:17:22.090 15:56:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:17:22.090 15:56:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:17:22.349 15:56:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:17:22.608 [ 00:17:22.608 { 00:17:22.608 "name": "NewBaseBdev", 00:17:22.608 "aliases": [ 00:17:22.608 "b952dc92-4b45-48ee-b725-289d894b1526" 00:17:22.608 ], 00:17:22.608 "product_name": "Malloc disk", 00:17:22.608 "block_size": 512, 00:17:22.608 "num_blocks": 65536, 00:17:22.608 "uuid": "b952dc92-4b45-48ee-b725-289d894b1526", 00:17:22.608 "assigned_rate_limits": { 00:17:22.608 "rw_ios_per_sec": 0, 00:17:22.608 "rw_mbytes_per_sec": 0, 00:17:22.608 "r_mbytes_per_sec": 0, 00:17:22.608 "w_mbytes_per_sec": 0 00:17:22.608 }, 00:17:22.608 "claimed": true, 00:17:22.608 "claim_type": "exclusive_write", 00:17:22.608 "zoned": false, 00:17:22.608 "supported_io_types": { 00:17:22.608 "read": true, 00:17:22.608 "write": true, 00:17:22.608 "unmap": true, 00:17:22.608 "write_zeroes": true, 00:17:22.608 "flush": true, 00:17:22.608 "reset": true, 00:17:22.608 "compare": false, 00:17:22.608 "compare_and_write": false, 00:17:22.608 "abort": true, 00:17:22.608 "nvme_admin": false, 00:17:22.608 "nvme_io": false 
00:17:22.608 }, 00:17:22.608 "memory_domains": [ 00:17:22.608 { 00:17:22.608 "dma_device_id": "system", 00:17:22.608 "dma_device_type": 1 00:17:22.608 }, 00:17:22.608 { 00:17:22.608 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:22.608 "dma_device_type": 2 00:17:22.608 } 00:17:22.608 ], 00:17:22.608 "driver_specific": {} 00:17:22.608 } 00:17:22.608 ] 00:17:22.608 15:56:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:17:22.608 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid0 64 4 00:17:22.608 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:22.608 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:22.608 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:22.608 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:22.608 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:22.608 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:22.608 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:22.608 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:22.608 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:22.608 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:22.608 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:17:22.867 
15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:22.867 "name": "Existed_Raid", 00:17:22.867 "uuid": "7c5c6684-c27a-4498-9049-04e768617af7", 00:17:22.867 "strip_size_kb": 64, 00:17:22.867 "state": "online", 00:17:22.867 "raid_level": "raid0", 00:17:22.867 "superblock": true, 00:17:22.867 "num_base_bdevs": 4, 00:17:22.867 "num_base_bdevs_discovered": 4, 00:17:22.867 "num_base_bdevs_operational": 4, 00:17:22.867 "base_bdevs_list": [ 00:17:22.867 { 00:17:22.867 "name": "NewBaseBdev", 00:17:22.867 "uuid": "b952dc92-4b45-48ee-b725-289d894b1526", 00:17:22.867 "is_configured": true, 00:17:22.867 "data_offset": 2048, 00:17:22.867 "data_size": 63488 00:17:22.867 }, 00:17:22.867 { 00:17:22.867 "name": "BaseBdev2", 00:17:22.867 "uuid": "19c7cfbc-df3a-4319-b6dd-c07945240625", 00:17:22.867 "is_configured": true, 00:17:22.867 "data_offset": 2048, 00:17:22.867 "data_size": 63488 00:17:22.867 }, 00:17:22.867 { 00:17:22.867 "name": "BaseBdev3", 00:17:22.867 "uuid": "20876688-0a04-45c3-ab92-d94a460eef47", 00:17:22.867 "is_configured": true, 00:17:22.867 "data_offset": 2048, 00:17:22.867 "data_size": 63488 00:17:22.867 }, 00:17:22.867 { 00:17:22.867 "name": "BaseBdev4", 00:17:22.868 "uuid": "34e6807a-5660-4fb1-8574-c4356c5e97a2", 00:17:22.868 "is_configured": true, 00:17:22.868 "data_offset": 2048, 00:17:22.868 "data_size": 63488 00:17:22.868 } 00:17:22.868 ] 00:17:22.868 }' 00:17:22.868 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:22.868 15:56:28 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:23.435 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:17:23.435 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:17:23.435 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local 
raid_bdev_info 00:17:23.435 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:23.435 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:23.435 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:17:23.435 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:17:23.435 15:56:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:23.695 [2024-06-10 15:56:29.130752] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:23.695 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:23.695 "name": "Existed_Raid", 00:17:23.695 "aliases": [ 00:17:23.695 "7c5c6684-c27a-4498-9049-04e768617af7" 00:17:23.695 ], 00:17:23.695 "product_name": "Raid Volume", 00:17:23.695 "block_size": 512, 00:17:23.695 "num_blocks": 253952, 00:17:23.695 "uuid": "7c5c6684-c27a-4498-9049-04e768617af7", 00:17:23.695 "assigned_rate_limits": { 00:17:23.695 "rw_ios_per_sec": 0, 00:17:23.695 "rw_mbytes_per_sec": 0, 00:17:23.695 "r_mbytes_per_sec": 0, 00:17:23.695 "w_mbytes_per_sec": 0 00:17:23.695 }, 00:17:23.695 "claimed": false, 00:17:23.695 "zoned": false, 00:17:23.695 "supported_io_types": { 00:17:23.695 "read": true, 00:17:23.695 "write": true, 00:17:23.695 "unmap": true, 00:17:23.695 "write_zeroes": true, 00:17:23.695 "flush": true, 00:17:23.695 "reset": true, 00:17:23.695 "compare": false, 00:17:23.695 "compare_and_write": false, 00:17:23.695 "abort": false, 00:17:23.695 "nvme_admin": false, 00:17:23.695 "nvme_io": false 00:17:23.695 }, 00:17:23.695 "memory_domains": [ 00:17:23.695 { 00:17:23.695 "dma_device_id": "system", 00:17:23.695 "dma_device_type": 1 00:17:23.695 }, 00:17:23.695 { 00:17:23.695 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:23.695 "dma_device_type": 2 00:17:23.695 }, 00:17:23.695 { 00:17:23.695 "dma_device_id": "system", 00:17:23.695 "dma_device_type": 1 00:17:23.695 }, 00:17:23.695 { 00:17:23.695 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:23.695 "dma_device_type": 2 00:17:23.695 }, 00:17:23.695 { 00:17:23.695 "dma_device_id": "system", 00:17:23.695 "dma_device_type": 1 00:17:23.695 }, 00:17:23.695 { 00:17:23.695 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:23.695 "dma_device_type": 2 00:17:23.695 }, 00:17:23.695 { 00:17:23.695 "dma_device_id": "system", 00:17:23.695 "dma_device_type": 1 00:17:23.695 }, 00:17:23.695 { 00:17:23.695 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:23.695 "dma_device_type": 2 00:17:23.695 } 00:17:23.695 ], 00:17:23.695 "driver_specific": { 00:17:23.695 "raid": { 00:17:23.695 "uuid": "7c5c6684-c27a-4498-9049-04e768617af7", 00:17:23.695 "strip_size_kb": 64, 00:17:23.695 "state": "online", 00:17:23.695 "raid_level": "raid0", 00:17:23.695 "superblock": true, 00:17:23.695 "num_base_bdevs": 4, 00:17:23.695 "num_base_bdevs_discovered": 4, 00:17:23.695 "num_base_bdevs_operational": 4, 00:17:23.695 "base_bdevs_list": [ 00:17:23.695 { 00:17:23.695 "name": "NewBaseBdev", 00:17:23.695 "uuid": "b952dc92-4b45-48ee-b725-289d894b1526", 00:17:23.695 "is_configured": true, 00:17:23.695 "data_offset": 2048, 00:17:23.695 "data_size": 63488 00:17:23.695 }, 00:17:23.695 { 00:17:23.695 "name": "BaseBdev2", 00:17:23.695 "uuid": "19c7cfbc-df3a-4319-b6dd-c07945240625", 00:17:23.695 "is_configured": true, 00:17:23.695 "data_offset": 2048, 00:17:23.695 "data_size": 63488 00:17:23.695 }, 00:17:23.695 { 00:17:23.695 "name": "BaseBdev3", 00:17:23.695 "uuid": "20876688-0a04-45c3-ab92-d94a460eef47", 00:17:23.695 "is_configured": true, 00:17:23.695 "data_offset": 2048, 00:17:23.695 "data_size": 63488 00:17:23.695 }, 00:17:23.695 { 00:17:23.695 "name": "BaseBdev4", 00:17:23.695 "uuid": "34e6807a-5660-4fb1-8574-c4356c5e97a2", 
00:17:23.695 "is_configured": true, 00:17:23.695 "data_offset": 2048, 00:17:23.695 "data_size": 63488 00:17:23.695 } 00:17:23.695 ] 00:17:23.695 } 00:17:23.695 } 00:17:23.695 }' 00:17:23.695 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:23.695 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:17:23.695 BaseBdev2 00:17:23.695 BaseBdev3 00:17:23.695 BaseBdev4' 00:17:23.955 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:23.955 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:17:23.955 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:24.214 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:24.214 "name": "NewBaseBdev", 00:17:24.214 "aliases": [ 00:17:24.214 "b952dc92-4b45-48ee-b725-289d894b1526" 00:17:24.214 ], 00:17:24.214 "product_name": "Malloc disk", 00:17:24.214 "block_size": 512, 00:17:24.214 "num_blocks": 65536, 00:17:24.214 "uuid": "b952dc92-4b45-48ee-b725-289d894b1526", 00:17:24.214 "assigned_rate_limits": { 00:17:24.214 "rw_ios_per_sec": 0, 00:17:24.214 "rw_mbytes_per_sec": 0, 00:17:24.214 "r_mbytes_per_sec": 0, 00:17:24.214 "w_mbytes_per_sec": 0 00:17:24.214 }, 00:17:24.214 "claimed": true, 00:17:24.214 "claim_type": "exclusive_write", 00:17:24.214 "zoned": false, 00:17:24.214 "supported_io_types": { 00:17:24.214 "read": true, 00:17:24.214 "write": true, 00:17:24.214 "unmap": true, 00:17:24.214 "write_zeroes": true, 00:17:24.214 "flush": true, 00:17:24.214 "reset": true, 00:17:24.214 "compare": false, 00:17:24.214 "compare_and_write": false, 00:17:24.214 "abort": true, 
00:17:24.214 "nvme_admin": false, 00:17:24.214 "nvme_io": false 00:17:24.214 }, 00:17:24.214 "memory_domains": [ 00:17:24.214 { 00:17:24.214 "dma_device_id": "system", 00:17:24.214 "dma_device_type": 1 00:17:24.214 }, 00:17:24.214 { 00:17:24.214 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.214 "dma_device_type": 2 00:17:24.214 } 00:17:24.214 ], 00:17:24.214 "driver_specific": {} 00:17:24.214 }' 00:17:24.214 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:24.214 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:24.214 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:24.214 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:24.214 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:24.214 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:24.214 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:24.214 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:24.473 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:24.473 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:24.473 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:24.473 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:24.473 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:24.473 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev2 00:17:24.473 15:56:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:24.732 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:24.732 "name": "BaseBdev2", 00:17:24.732 "aliases": [ 00:17:24.732 "19c7cfbc-df3a-4319-b6dd-c07945240625" 00:17:24.732 ], 00:17:24.732 "product_name": "Malloc disk", 00:17:24.732 "block_size": 512, 00:17:24.732 "num_blocks": 65536, 00:17:24.732 "uuid": "19c7cfbc-df3a-4319-b6dd-c07945240625", 00:17:24.732 "assigned_rate_limits": { 00:17:24.732 "rw_ios_per_sec": 0, 00:17:24.732 "rw_mbytes_per_sec": 0, 00:17:24.732 "r_mbytes_per_sec": 0, 00:17:24.732 "w_mbytes_per_sec": 0 00:17:24.732 }, 00:17:24.732 "claimed": true, 00:17:24.732 "claim_type": "exclusive_write", 00:17:24.732 "zoned": false, 00:17:24.732 "supported_io_types": { 00:17:24.732 "read": true, 00:17:24.732 "write": true, 00:17:24.732 "unmap": true, 00:17:24.732 "write_zeroes": true, 00:17:24.732 "flush": true, 00:17:24.732 "reset": true, 00:17:24.732 "compare": false, 00:17:24.732 "compare_and_write": false, 00:17:24.732 "abort": true, 00:17:24.732 "nvme_admin": false, 00:17:24.732 "nvme_io": false 00:17:24.732 }, 00:17:24.732 "memory_domains": [ 00:17:24.732 { 00:17:24.732 "dma_device_id": "system", 00:17:24.732 "dma_device_type": 1 00:17:24.732 }, 00:17:24.732 { 00:17:24.732 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:24.732 "dma_device_type": 2 00:17:24.732 } 00:17:24.732 ], 00:17:24.732 "driver_specific": {} 00:17:24.732 }' 00:17:24.732 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:24.732 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:24.732 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:24.732 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:24.732 15:56:30 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:24.991 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:24.991 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:24.991 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:24.991 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:24.991 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:24.991 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:24.991 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:24.991 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:24.991 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:17:24.991 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:25.251 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:25.251 "name": "BaseBdev3", 00:17:25.251 "aliases": [ 00:17:25.251 "20876688-0a04-45c3-ab92-d94a460eef47" 00:17:25.251 ], 00:17:25.251 "product_name": "Malloc disk", 00:17:25.251 "block_size": 512, 00:17:25.251 "num_blocks": 65536, 00:17:25.251 "uuid": "20876688-0a04-45c3-ab92-d94a460eef47", 00:17:25.251 "assigned_rate_limits": { 00:17:25.251 "rw_ios_per_sec": 0, 00:17:25.251 "rw_mbytes_per_sec": 0, 00:17:25.251 "r_mbytes_per_sec": 0, 00:17:25.251 "w_mbytes_per_sec": 0 00:17:25.251 }, 00:17:25.251 "claimed": true, 00:17:25.251 "claim_type": "exclusive_write", 00:17:25.251 "zoned": false, 00:17:25.251 "supported_io_types": 
{ 00:17:25.251 "read": true, 00:17:25.251 "write": true, 00:17:25.251 "unmap": true, 00:17:25.251 "write_zeroes": true, 00:17:25.251 "flush": true, 00:17:25.251 "reset": true, 00:17:25.251 "compare": false, 00:17:25.251 "compare_and_write": false, 00:17:25.251 "abort": true, 00:17:25.251 "nvme_admin": false, 00:17:25.251 "nvme_io": false 00:17:25.251 }, 00:17:25.251 "memory_domains": [ 00:17:25.251 { 00:17:25.251 "dma_device_id": "system", 00:17:25.251 "dma_device_type": 1 00:17:25.251 }, 00:17:25.251 { 00:17:25.251 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:25.251 "dma_device_type": 2 00:17:25.251 } 00:17:25.251 ], 00:17:25.251 "driver_specific": {} 00:17:25.251 }' 00:17:25.251 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:25.251 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:25.510 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:25.510 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:25.510 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:25.510 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:25.510 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:25.510 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:25.510 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:25.510 15:56:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:25.769 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:25.769 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:25.769 15:56:31 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:25.769 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:17:25.769 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:26.028 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:26.028 "name": "BaseBdev4", 00:17:26.028 "aliases": [ 00:17:26.028 "34e6807a-5660-4fb1-8574-c4356c5e97a2" 00:17:26.028 ], 00:17:26.028 "product_name": "Malloc disk", 00:17:26.028 "block_size": 512, 00:17:26.028 "num_blocks": 65536, 00:17:26.028 "uuid": "34e6807a-5660-4fb1-8574-c4356c5e97a2", 00:17:26.028 "assigned_rate_limits": { 00:17:26.028 "rw_ios_per_sec": 0, 00:17:26.028 "rw_mbytes_per_sec": 0, 00:17:26.028 "r_mbytes_per_sec": 0, 00:17:26.028 "w_mbytes_per_sec": 0 00:17:26.028 }, 00:17:26.028 "claimed": true, 00:17:26.028 "claim_type": "exclusive_write", 00:17:26.028 "zoned": false, 00:17:26.028 "supported_io_types": { 00:17:26.028 "read": true, 00:17:26.028 "write": true, 00:17:26.028 "unmap": true, 00:17:26.028 "write_zeroes": true, 00:17:26.028 "flush": true, 00:17:26.028 "reset": true, 00:17:26.028 "compare": false, 00:17:26.028 "compare_and_write": false, 00:17:26.028 "abort": true, 00:17:26.028 "nvme_admin": false, 00:17:26.028 "nvme_io": false 00:17:26.028 }, 00:17:26.028 "memory_domains": [ 00:17:26.028 { 00:17:26.028 "dma_device_id": "system", 00:17:26.028 "dma_device_type": 1 00:17:26.028 }, 00:17:26.028 { 00:17:26.028 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:26.028 "dma_device_type": 2 00:17:26.028 } 00:17:26.028 ], 00:17:26.028 "driver_specific": {} 00:17:26.028 }' 00:17:26.028 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:26.028 15:56:31 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:26.028 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:26.028 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:26.028 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:26.028 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:26.028 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:26.286 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:26.286 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:26.286 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:26.286 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:26.286 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:26.286 15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:17:26.544 [2024-06-10 15:56:31.937982] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:17:26.544 [2024-06-10 15:56:31.938011] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:26.544 [2024-06-10 15:56:31.938062] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:26.545 [2024-06-10 15:56:31.938125] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:26.545 [2024-06-10 15:56:31.938134] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa093a0 name Existed_Raid, state offline 00:17:26.545 
15:56:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2712558 00:17:26.545 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 2712558 ']' 00:17:26.545 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 2712558 00:17:26.545 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:17:26.545 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:26.545 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2712558 00:17:26.545 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:26.545 15:56:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:26.545 15:56:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2712558' 00:17:26.545 killing process with pid 2712558 00:17:26.545 15:56:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 2712558 00:17:26.545 [2024-06-10 15:56:32.001626] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:26.545 15:56:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 2712558 00:17:26.545 [2024-06-10 15:56:32.034772] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:26.802 15:56:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:17:26.802 00:17:26.802 real 0m32.410s 00:17:26.802 user 1m0.668s 00:17:26.802 sys 0m4.647s 00:17:26.802 15:56:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:26.802 15:56:32 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:17:26.802 
************************************ 00:17:26.802 END TEST raid_state_function_test_sb 00:17:26.802 ************************************ 00:17:26.802 15:56:32 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid0 4 00:17:26.802 15:56:32 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:17:26.802 15:56:32 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:26.802 15:56:32 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:26.802 ************************************ 00:17:26.803 START TEST raid_superblock_test 00:17:26.803 ************************************ 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid0 4 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid0 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid0 '!=' raid1 ']' 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2718506 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2718506 /var/tmp/spdk-raid.sock 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 2718506 ']' 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:26.803 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:26.803 15:56:32 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:27.061 [2024-06-10 15:56:32.357501] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:17:27.061 [2024-06-10 15:56:32.357554] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2718506 ] 00:17:27.061 [2024-06-10 15:56:32.458150] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:27.061 [2024-06-10 15:56:32.552746] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:27.320 [2024-06-10 15:56:32.619442] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:27.320 [2024-06-10 15:56:32.619475] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:27.887 15:56:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:27.887 15:56:33 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:17:27.887 15:56:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:17:27.887 15:56:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:27.887 15:56:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:17:27.887 15:56:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:17:27.887 15:56:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:17:27.887 15:56:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:27.887 15:56:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:27.887 15:56:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:27.887 15:56:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:17:28.145 malloc1 00:17:28.145 15:56:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:17:28.404 [2024-06-10 15:56:33.813841] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:17:28.404 [2024-06-10 15:56:33.813886] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:28.404 [2024-06-10 15:56:33.813904] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10ae0f0 00:17:28.404 [2024-06-10 15:56:33.813914] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:28.404 [2024-06-10 15:56:33.815641] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:28.404 [2024-06-10 15:56:33.815670] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:17:28.404 pt1 00:17:28.404 15:56:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:28.404 15:56:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:28.404 15:56:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:17:28.404 15:56:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:17:28.404 15:56:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:17:28.404 15:56:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:28.404 15:56:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:28.404 15:56:33 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:17:28.404 15:56:33 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:17:28.663 malloc2 00:17:28.663 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:28.922 [2024-06-10 15:56:34.332010] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:28.923 [2024-06-10 15:56:34.332052] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:28.923 [2024-06-10 15:56:34.332067] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10af400 00:17:28.923 [2024-06-10 15:56:34.332076] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:28.923 [2024-06-10 15:56:34.333621] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:28.923 [2024-06-10 15:56:34.333647] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:28.923 pt2 00:17:28.923 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:17:28.923 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:17:28.923 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:17:28.923 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:17:28.923 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:17:28.923 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:17:28.923 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:17:28.923 15:56:34 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid)
00:17:28.923 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3
00:17:29.182 malloc3
00:17:29.182 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003
00:17:29.440 [2024-06-10 15:56:34.849969] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3
00:17:29.440 [2024-06-10 15:56:34.850011] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:17:29.440 [2024-06-10 15:56:34.850026] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x125b200
00:17:29.440 [2024-06-10 15:56:34.850035] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:17:29.440 [2024-06-10 15:56:34.851571] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:17:29.440 [2024-06-10 15:56:34.851597] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3
00:17:29.440 pt3
00:17:29.440 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ ))
00:17:29.440 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs ))
00:17:29.440 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4
00:17:29.440 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4
00:17:29.440 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004
00:17:29.440 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc)
00:17:29.440 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt)
00:17:29.440 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid)
00:17:29.440 15:56:34 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4
00:17:29.700 malloc4
00:17:29.700 15:56:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004
00:17:29.976 [2024-06-10 15:56:35.355738] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4
00:17:29.976 [2024-06-10 15:56:35.355778] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:17:29.976 [2024-06-10 15:56:35.355793] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x125d320
00:17:29.976 [2024-06-10 15:56:35.355802] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:17:29.976 [2024-06-10 15:56:35.357339] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:17:29.976 [2024-06-10 15:56:35.357365] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4
00:17:29.976 pt4
00:17:29.976 15:56:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ ))
00:17:29.976 15:56:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs ))
00:17:29.976 15:56:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s
00:17:30.337 [2024-06-10 15:56:35.608431] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed
00:17:30.337 [2024-06-10 15:56:35.609761] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:17:30.337 [2024-06-10 15:56:35.609818] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed
00:17:30.337 [2024-06-10 15:56:35.609864] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed
00:17:30.337 [2024-06-10 15:56:35.610047] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x125f660
00:17:30.337 [2024-06-10 15:56:35.610057] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512
00:17:30.337 [2024-06-10 15:56:35.610257] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10c4fe0
00:17:30.337 [2024-06-10 15:56:35.610404] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x125f660
00:17:30.337 [2024-06-10 15:56:35.610413] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x125f660
00:17:30.337 [2024-06-10 15:56:35.610511] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:17:30.337 15:56:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4
00:17:30.337 15:56:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:17:30.337 15:56:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:17:30.337 15:56:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:17:30.337 15:56:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:17:30.337 15:56:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:17:30.337 15:56:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:17:30.337 15:56:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:17:30.337 15:56:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:17:30.337 15:56:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:17:30.337 15:56:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:17:30.337 15:56:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:17:30.596 15:56:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:17:30.596 "name": "raid_bdev1",
00:17:30.596 "uuid": "166817a2-6fb4-4597-b908-d48c8bf16265",
00:17:30.596 "strip_size_kb": 64,
00:17:30.596 "state": "online",
00:17:30.596 "raid_level": "raid0",
00:17:30.596 "superblock": true,
00:17:30.596 "num_base_bdevs": 4,
00:17:30.596 "num_base_bdevs_discovered": 4,
00:17:30.596 "num_base_bdevs_operational": 4,
00:17:30.596 "base_bdevs_list": [
00:17:30.596 {
00:17:30.596 "name": "pt1",
00:17:30.596 "uuid": "00000000-0000-0000-0000-000000000001",
00:17:30.596 "is_configured": true,
00:17:30.596 "data_offset": 2048,
00:17:30.596 "data_size": 63488
00:17:30.596 },
00:17:30.596 {
00:17:30.596 "name": "pt2",
00:17:30.596 "uuid": "00000000-0000-0000-0000-000000000002",
00:17:30.596 "is_configured": true,
00:17:30.596 "data_offset": 2048,
00:17:30.596 "data_size": 63488
00:17:30.596 },
00:17:30.596 {
00:17:30.596 "name": "pt3",
00:17:30.596 "uuid": "00000000-0000-0000-0000-000000000003",
00:17:30.596 "is_configured": true,
00:17:30.596 "data_offset": 2048,
00:17:30.596 "data_size": 63488
00:17:30.596 },
00:17:30.596 {
00:17:30.596 "name": "pt4",
00:17:30.596 "uuid": "00000000-0000-0000-0000-000000000004",
00:17:30.596 "is_configured": true,
00:17:30.596 "data_offset": 2048,
00:17:30.596 "data_size": 63488
00:17:30.596 }
00:17:30.596 ]
00:17:30.596 }'
00:17:30.596 15:56:35 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:17:30.596 15:56:35 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x
00:17:31.164 15:56:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1
00:17:31.164 15:56:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1
00:17:31.164 15:56:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info
00:17:31.164 15:56:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info
00:17:31.164 15:56:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names
00:17:31.164 15:56:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name
00:17:31.164 15:56:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:17:31.164 15:56:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]'
00:17:31.423 [2024-06-10 15:56:36.731690] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:17:31.423 15:56:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{
00:17:31.423 "name": "raid_bdev1",
00:17:31.423 "aliases": [
00:17:31.423 "166817a2-6fb4-4597-b908-d48c8bf16265"
00:17:31.423 ],
00:17:31.423 "product_name": "Raid Volume",
00:17:31.423 "block_size": 512,
00:17:31.423 "num_blocks": 253952,
00:17:31.423 "uuid": "166817a2-6fb4-4597-b908-d48c8bf16265",
00:17:31.423 "assigned_rate_limits": {
00:17:31.423 "rw_ios_per_sec": 0,
00:17:31.423 "rw_mbytes_per_sec": 0,
00:17:31.423 "r_mbytes_per_sec": 0,
00:17:31.423 "w_mbytes_per_sec": 0
00:17:31.423 },
00:17:31.423 "claimed": false,
00:17:31.423 "zoned": false,
00:17:31.423 "supported_io_types": {
00:17:31.423 "read": true,
00:17:31.423 "write": true,
00:17:31.423 "unmap": true,
00:17:31.423 "write_zeroes": true,
00:17:31.423 "flush": true,
00:17:31.423 "reset": true,
00:17:31.423 "compare": false,
00:17:31.423 "compare_and_write": false,
00:17:31.423 "abort": false,
00:17:31.423 "nvme_admin": false,
00:17:31.423 "nvme_io": false
00:17:31.423 },
00:17:31.423 "memory_domains": [
00:17:31.423 {
00:17:31.423 "dma_device_id": "system",
00:17:31.423 "dma_device_type": 1
00:17:31.423 },
00:17:31.423 {
00:17:31.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:17:31.423 "dma_device_type": 2
00:17:31.423 },
00:17:31.423 {
00:17:31.423 "dma_device_id": "system",
00:17:31.423 "dma_device_type": 1
00:17:31.423 },
00:17:31.423 {
00:17:31.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:17:31.423 "dma_device_type": 2
00:17:31.423 },
00:17:31.423 {
00:17:31.423 "dma_device_id": "system",
00:17:31.423 "dma_device_type": 1
00:17:31.423 },
00:17:31.423 {
00:17:31.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:17:31.423 "dma_device_type": 2
00:17:31.423 },
00:17:31.423 {
00:17:31.423 "dma_device_id": "system",
00:17:31.423 "dma_device_type": 1
00:17:31.423 },
00:17:31.423 {
00:17:31.423 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:17:31.423 "dma_device_type": 2
00:17:31.423 }
00:17:31.423 ],
00:17:31.423 "driver_specific": {
00:17:31.423 "raid": {
00:17:31.423 "uuid": "166817a2-6fb4-4597-b908-d48c8bf16265",
00:17:31.423 "strip_size_kb": 64,
00:17:31.423 "state": "online",
00:17:31.423 "raid_level": "raid0",
00:17:31.423 "superblock": true,
00:17:31.423 "num_base_bdevs": 4,
00:17:31.423 "num_base_bdevs_discovered": 4,
00:17:31.423 "num_base_bdevs_operational": 4,
00:17:31.423 "base_bdevs_list": [
00:17:31.423 {
00:17:31.423 "name": "pt1",
00:17:31.423 "uuid": "00000000-0000-0000-0000-000000000001",
00:17:31.423 "is_configured": true,
00:17:31.423 "data_offset": 2048,
00:17:31.423 "data_size": 63488
00:17:31.423 },
00:17:31.423 {
00:17:31.423 "name": "pt2",
00:17:31.423 "uuid": "00000000-0000-0000-0000-000000000002",
00:17:31.423 "is_configured": true,
00:17:31.423 "data_offset": 2048,
00:17:31.423 "data_size": 63488
00:17:31.423 },
00:17:31.423 {
00:17:31.423 "name": "pt3",
00:17:31.423 "uuid": "00000000-0000-0000-0000-000000000003",
00:17:31.423 "is_configured": true,
00:17:31.423 "data_offset": 2048,
00:17:31.423 "data_size": 63488
00:17:31.423 },
00:17:31.423 {
00:17:31.423 "name": "pt4",
00:17:31.423 "uuid": "00000000-0000-0000-0000-000000000004",
00:17:31.423 "is_configured": true,
00:17:31.423 "data_offset": 2048,
00:17:31.423 "data_size": 63488
00:17:31.423 }
00:17:31.423 ]
00:17:31.423 }
00:17:31.423 }
00:17:31.423 }'
00:17:31.423 15:56:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'
00:17:31.423 15:56:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1
00:17:31.423 pt2
00:17:31.423 pt3
00:17:31.423 pt4'
00:17:31.423 15:56:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:17:31.423 15:56:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1
00:17:31.423 15:56:36 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:17:31.682 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:17:31.682 "name": "pt1",
00:17:31.682 "aliases": [
00:17:31.682 "00000000-0000-0000-0000-000000000001"
00:17:31.682 ],
00:17:31.682 "product_name": "passthru",
00:17:31.682 "block_size": 512,
00:17:31.682 "num_blocks": 65536,
00:17:31.682 "uuid": "00000000-0000-0000-0000-000000000001",
00:17:31.682 "assigned_rate_limits": {
00:17:31.682 "rw_ios_per_sec": 0,
00:17:31.682 "rw_mbytes_per_sec": 0,
00:17:31.682 "r_mbytes_per_sec": 0,
00:17:31.682 "w_mbytes_per_sec": 0
00:17:31.682 },
00:17:31.682 "claimed": true,
00:17:31.682 "claim_type": "exclusive_write",
00:17:31.682 "zoned": false,
00:17:31.682 "supported_io_types": {
00:17:31.682 "read": true,
00:17:31.682 "write": true,
00:17:31.682 "unmap": true,
00:17:31.682 "write_zeroes": true,
00:17:31.682 "flush": true,
00:17:31.682 "reset": true,
00:17:31.682 "compare": false,
00:17:31.682 "compare_and_write": false,
00:17:31.682 "abort": true,
00:17:31.682 "nvme_admin": false,
00:17:31.682 "nvme_io": false
00:17:31.682 },
00:17:31.682 "memory_domains": [
00:17:31.682 {
00:17:31.682 "dma_device_id": "system",
00:17:31.682 "dma_device_type": 1
00:17:31.682 },
00:17:31.682 {
00:17:31.682 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:17:31.682 "dma_device_type": 2
00:17:31.682 }
00:17:31.682 ],
00:17:31.682 "driver_specific": {
00:17:31.682 "passthru": {
00:17:31.682 "name": "pt1",
00:17:31.682 "base_bdev_name": "malloc1"
00:17:31.682 }
00:17:31.682 }
00:17:31.682 }'
00:17:31.682 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:17:31.682 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:17:31.682 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:17:31.682 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:17:31.941 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:17:31.941 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:17:31.941 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:17:31.941 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:17:31.941 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:17:31.941 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:17:31.941 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:17:31.941 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:17:31.941 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:17:31.941 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2
00:17:31.941 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:17:32.199 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:17:32.199 "name": "pt2",
00:17:32.199 "aliases": [
00:17:32.199 "00000000-0000-0000-0000-000000000002"
00:17:32.199 ],
00:17:32.199 "product_name": "passthru",
00:17:32.199 "block_size": 512,
00:17:32.199 "num_blocks": 65536,
00:17:32.199 "uuid": "00000000-0000-0000-0000-000000000002",
00:17:32.199 "assigned_rate_limits": {
00:17:32.199 "rw_ios_per_sec": 0,
00:17:32.199 "rw_mbytes_per_sec": 0,
00:17:32.199 "r_mbytes_per_sec": 0,
00:17:32.199 "w_mbytes_per_sec": 0
00:17:32.199 },
00:17:32.199 "claimed": true,
00:17:32.199 "claim_type": "exclusive_write",
00:17:32.199 "zoned": false,
00:17:32.199 "supported_io_types": {
00:17:32.199 "read": true,
00:17:32.199 "write": true,
00:17:32.199 "unmap": true,
00:17:32.199 "write_zeroes": true,
00:17:32.199 "flush": true,
00:17:32.199 "reset": true,
00:17:32.199 "compare": false,
00:17:32.199 "compare_and_write": false,
00:17:32.199 "abort": true,
00:17:32.199 "nvme_admin": false,
00:17:32.199 "nvme_io": false
00:17:32.199 },
00:17:32.199 "memory_domains": [
00:17:32.199 {
00:17:32.199 "dma_device_id": "system",
00:17:32.199 "dma_device_type": 1
00:17:32.199 },
00:17:32.199 {
00:17:32.199 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:17:32.199 "dma_device_type": 2
00:17:32.199 }
00:17:32.199 ],
00:17:32.199 "driver_specific": {
00:17:32.199 "passthru": {
00:17:32.199 "name": "pt2",
00:17:32.200 "base_bdev_name": "malloc2"
00:17:32.200 }
00:17:32.200 }
00:17:32.200 }'
00:17:32.458 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:17:32.458 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:17:32.458 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:17:32.458 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:17:32.458 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:17:32.458 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:17:32.458 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:17:32.458 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:17:32.716 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:17:32.716 15:56:37 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:17:32.716 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:17:32.716 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:17:32.716 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:17:32.716 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3
00:17:32.716 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:17:32.975 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:17:32.975 "name": "pt3",
00:17:32.975 "aliases": [
00:17:32.975 "00000000-0000-0000-0000-000000000003"
00:17:32.975 ],
00:17:32.975 "product_name": "passthru",
00:17:32.975 "block_size": 512,
00:17:32.975 "num_blocks": 65536,
00:17:32.975 "uuid": "00000000-0000-0000-0000-000000000003",
00:17:32.975 "assigned_rate_limits": {
00:17:32.975 "rw_ios_per_sec": 0,
00:17:32.975 "rw_mbytes_per_sec": 0,
00:17:32.975 "r_mbytes_per_sec": 0,
00:17:32.975 "w_mbytes_per_sec": 0
00:17:32.975 },
00:17:32.975 "claimed": true,
00:17:32.975 "claim_type": "exclusive_write",
00:17:32.975 "zoned": false,
00:17:32.975 "supported_io_types": {
00:17:32.975 "read": true,
00:17:32.975 "write": true,
00:17:32.975 "unmap": true,
00:17:32.975 "write_zeroes": true,
00:17:32.975 "flush": true,
00:17:32.975 "reset": true,
00:17:32.975 "compare": false,
00:17:32.975 "compare_and_write": false,
00:17:32.975 "abort": true,
00:17:32.975 "nvme_admin": false,
00:17:32.975 "nvme_io": false
00:17:32.975 },
00:17:32.975 "memory_domains": [
00:17:32.975 {
00:17:32.975 "dma_device_id": "system",
00:17:32.975 "dma_device_type": 1
00:17:32.975 },
00:17:32.975 {
00:17:32.975 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:17:32.975 "dma_device_type": 2
00:17:32.975 }
00:17:32.975 ],
00:17:32.975 "driver_specific": {
00:17:32.975 "passthru": {
00:17:32.975 "name": "pt3",
00:17:32.975 "base_bdev_name": "malloc3"
00:17:32.975 }
00:17:32.975 }
00:17:32.975 }'
00:17:32.975 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:17:32.975 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:17:32.975 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:17:32.975 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:17:32.975 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:17:33.233 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:17:33.233 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:17:33.234 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:17:33.234 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:17:33.234 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:17:33.234 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:17:33.234 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:17:33.234 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names
00:17:33.234 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4
00:17:33.234 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]'
00:17:33.493 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{
00:17:33.493 "name": "pt4",
00:17:33.493 "aliases": [
00:17:33.493 "00000000-0000-0000-0000-000000000004"
00:17:33.493 ],
00:17:33.493 "product_name": "passthru",
00:17:33.493 "block_size": 512,
00:17:33.493 "num_blocks": 65536,
00:17:33.493 "uuid": "00000000-0000-0000-0000-000000000004",
00:17:33.493 "assigned_rate_limits": {
00:17:33.493 "rw_ios_per_sec": 0,
00:17:33.493 "rw_mbytes_per_sec": 0,
00:17:33.493 "r_mbytes_per_sec": 0,
00:17:33.493 "w_mbytes_per_sec": 0
00:17:33.493 },
00:17:33.493 "claimed": true,
00:17:33.493 "claim_type": "exclusive_write",
00:17:33.493 "zoned": false,
00:17:33.493 "supported_io_types": {
00:17:33.493 "read": true,
00:17:33.493 "write": true,
00:17:33.493 "unmap": true,
00:17:33.493 "write_zeroes": true,
00:17:33.493 "flush": true,
00:17:33.493 "reset": true,
00:17:33.493 "compare": false,
00:17:33.493 "compare_and_write": false,
00:17:33.493 "abort": true,
00:17:33.493 "nvme_admin": false,
00:17:33.493 "nvme_io": false
00:17:33.493 },
00:17:33.493 "memory_domains": [
00:17:33.493 {
00:17:33.493 "dma_device_id": "system",
00:17:33.493 "dma_device_type": 1
00:17:33.493 },
00:17:33.493 {
00:17:33.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",
00:17:33.493 "dma_device_type": 2
00:17:33.493 }
00:17:33.493 ],
00:17:33.493 "driver_specific": {
00:17:33.493 "passthru": {
00:17:33.493 "name": "pt4",
00:17:33.493 "base_bdev_name": "malloc4"
00:17:33.493 }
00:17:33.493 }
00:17:33.493 }'
00:17:33.493 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:17:33.493 15:56:38 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size
00:17:33.752 15:56:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]]
00:17:33.752 15:56:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:17:33.752 15:56:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size
00:17:33.752 15:56:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]]
00:17:33.752 15:56:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:17:33.752 15:56:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave
00:17:33.752 15:56:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]]
00:17:33.752 15:56:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:17:34.010 15:56:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type
00:17:34.010 15:56:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]]
00:17:34.010 15:56:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1
00:17:34.010 15:56:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid'
00:17:34.268 [2024-06-10 15:56:39.547249] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json
00:17:34.268 15:56:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=166817a2-6fb4-4597-b908-d48c8bf16265
00:17:34.268 15:56:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 166817a2-6fb4-4597-b908-d48c8bf16265 ']'
00:17:34.268 15:56:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1
00:17:34.526 [2024-06-10 15:56:39.799622] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:17:34.526 [2024-06-10 15:56:39.799642] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline
00:17:34.526 [2024-06-10 15:56:39.799694] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:17:34.526 [2024-06-10 15:56:39.799758] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:17:34.526 [2024-06-10 15:56:39.799767] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x125f660 name raid_bdev1, state offline
00:17:34.526 15:56:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:17:34.526 15:56:39 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]'
00:17:34.784 15:56:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev=
00:17:34.784 15:56:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']'
00:17:34.784 15:56:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}"
00:17:34.784 15:56:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1
00:17:35.044 15:56:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}"
00:17:35.044 15:56:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2
00:17:35.303 15:56:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}"
00:17:35.303 15:56:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3
00:17:35.561 15:56:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}"
00:17:35.561 15:56:40 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4
00:17:35.820 15:56:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs
00:17:35.820 15:56:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any'
00:17:36.079 15:56:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']'
00:17:36.079 15:56:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1
00:17:36.079 15:56:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0
00:17:36.079 15:56:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1
00:17:36.079 15:56:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:17:36.079 15:56:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in
00:17:36.079 15:56:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:17:36.079 15:56:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in
00:17:36.079 15:56:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:17:36.079 15:56:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in
00:17:36.079 15:56:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py
00:17:36.079 15:56:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]]
00:17:36.079 15:56:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1
00:17:36.339 [2024-06-10 15:56:41.588413] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed
00:17:36.339 [2024-06-10 15:56:41.589833] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed
00:17:36.339 [2024-06-10 15:56:41.589878] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed
00:17:36.339 [2024-06-10 15:56:41.589913] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed
00:17:36.339 [2024-06-10 15:56:41.589966] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1
00:17:36.339 [2024-06-10 15:56:41.590002] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2
00:17:36.339 [2024-06-10 15:56:41.590022] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3
00:17:36.339 [2024-06-10 15:56:41.590041] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4
00:17:36.339 [2024-06-10 15:56:41.590055] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1
00:17:36.339 [2024-06-10 15:56:41.590063] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x125db70 name raid_bdev1, state configuring
00:17:36.339 request:
00:17:36.339 {
00:17:36.339 "name": "raid_bdev1",
00:17:36.339 "raid_level": "raid0",
00:17:36.339 "base_bdevs": [
00:17:36.339 "malloc1",
00:17:36.339 "malloc2",
00:17:36.339 "malloc3",
00:17:36.339 "malloc4"
00:17:36.339 ],
00:17:36.339 "superblock": false,
00:17:36.339 "strip_size_kb": 64,
00:17:36.339 "method": "bdev_raid_create",
00:17:36.339 "req_id": 1
00:17:36.339 }
00:17:36.339 Got JSON-RPC error response
00:17:36.339 response:
00:17:36.339 {
00:17:36.339 "code": -17,
00:17:36.339 "message": "Failed to create RAID bdev raid_bdev1: File exists"
00:17:36.339 }
00:17:36.339 15:56:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1
00:17:36.339 15:56:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 ))
00:17:36.339 15:56:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]]
00:17:36.339 15:56:41 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 ))
00:17:36.339 15:56:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:17:36.339 15:56:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]'
00:17:36.598 15:56:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev=
00:17:36.598 15:56:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']'
00:17:36.598 15:56:41 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001
00:17:36.598 [2024-06-10 15:56:42.085663] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1
00:17:36.598 [2024-06-10 15:56:42.085700] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:17:36.598 [2024-06-10 15:56:42.085716] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x125cd20
00:17:36.598 [2024-06-10 15:56:42.085726] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:17:36.598 [2024-06-10 15:56:42.087388] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:17:36.598 [2024-06-10 15:56:42.087415] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1
00:17:36.598 [2024-06-10 15:56:42.087475] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1
00:17:36.598 [2024-06-10 15:56:42.087500] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed
00:17:36.598 pt1
00:17:36.598 15:56:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4
00:17:36.598 15:56:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:17:36.598 15:56:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:17:36.598 15:56:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:17:36.598 15:56:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:17:36.598 15:56:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:17:36.598 15:56:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:17:36.598 15:56:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:17:36.598 15:56:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:17:36.598 15:56:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:17:36.855 15:56:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:17:36.855 15:56:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:17:36.855 15:56:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:17:36.855 "name": "raid_bdev1",
00:17:36.855 "uuid": "166817a2-6fb4-4597-b908-d48c8bf16265",
00:17:36.855 "strip_size_kb": 64,
00:17:36.855 "state": "configuring",
00:17:36.855 "raid_level": "raid0",
00:17:36.855 "superblock": true,
00:17:36.855 "num_base_bdevs": 4,
00:17:36.855 "num_base_bdevs_discovered": 1,
00:17:36.855 "num_base_bdevs_operational": 4,
00:17:36.855 "base_bdevs_list": [
00:17:36.855 {
00:17:36.855 "name": "pt1",
00:17:36.855 "uuid": "00000000-0000-0000-0000-000000000001",
00:17:36.855 "is_configured": true,
00:17:36.855 "data_offset": 2048,
00:17:36.855 "data_size": 63488
00:17:36.855 },
00:17:36.855 {
00:17:36.855 "name": null,
00:17:36.855 "uuid": "00000000-0000-0000-0000-000000000002",
00:17:36.855 "is_configured": false,
00:17:36.855 "data_offset": 2048,
00:17:36.855 "data_size": 63488
00:17:36.855 },
00:17:36.855 {
00:17:36.855 "name": null,
00:17:36.855 "uuid": "00000000-0000-0000-0000-000000000003",
00:17:36.855 "is_configured": false,
00:17:36.855 "data_offset": 2048,
00:17:36.855 "data_size": 63488
00:17:36.855 },
00:17:36.855 {
00:17:36.855 "name": null,
00:17:36.855 "uuid": "00000000-0000-0000-0000-000000000004",
00:17:36.855 "is_configured": false,
00:17:36.855 "data_offset": 2048,
00:17:36.855 "data_size": 63488
00:17:36.855 }
00:17:36.855 ]
00:17:36.855 }'
00:17:36.855 15:56:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable
00:17:36.855 15:56:42 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x
00:17:37.792 15:56:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']'
00:17:37.792 15:56:42 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002
00:17:37.792 [2024-06-10 15:56:43.232740] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2
00:17:37.792 [2024-06-10 15:56:43.232785] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened
00:17:37.792 [2024-06-10 15:56:43.232800] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1258b50
00:17:37.792 [2024-06-10 15:56:43.232809] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed
00:17:37.792 [2024-06-10 15:56:43.233156] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered
00:17:37.792 [2024-06-10 15:56:43.233173] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2
00:17:37.792 [2024-06-10 15:56:43.233231] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2
00:17:37.792 [2024-06-10 15:56:43.233249] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed
00:17:37.792 pt2
00:17:37.792 15:56:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2
00:17:38.051 [2024-06-10 15:56:43.489452] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2
00:17:38.051 15:56:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid0 64 4
00:17:38.051 15:56:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:17:38.051 15:56:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring
00:17:38.051 15:56:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0
00:17:38.051 15:56:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64
00:17:38.051 15:56:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4
00:17:38.051 15:56:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:17:38.051 15:56:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:17:38.051 15:56:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:17:38.051 15:56:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp
00:17:38.051 15:56:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:17:38.052 15:56:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")'
00:17:38.310 15:56:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{
00:17:38.310 "name": "raid_bdev1",
00:17:38.310 "uuid": "166817a2-6fb4-4597-b908-d48c8bf16265",
00:17:38.310 "strip_size_kb": 64,
00:17:38.310 "state": "configuring",
00:17:38.310 "raid_level": "raid0",
00:17:38.310 "superblock": true,
00:17:38.310 "num_base_bdevs": 4, 00:17:38.310 "num_base_bdevs_discovered": 1, 00:17:38.310 "num_base_bdevs_operational": 4, 00:17:38.310 "base_bdevs_list": [ 00:17:38.310 { 00:17:38.310 "name": "pt1", 00:17:38.310 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:38.310 "is_configured": true, 00:17:38.310 "data_offset": 2048, 00:17:38.310 "data_size": 63488 00:17:38.310 }, 00:17:38.310 { 00:17:38.310 "name": null, 00:17:38.310 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:38.310 "is_configured": false, 00:17:38.310 "data_offset": 2048, 00:17:38.310 "data_size": 63488 00:17:38.310 }, 00:17:38.310 { 00:17:38.310 "name": null, 00:17:38.310 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:38.310 "is_configured": false, 00:17:38.310 "data_offset": 2048, 00:17:38.310 "data_size": 63488 00:17:38.310 }, 00:17:38.310 { 00:17:38.310 "name": null, 00:17:38.310 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:38.310 "is_configured": false, 00:17:38.310 "data_offset": 2048, 00:17:38.310 "data_size": 63488 00:17:38.310 } 00:17:38.310 ] 00:17:38.310 }' 00:17:38.310 15:56:43 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:38.310 15:56:43 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:39.244 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:17:39.244 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:39.244 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:17:39.244 [2024-06-10 15:56:44.628496] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:17:39.244 [2024-06-10 15:56:44.628539] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:39.244 [2024-06-10 
15:56:44.628555] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x125c3c0 00:17:39.244 [2024-06-10 15:56:44.628563] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:39.245 [2024-06-10 15:56:44.628900] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:39.245 [2024-06-10 15:56:44.628916] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:17:39.245 [2024-06-10 15:56:44.628985] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:17:39.245 [2024-06-10 15:56:44.629004] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:17:39.245 pt2 00:17:39.245 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:39.245 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:39.245 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:17:39.502 [2024-06-10 15:56:44.885191] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:17:39.502 [2024-06-10 15:56:44.885227] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:39.503 [2024-06-10 15:56:44.885248] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12583f0 00:17:39.503 [2024-06-10 15:56:44.885257] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:39.503 [2024-06-10 15:56:44.885565] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:39.503 [2024-06-10 15:56:44.885580] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:17:39.503 [2024-06-10 15:56:44.885632] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt3 00:17:39.503 [2024-06-10 15:56:44.885649] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:17:39.503 pt3 00:17:39.503 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:39.503 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:39.503 15:56:44 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:17:39.762 [2024-06-10 15:56:45.145883] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:17:39.762 [2024-06-10 15:56:45.145915] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:39.762 [2024-06-10 15:56:45.145928] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x125e940 00:17:39.762 [2024-06-10 15:56:45.145937] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:39.762 [2024-06-10 15:56:45.146248] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:39.762 [2024-06-10 15:56:45.146264] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:17:39.762 [2024-06-10 15:56:45.146312] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:17:39.762 [2024-06-10 15:56:45.146328] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:17:39.762 [2024-06-10 15:56:45.146448] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1258e70 00:17:39.762 [2024-06-10 15:56:45.146457] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:39.762 [2024-06-10 15:56:45.146626] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x12576c0 00:17:39.762 [2024-06-10 15:56:45.146758] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1258e70 00:17:39.762 [2024-06-10 15:56:45.146766] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1258e70 00:17:39.762 [2024-06-10 15:56:45.146861] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:39.762 pt4 00:17:39.762 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:17:39.762 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:17:39.762 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:39.762 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:39.762 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:39.762 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:39.762 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:39.762 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:39.762 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:39.762 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:39.762 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:39.762 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:39.762 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:39.762 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:17:40.020 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:40.020 "name": "raid_bdev1", 00:17:40.020 "uuid": "166817a2-6fb4-4597-b908-d48c8bf16265", 00:17:40.020 "strip_size_kb": 64, 00:17:40.020 "state": "online", 00:17:40.020 "raid_level": "raid0", 00:17:40.020 "superblock": true, 00:17:40.020 "num_base_bdevs": 4, 00:17:40.020 "num_base_bdevs_discovered": 4, 00:17:40.020 "num_base_bdevs_operational": 4, 00:17:40.020 "base_bdevs_list": [ 00:17:40.020 { 00:17:40.020 "name": "pt1", 00:17:40.020 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:40.020 "is_configured": true, 00:17:40.020 "data_offset": 2048, 00:17:40.020 "data_size": 63488 00:17:40.020 }, 00:17:40.020 { 00:17:40.020 "name": "pt2", 00:17:40.020 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:40.020 "is_configured": true, 00:17:40.020 "data_offset": 2048, 00:17:40.020 "data_size": 63488 00:17:40.020 }, 00:17:40.020 { 00:17:40.020 "name": "pt3", 00:17:40.020 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:40.020 "is_configured": true, 00:17:40.020 "data_offset": 2048, 00:17:40.020 "data_size": 63488 00:17:40.020 }, 00:17:40.020 { 00:17:40.020 "name": "pt4", 00:17:40.020 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:40.020 "is_configured": true, 00:17:40.020 "data_offset": 2048, 00:17:40.020 "data_size": 63488 00:17:40.020 } 00:17:40.020 ] 00:17:40.020 }' 00:17:40.020 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:40.020 15:56:45 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:40.588 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:17:40.588 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:17:40.588 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:17:40.588 15:56:45 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:17:40.588 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:17:40.588 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:17:40.588 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:40.588 15:56:45 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:17:40.847 [2024-06-10 15:56:46.205036] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:40.847 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:17:40.847 "name": "raid_bdev1", 00:17:40.847 "aliases": [ 00:17:40.847 "166817a2-6fb4-4597-b908-d48c8bf16265" 00:17:40.847 ], 00:17:40.847 "product_name": "Raid Volume", 00:17:40.847 "block_size": 512, 00:17:40.847 "num_blocks": 253952, 00:17:40.847 "uuid": "166817a2-6fb4-4597-b908-d48c8bf16265", 00:17:40.847 "assigned_rate_limits": { 00:17:40.847 "rw_ios_per_sec": 0, 00:17:40.847 "rw_mbytes_per_sec": 0, 00:17:40.847 "r_mbytes_per_sec": 0, 00:17:40.847 "w_mbytes_per_sec": 0 00:17:40.847 }, 00:17:40.847 "claimed": false, 00:17:40.847 "zoned": false, 00:17:40.847 "supported_io_types": { 00:17:40.847 "read": true, 00:17:40.847 "write": true, 00:17:40.847 "unmap": true, 00:17:40.847 "write_zeroes": true, 00:17:40.848 "flush": true, 00:17:40.848 "reset": true, 00:17:40.848 "compare": false, 00:17:40.848 "compare_and_write": false, 00:17:40.848 "abort": false, 00:17:40.848 "nvme_admin": false, 00:17:40.848 "nvme_io": false 00:17:40.848 }, 00:17:40.848 "memory_domains": [ 00:17:40.848 { 00:17:40.848 "dma_device_id": "system", 00:17:40.848 "dma_device_type": 1 00:17:40.848 }, 00:17:40.848 { 00:17:40.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.848 "dma_device_type": 2 00:17:40.848 }, 00:17:40.848 { 
00:17:40.848 "dma_device_id": "system", 00:17:40.848 "dma_device_type": 1 00:17:40.848 }, 00:17:40.848 { 00:17:40.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.848 "dma_device_type": 2 00:17:40.848 }, 00:17:40.848 { 00:17:40.848 "dma_device_id": "system", 00:17:40.848 "dma_device_type": 1 00:17:40.848 }, 00:17:40.848 { 00:17:40.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.848 "dma_device_type": 2 00:17:40.848 }, 00:17:40.848 { 00:17:40.848 "dma_device_id": "system", 00:17:40.848 "dma_device_type": 1 00:17:40.848 }, 00:17:40.848 { 00:17:40.848 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:40.848 "dma_device_type": 2 00:17:40.848 } 00:17:40.848 ], 00:17:40.848 "driver_specific": { 00:17:40.848 "raid": { 00:17:40.848 "uuid": "166817a2-6fb4-4597-b908-d48c8bf16265", 00:17:40.848 "strip_size_kb": 64, 00:17:40.848 "state": "online", 00:17:40.848 "raid_level": "raid0", 00:17:40.848 "superblock": true, 00:17:40.848 "num_base_bdevs": 4, 00:17:40.848 "num_base_bdevs_discovered": 4, 00:17:40.848 "num_base_bdevs_operational": 4, 00:17:40.848 "base_bdevs_list": [ 00:17:40.848 { 00:17:40.848 "name": "pt1", 00:17:40.848 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:40.848 "is_configured": true, 00:17:40.848 "data_offset": 2048, 00:17:40.848 "data_size": 63488 00:17:40.848 }, 00:17:40.848 { 00:17:40.848 "name": "pt2", 00:17:40.848 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:40.848 "is_configured": true, 00:17:40.848 "data_offset": 2048, 00:17:40.848 "data_size": 63488 00:17:40.848 }, 00:17:40.848 { 00:17:40.848 "name": "pt3", 00:17:40.848 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:40.848 "is_configured": true, 00:17:40.848 "data_offset": 2048, 00:17:40.848 "data_size": 63488 00:17:40.848 }, 00:17:40.848 { 00:17:40.848 "name": "pt4", 00:17:40.848 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:40.848 "is_configured": true, 00:17:40.848 "data_offset": 2048, 00:17:40.848 "data_size": 63488 00:17:40.848 } 00:17:40.848 ] 
00:17:40.848 } 00:17:40.848 } 00:17:40.848 }' 00:17:40.848 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:17:40.848 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:17:40.848 pt2 00:17:40.848 pt3 00:17:40.848 pt4' 00:17:40.848 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:40.848 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:17:40.848 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:41.107 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:41.107 "name": "pt1", 00:17:41.107 "aliases": [ 00:17:41.107 "00000000-0000-0000-0000-000000000001" 00:17:41.107 ], 00:17:41.107 "product_name": "passthru", 00:17:41.107 "block_size": 512, 00:17:41.107 "num_blocks": 65536, 00:17:41.107 "uuid": "00000000-0000-0000-0000-000000000001", 00:17:41.107 "assigned_rate_limits": { 00:17:41.107 "rw_ios_per_sec": 0, 00:17:41.107 "rw_mbytes_per_sec": 0, 00:17:41.107 "r_mbytes_per_sec": 0, 00:17:41.107 "w_mbytes_per_sec": 0 00:17:41.107 }, 00:17:41.107 "claimed": true, 00:17:41.107 "claim_type": "exclusive_write", 00:17:41.107 "zoned": false, 00:17:41.107 "supported_io_types": { 00:17:41.107 "read": true, 00:17:41.107 "write": true, 00:17:41.107 "unmap": true, 00:17:41.107 "write_zeroes": true, 00:17:41.107 "flush": true, 00:17:41.107 "reset": true, 00:17:41.107 "compare": false, 00:17:41.107 "compare_and_write": false, 00:17:41.107 "abort": true, 00:17:41.107 "nvme_admin": false, 00:17:41.107 "nvme_io": false 00:17:41.107 }, 00:17:41.107 "memory_domains": [ 00:17:41.107 { 00:17:41.107 "dma_device_id": "system", 00:17:41.107 "dma_device_type": 1 00:17:41.107 }, 
00:17:41.107 { 00:17:41.107 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:41.107 "dma_device_type": 2 00:17:41.107 } 00:17:41.107 ], 00:17:41.107 "driver_specific": { 00:17:41.107 "passthru": { 00:17:41.107 "name": "pt1", 00:17:41.107 "base_bdev_name": "malloc1" 00:17:41.107 } 00:17:41.107 } 00:17:41.107 }' 00:17:41.107 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:41.107 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:41.366 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:41.366 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:41.366 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:41.366 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:41.366 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:41.366 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:41.366 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:41.366 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:41.366 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:41.624 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:41.624 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:41.624 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:17:41.624 15:56:46 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:41.883 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:17:41.883 "name": "pt2", 00:17:41.883 "aliases": [ 00:17:41.883 "00000000-0000-0000-0000-000000000002" 00:17:41.883 ], 00:17:41.883 "product_name": "passthru", 00:17:41.883 "block_size": 512, 00:17:41.883 "num_blocks": 65536, 00:17:41.883 "uuid": "00000000-0000-0000-0000-000000000002", 00:17:41.883 "assigned_rate_limits": { 00:17:41.883 "rw_ios_per_sec": 0, 00:17:41.883 "rw_mbytes_per_sec": 0, 00:17:41.883 "r_mbytes_per_sec": 0, 00:17:41.883 "w_mbytes_per_sec": 0 00:17:41.883 }, 00:17:41.883 "claimed": true, 00:17:41.883 "claim_type": "exclusive_write", 00:17:41.883 "zoned": false, 00:17:41.883 "supported_io_types": { 00:17:41.883 "read": true, 00:17:41.883 "write": true, 00:17:41.883 "unmap": true, 00:17:41.883 "write_zeroes": true, 00:17:41.883 "flush": true, 00:17:41.883 "reset": true, 00:17:41.883 "compare": false, 00:17:41.883 "compare_and_write": false, 00:17:41.883 "abort": true, 00:17:41.883 "nvme_admin": false, 00:17:41.883 "nvme_io": false 00:17:41.883 }, 00:17:41.883 "memory_domains": [ 00:17:41.883 { 00:17:41.883 "dma_device_id": "system", 00:17:41.883 "dma_device_type": 1 00:17:41.883 }, 00:17:41.883 { 00:17:41.883 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:41.883 "dma_device_type": 2 00:17:41.883 } 00:17:41.883 ], 00:17:41.883 "driver_specific": { 00:17:41.883 "passthru": { 00:17:41.883 "name": "pt2", 00:17:41.883 "base_bdev_name": "malloc2" 00:17:41.883 } 00:17:41.883 } 00:17:41.883 }' 00:17:41.883 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:41.883 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:41.883 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:41.883 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:41.883 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:41.883 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- 
# [[ null == null ]] 00:17:41.883 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:41.883 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:42.141 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:42.141 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:42.141 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:42.141 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:42.141 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:42.141 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:17:42.141 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:42.399 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:42.399 "name": "pt3", 00:17:42.399 "aliases": [ 00:17:42.399 "00000000-0000-0000-0000-000000000003" 00:17:42.399 ], 00:17:42.399 "product_name": "passthru", 00:17:42.399 "block_size": 512, 00:17:42.399 "num_blocks": 65536, 00:17:42.399 "uuid": "00000000-0000-0000-0000-000000000003", 00:17:42.399 "assigned_rate_limits": { 00:17:42.399 "rw_ios_per_sec": 0, 00:17:42.399 "rw_mbytes_per_sec": 0, 00:17:42.399 "r_mbytes_per_sec": 0, 00:17:42.399 "w_mbytes_per_sec": 0 00:17:42.399 }, 00:17:42.399 "claimed": true, 00:17:42.399 "claim_type": "exclusive_write", 00:17:42.399 "zoned": false, 00:17:42.399 "supported_io_types": { 00:17:42.399 "read": true, 00:17:42.399 "write": true, 00:17:42.399 "unmap": true, 00:17:42.399 "write_zeroes": true, 00:17:42.399 "flush": true, 00:17:42.399 "reset": true, 00:17:42.399 "compare": false, 00:17:42.399 "compare_and_write": false, 
00:17:42.399 "abort": true, 00:17:42.399 "nvme_admin": false, 00:17:42.399 "nvme_io": false 00:17:42.399 }, 00:17:42.399 "memory_domains": [ 00:17:42.399 { 00:17:42.399 "dma_device_id": "system", 00:17:42.399 "dma_device_type": 1 00:17:42.399 }, 00:17:42.399 { 00:17:42.399 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.399 "dma_device_type": 2 00:17:42.399 } 00:17:42.399 ], 00:17:42.399 "driver_specific": { 00:17:42.399 "passthru": { 00:17:42.399 "name": "pt3", 00:17:42.399 "base_bdev_name": "malloc3" 00:17:42.399 } 00:17:42.399 } 00:17:42.399 }' 00:17:42.399 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:42.399 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:42.399 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:42.399 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:42.657 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:42.657 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:42.657 15:56:47 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:42.657 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:42.657 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:42.657 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:42.657 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:42.657 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:42.657 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:17:42.657 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:17:42.657 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:17:42.916 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:17:42.916 "name": "pt4", 00:17:42.916 "aliases": [ 00:17:42.916 "00000000-0000-0000-0000-000000000004" 00:17:42.916 ], 00:17:42.916 "product_name": "passthru", 00:17:42.916 "block_size": 512, 00:17:42.916 "num_blocks": 65536, 00:17:42.916 "uuid": "00000000-0000-0000-0000-000000000004", 00:17:42.916 "assigned_rate_limits": { 00:17:42.916 "rw_ios_per_sec": 0, 00:17:42.916 "rw_mbytes_per_sec": 0, 00:17:42.916 "r_mbytes_per_sec": 0, 00:17:42.916 "w_mbytes_per_sec": 0 00:17:42.916 }, 00:17:42.916 "claimed": true, 00:17:42.916 "claim_type": "exclusive_write", 00:17:42.916 "zoned": false, 00:17:42.916 "supported_io_types": { 00:17:42.916 "read": true, 00:17:42.916 "write": true, 00:17:42.916 "unmap": true, 00:17:42.916 "write_zeroes": true, 00:17:42.916 "flush": true, 00:17:42.916 "reset": true, 00:17:42.916 "compare": false, 00:17:42.916 "compare_and_write": false, 00:17:42.916 "abort": true, 00:17:42.916 "nvme_admin": false, 00:17:42.916 "nvme_io": false 00:17:42.916 }, 00:17:42.916 "memory_domains": [ 00:17:42.916 { 00:17:42.916 "dma_device_id": "system", 00:17:42.916 "dma_device_type": 1 00:17:42.916 }, 00:17:42.916 { 00:17:42.916 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:17:42.916 "dma_device_type": 2 00:17:42.916 } 00:17:42.916 ], 00:17:42.916 "driver_specific": { 00:17:42.916 "passthru": { 00:17:42.916 "name": "pt4", 00:17:42.916 "base_bdev_name": "malloc4" 00:17:42.916 } 00:17:42.916 } 00:17:42.916 }' 00:17:42.916 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.176 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:17:43.176 15:56:48 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:17:43.176 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.176 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:17:43.176 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:17:43.176 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.176 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:17:43.176 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:17:43.176 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:43.435 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:17:43.435 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:17:43.435 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:17:43.435 15:56:48 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:17:43.693 [2024-06-10 15:56:49.004527] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:17:43.693 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 166817a2-6fb4-4597-b908-d48c8bf16265 '!=' 166817a2-6fb4-4597-b908-d48c8bf16265 ']' 00:17:43.693 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid0 00:17:43.693 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:43.693 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:43.693 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2718506 00:17:43.693 15:56:49 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@949 -- # '[' -z 2718506 ']' 00:17:43.693 15:56:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 2718506 00:17:43.693 15:56:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:17:43.693 15:56:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:43.693 15:56:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2718506 00:17:43.693 15:56:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:43.693 15:56:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:43.693 15:56:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2718506' 00:17:43.693 killing process with pid 2718506 00:17:43.693 15:56:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 2718506 00:17:43.693 [2024-06-10 15:56:49.053751] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:43.693 [2024-06-10 15:56:49.053815] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:43.693 [2024-06-10 15:56:49.053886] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:43.693 [2024-06-10 15:56:49.053896] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1258e70 name raid_bdev1, state offline 00:17:43.693 15:56:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 2718506 00:17:43.693 [2024-06-10 15:56:49.087460] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:43.952 15:56:49 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:17:43.952 00:17:43.952 real 0m16.984s 00:17:43.952 user 0m31.405s 00:17:43.952 sys 0m2.362s 00:17:43.952 15:56:49 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@1125 -- # xtrace_disable 00:17:43.952 15:56:49 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:17:43.952 ************************************ 00:17:43.952 END TEST raid_superblock_test 00:17:43.952 ************************************ 00:17:43.952 15:56:49 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid0 4 read 00:17:43.952 15:56:49 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:17:43.952 15:56:49 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:43.952 15:56:49 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:43.952 ************************************ 00:17:43.952 START TEST raid_read_error_test 00:17:43.952 ************************************ 00:17:43.952 15:56:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 4 read 00:17:43.952 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:17:43.952 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:17:43.952 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:17:43.952 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:43.952 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:43.952 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:43.952 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:43.952 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:43.952 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:43.953 15:56:49 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.ec06Hz0IGe 00:17:43.953 
15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2721724 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2721724 /var/tmp/spdk-raid.sock 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 2721724 ']' 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:43.953 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:43.953 15:56:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:43.953 [2024-06-10 15:56:49.418031] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:17:43.953 [2024-06-10 15:56:49.418083] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2721724 ] 00:17:44.211 [2024-06-10 15:56:49.533760] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:44.211 [2024-06-10 15:56:49.662413] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:44.211 [2024-06-10 15:56:49.720962] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:44.211 [2024-06-10 15:56:49.720992] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:44.469 15:56:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:44.469 15:56:49 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:17:44.469 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:44.469 15:56:49 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:44.728 BaseBdev1_malloc 00:17:44.728 15:56:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:44.986 true 00:17:44.987 15:56:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:45.245 [2024-06-10 15:56:50.520737] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:45.245 [2024-06-10 15:56:50.520777] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:45.245 
[2024-06-10 15:56:50.520795] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf16150 00:17:45.245 [2024-06-10 15:56:50.520804] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:45.245 [2024-06-10 15:56:50.522650] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:45.245 [2024-06-10 15:56:50.522677] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:45.245 BaseBdev1 00:17:45.245 15:56:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:45.245 15:56:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:45.504 BaseBdev2_malloc 00:17:45.504 15:56:50 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:45.763 true 00:17:45.763 15:56:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:46.022 [2024-06-10 15:56:51.287382] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:46.022 [2024-06-10 15:56:51.287422] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:46.022 [2024-06-10 15:56:51.287439] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf1ab50 00:17:46.022 [2024-06-10 15:56:51.287449] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:46.022 [2024-06-10 15:56:51.289063] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:46.022 [2024-06-10 15:56:51.289088] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:46.022 BaseBdev2 00:17:46.022 15:56:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:46.022 15:56:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:46.281 BaseBdev3_malloc 00:17:46.281 15:56:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:46.540 true 00:17:46.540 15:56:51 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:46.798 [2024-06-10 15:56:52.053868] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:46.798 [2024-06-10 15:56:52.053906] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:46.798 [2024-06-10 15:56:52.053924] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf1b780 00:17:46.798 [2024-06-10 15:56:52.053933] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:46.798 [2024-06-10 15:56:52.055546] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:46.798 [2024-06-10 15:56:52.055572] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:46.798 BaseBdev3 00:17:46.798 15:56:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:46.798 15:56:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b 
BaseBdev4_malloc 00:17:46.798 BaseBdev4_malloc 00:17:47.057 15:56:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:17:47.057 true 00:17:47.315 15:56:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:17:47.315 [2024-06-10 15:56:52.812286] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:17:47.315 [2024-06-10 15:56:52.812324] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:47.315 [2024-06-10 15:56:52.812344] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf14ee0 00:17:47.315 [2024-06-10 15:56:52.812353] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:47.315 [2024-06-10 15:56:52.813990] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:47.315 [2024-06-10 15:56:52.814022] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:17:47.315 BaseBdev4 00:17:47.574 15:56:52 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:17:47.574 [2024-06-10 15:56:53.056966] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:47.574 [2024-06-10 15:56:53.058273] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:47.574 [2024-06-10 15:56:53.058342] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:47.574 [2024-06-10 15:56:53.058406] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev4 is claimed 00:17:47.574 [2024-06-10 15:56:53.058638] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf1e5f0 00:17:47.574 [2024-06-10 15:56:53.058649] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:47.574 [2024-06-10 15:56:53.058836] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xd65510 00:17:47.574 [2024-06-10 15:56:53.059001] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf1e5f0 00:17:47.574 [2024-06-10 15:56:53.059011] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf1e5f0 00:17:47.574 [2024-06-10 15:56:53.059117] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:47.574 15:56:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:47.574 15:56:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:47.574 15:56:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:47.574 15:56:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:47.574 15:56:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:47.574 15:56:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:47.574 15:56:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:47.574 15:56:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:47.574 15:56:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:47.574 15:56:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:47.574 15:56:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:47.574 15:56:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:48.141 15:56:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:48.141 "name": "raid_bdev1", 00:17:48.141 "uuid": "267ba953-cdd1-48b4-bfa2-0a0f416ec128", 00:17:48.141 "strip_size_kb": 64, 00:17:48.141 "state": "online", 00:17:48.141 "raid_level": "raid0", 00:17:48.141 "superblock": true, 00:17:48.141 "num_base_bdevs": 4, 00:17:48.141 "num_base_bdevs_discovered": 4, 00:17:48.141 "num_base_bdevs_operational": 4, 00:17:48.141 "base_bdevs_list": [ 00:17:48.141 { 00:17:48.141 "name": "BaseBdev1", 00:17:48.141 "uuid": "025aab4d-de67-59aa-a9b7-1a3dd9ca4c08", 00:17:48.141 "is_configured": true, 00:17:48.141 "data_offset": 2048, 00:17:48.141 "data_size": 63488 00:17:48.141 }, 00:17:48.141 { 00:17:48.141 "name": "BaseBdev2", 00:17:48.141 "uuid": "7a832d33-53fd-5dc5-bde9-1d169e6cd971", 00:17:48.141 "is_configured": true, 00:17:48.141 "data_offset": 2048, 00:17:48.141 "data_size": 63488 00:17:48.141 }, 00:17:48.141 { 00:17:48.141 "name": "BaseBdev3", 00:17:48.141 "uuid": "95a49555-4b42-5057-83dd-66c3a1d02c85", 00:17:48.141 "is_configured": true, 00:17:48.141 "data_offset": 2048, 00:17:48.141 "data_size": 63488 00:17:48.141 }, 00:17:48.141 { 00:17:48.141 "name": "BaseBdev4", 00:17:48.141 "uuid": "0acf2f3b-c09a-55ef-8124-aab7b917ebaa", 00:17:48.141 "is_configured": true, 00:17:48.141 "data_offset": 2048, 00:17:48.141 "data_size": 63488 00:17:48.141 } 00:17:48.141 ] 00:17:48.141 }' 00:17:48.141 15:56:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:48.141 15:56:53 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:48.400 15:56:53 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:48.400 15:56:53 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:48.658 [2024-06-10 15:56:54.007759] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf174a0 00:17:49.596 15:56:54 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:17:49.854 15:56:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:49.854 15:56:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:17:49.854 15:56:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:17:49.854 15:56:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:49.854 15:56:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:49.854 15:56:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:49.854 15:56:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:49.854 15:56:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:49.854 15:56:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:49.854 15:56:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:49.854 15:56:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:49.854 15:56:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:49.854 15:56:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:49.854 15:56:55 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:49.854 15:56:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:50.113 15:56:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:50.113 "name": "raid_bdev1", 00:17:50.113 "uuid": "267ba953-cdd1-48b4-bfa2-0a0f416ec128", 00:17:50.113 "strip_size_kb": 64, 00:17:50.113 "state": "online", 00:17:50.114 "raid_level": "raid0", 00:17:50.114 "superblock": true, 00:17:50.114 "num_base_bdevs": 4, 00:17:50.114 "num_base_bdevs_discovered": 4, 00:17:50.114 "num_base_bdevs_operational": 4, 00:17:50.114 "base_bdevs_list": [ 00:17:50.114 { 00:17:50.114 "name": "BaseBdev1", 00:17:50.114 "uuid": "025aab4d-de67-59aa-a9b7-1a3dd9ca4c08", 00:17:50.114 "is_configured": true, 00:17:50.114 "data_offset": 2048, 00:17:50.114 "data_size": 63488 00:17:50.114 }, 00:17:50.114 { 00:17:50.114 "name": "BaseBdev2", 00:17:50.114 "uuid": "7a832d33-53fd-5dc5-bde9-1d169e6cd971", 00:17:50.114 "is_configured": true, 00:17:50.114 "data_offset": 2048, 00:17:50.114 "data_size": 63488 00:17:50.114 }, 00:17:50.114 { 00:17:50.114 "name": "BaseBdev3", 00:17:50.114 "uuid": "95a49555-4b42-5057-83dd-66c3a1d02c85", 00:17:50.114 "is_configured": true, 00:17:50.114 "data_offset": 2048, 00:17:50.114 "data_size": 63488 00:17:50.114 }, 00:17:50.114 { 00:17:50.114 "name": "BaseBdev4", 00:17:50.114 "uuid": "0acf2f3b-c09a-55ef-8124-aab7b917ebaa", 00:17:50.114 "is_configured": true, 00:17:50.114 "data_offset": 2048, 00:17:50.114 "data_size": 63488 00:17:50.114 } 00:17:50.114 ] 00:17:50.114 }' 00:17:50.114 15:56:55 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:50.114 15:56:55 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:50.681 15:56:56 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:50.941 [2024-06-10 15:56:56.201917] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:50.941 [2024-06-10 15:56:56.201950] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:50.941 [2024-06-10 15:56:56.205374] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:50.941 [2024-06-10 15:56:56.205413] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:50.941 [2024-06-10 15:56:56.205451] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:50.941 [2024-06-10 15:56:56.205465] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf1e5f0 name raid_bdev1, state offline 00:17:50.941 0 00:17:50.941 15:56:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2721724 00:17:50.941 15:56:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 2721724 ']' 00:17:50.941 15:56:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 2721724 00:17:50.941 15:56:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:17:50.941 15:56:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:50.941 15:56:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2721724 00:17:50.941 15:56:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:50.941 15:56:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:50.941 15:56:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2721724' 00:17:50.941 killing process with pid 2721724 
00:17:50.941 15:56:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 2721724 00:17:50.941 [2024-06-10 15:56:56.268928] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:50.941 15:56:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 2721724 00:17:50.941 [2024-06-10 15:56:56.297573] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:51.200 15:56:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.ec06Hz0IGe 00:17:51.200 15:56:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:51.200 15:56:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:51.200 15:56:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.46 00:17:51.200 15:56:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:17:51.200 15:56:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:51.200 15:56:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:51.200 15:56:56 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.46 != \0\.\0\0 ]] 00:17:51.200 00:17:51.200 real 0m7.162s 00:17:51.200 user 0m12.053s 00:17:51.200 sys 0m1.059s 00:17:51.200 15:56:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:51.200 15:56:56 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:51.200 ************************************ 00:17:51.200 END TEST raid_read_error_test 00:17:51.200 ************************************ 00:17:51.200 15:56:56 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid0 4 write 00:17:51.200 15:56:56 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:17:51.200 15:56:56 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:51.200 15:56:56 
bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:51.200 ************************************ 00:17:51.200 START TEST raid_write_error_test 00:17:51.200 ************************************ 00:17:51.200 15:56:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid0 4 write 00:17:51.200 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid0 00:17:51.200 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:17:51.200 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:17:51.200 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:17:51.200 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:51.200 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:17:51.200 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:51.200 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:17:51.201 15:56:56 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid0 '!=' raid1 ']' 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.hWn7WaVSyC 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2722958 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2722958 /var/tmp/spdk-raid.sock 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 2722958 ']' 00:17:51.201 
15:56:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:51.201 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:51.201 15:56:56 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:51.201 [2024-06-10 15:56:56.652194] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:17:51.201 [2024-06-10 15:56:56.652250] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2722958 ] 00:17:51.460 [2024-06-10 15:56:56.752100] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:51.460 [2024-06-10 15:56:56.846179] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:51.460 [2024-06-10 15:56:56.900051] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:51.460 [2024-06-10 15:56:56.900079] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:52.397 15:56:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:52.397 15:56:57 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:17:52.397 15:56:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:52.397 15:56:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:17:52.397 BaseBdev1_malloc 00:17:52.397 15:56:57 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:17:52.656 true 00:17:52.656 15:56:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:17:52.914 [2024-06-10 15:56:58.365787] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:17:52.914 [2024-06-10 15:56:58.365831] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:52.914 [2024-06-10 15:56:58.365849] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1869150 00:17:52.914 [2024-06-10 15:56:58.365858] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:52.914 [2024-06-10 15:56:58.367624] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:52.914 [2024-06-10 15:56:58.367653] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:17:52.914 BaseBdev1 00:17:52.914 15:56:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:52.914 15:56:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:17:53.172 BaseBdev2_malloc 00:17:53.172 15:56:58 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:17:53.431 true 00:17:53.431 15:56:58 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:17:53.689 [2024-06-10 15:56:59.060165] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:17:53.689 [2024-06-10 15:56:59.060205] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:53.689 [2024-06-10 15:56:59.060222] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x186db50 00:17:53.689 [2024-06-10 15:56:59.060231] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:53.689 [2024-06-10 15:56:59.061738] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:53.689 [2024-06-10 15:56:59.061764] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:17:53.689 BaseBdev2 00:17:53.689 15:56:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:53.689 15:56:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:17:53.948 BaseBdev3_malloc 00:17:53.948 15:56:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:17:54.206 true 00:17:54.206 15:56:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:17:54.464 [2024-06-10 15:56:59.754350] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:17:54.464 [2024-06-10 15:56:59.754389] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:54.464 [2024-06-10 15:56:59.754404] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x186e780 00:17:54.464 [2024-06-10 15:56:59.754414] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:54.464 [2024-06-10 15:56:59.755859] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:17:54.464 [2024-06-10 15:56:59.755884] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:17:54.464 BaseBdev3 00:17:54.464 15:56:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:17:54.464 15:56:59 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:17:54.722 BaseBdev4_malloc 00:17:54.722 15:57:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:17:54.981 true 00:17:54.981 15:57:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:17:55.239 [2024-06-10 15:57:00.532825] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:17:55.239 [2024-06-10 15:57:00.532863] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:17:55.239 [2024-06-10 15:57:00.532883] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1867ee0 00:17:55.239 [2024-06-10 15:57:00.532893] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:17:55.239 [2024-06-10 15:57:00.534426] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:17:55.239 [2024-06-10 15:57:00.534452] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:17:55.239 BaseBdev4 00:17:55.239 15:57:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r raid0 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:17:55.239 [2024-06-10 15:57:00.713341] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:17:55.239 [2024-06-10 15:57:00.714670] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:17:55.239 [2024-06-10 15:57:00.714741] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:17:55.239 [2024-06-10 15:57:00.714804] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:17:55.239 [2024-06-10 15:57:00.715055] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18715f0 00:17:55.239 [2024-06-10 15:57:00.715067] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:17:55.239 [2024-06-10 15:57:00.715261] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x16b8510 00:17:55.239 [2024-06-10 15:57:00.715419] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18715f0 00:17:55.239 [2024-06-10 15:57:00.715428] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18715f0 00:17:55.239 [2024-06-10 15:57:00.715531] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:55.239 15:57:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:55.239 15:57:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:17:55.239 15:57:00 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:55.239 15:57:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:55.239 15:57:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:55.239 15:57:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:55.240 15:57:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:55.240 15:57:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:55.240 15:57:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:55.240 15:57:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:55.240 15:57:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:55.240 15:57:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:55.499 15:57:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:55.499 "name": "raid_bdev1", 00:17:55.499 "uuid": "06f5c4e2-ba45-41f9-bfa4-2d6ce7fbda62", 00:17:55.499 "strip_size_kb": 64, 00:17:55.499 "state": "online", 00:17:55.499 "raid_level": "raid0", 00:17:55.499 "superblock": true, 00:17:55.499 "num_base_bdevs": 4, 00:17:55.499 "num_base_bdevs_discovered": 4, 00:17:55.499 "num_base_bdevs_operational": 4, 00:17:55.499 "base_bdevs_list": [ 00:17:55.499 { 00:17:55.499 "name": "BaseBdev1", 00:17:55.499 "uuid": "aec88c7b-bd2b-50cf-802d-b31d62d51002", 00:17:55.499 "is_configured": true, 00:17:55.499 "data_offset": 2048, 00:17:55.499 "data_size": 63488 00:17:55.499 }, 00:17:55.499 { 00:17:55.499 "name": "BaseBdev2", 00:17:55.499 "uuid": "1e7f1511-bfa9-57c6-b55d-60989f14768b", 00:17:55.499 "is_configured": true, 
00:17:55.499 "data_offset": 2048, 00:17:55.499 "data_size": 63488 00:17:55.499 }, 00:17:55.499 { 00:17:55.499 "name": "BaseBdev3", 00:17:55.499 "uuid": "f59a36a3-70db-584b-b397-a56ae3a97cb4", 00:17:55.499 "is_configured": true, 00:17:55.499 "data_offset": 2048, 00:17:55.499 "data_size": 63488 00:17:55.499 }, 00:17:55.499 { 00:17:55.499 "name": "BaseBdev4", 00:17:55.499 "uuid": "ecc0cf3e-779c-5d19-b602-01f7a864ec27", 00:17:55.499 "is_configured": true, 00:17:55.499 "data_offset": 2048, 00:17:55.499 "data_size": 63488 00:17:55.499 } 00:17:55.499 ] 00:17:55.499 }' 00:17:55.499 15:57:00 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:55.499 15:57:00 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:56.066 15:57:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:17:56.066 15:57:01 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:17:56.325 [2024-06-10 15:57:01.603967] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x186a4a0 00:17:57.366 15:57:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:17:57.366 15:57:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:17:57.366 15:57:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid0 = \r\a\i\d\1 ]] 00:17:57.366 15:57:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:17:57.366 15:57:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid0 64 4 00:17:57.366 15:57:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:17:57.366 15:57:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:17:57.366 15:57:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid0 00:17:57.366 15:57:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:57.367 15:57:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:57.367 15:57:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:57.367 15:57:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:57.367 15:57:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:57.367 15:57:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:57.367 15:57:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:57.367 15:57:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:17:57.636 15:57:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:17:57.636 "name": "raid_bdev1", 00:17:57.636 "uuid": "06f5c4e2-ba45-41f9-bfa4-2d6ce7fbda62", 00:17:57.636 "strip_size_kb": 64, 00:17:57.636 "state": "online", 00:17:57.636 "raid_level": "raid0", 00:17:57.636 "superblock": true, 00:17:57.636 "num_base_bdevs": 4, 00:17:57.636 "num_base_bdevs_discovered": 4, 00:17:57.636 "num_base_bdevs_operational": 4, 00:17:57.636 "base_bdevs_list": [ 00:17:57.636 { 00:17:57.636 "name": "BaseBdev1", 00:17:57.636 "uuid": "aec88c7b-bd2b-50cf-802d-b31d62d51002", 00:17:57.636 "is_configured": true, 00:17:57.636 "data_offset": 2048, 00:17:57.636 "data_size": 63488 00:17:57.636 }, 00:17:57.636 { 00:17:57.636 "name": "BaseBdev2", 00:17:57.636 
"uuid": "1e7f1511-bfa9-57c6-b55d-60989f14768b", 00:17:57.636 "is_configured": true, 00:17:57.636 "data_offset": 2048, 00:17:57.636 "data_size": 63488 00:17:57.636 }, 00:17:57.636 { 00:17:57.636 "name": "BaseBdev3", 00:17:57.636 "uuid": "f59a36a3-70db-584b-b397-a56ae3a97cb4", 00:17:57.636 "is_configured": true, 00:17:57.636 "data_offset": 2048, 00:17:57.636 "data_size": 63488 00:17:57.636 }, 00:17:57.637 { 00:17:57.637 "name": "BaseBdev4", 00:17:57.637 "uuid": "ecc0cf3e-779c-5d19-b602-01f7a864ec27", 00:17:57.637 "is_configured": true, 00:17:57.637 "data_offset": 2048, 00:17:57.637 "data_size": 63488 00:17:57.637 } 00:17:57.637 ] 00:17:57.637 }' 00:17:57.637 15:57:02 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:17:57.637 15:57:02 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:58.204 15:57:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:17:58.204 [2024-06-10 15:57:03.664775] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:17:58.204 [2024-06-10 15:57:03.664815] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:17:58.204 [2024-06-10 15:57:03.668247] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:17:58.204 [2024-06-10 15:57:03.668285] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:17:58.204 [2024-06-10 15:57:03.668324] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:17:58.204 [2024-06-10 15:57:03.668332] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18715f0 name raid_bdev1, state offline 00:17:58.204 0 00:17:58.204 15:57:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2722958 00:17:58.204 15:57:03 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 2722958 ']' 00:17:58.204 15:57:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 2722958 00:17:58.204 15:57:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:17:58.204 15:57:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:17:58.204 15:57:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2722958 00:17:58.464 15:57:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:17:58.464 15:57:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:17:58.464 15:57:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2722958' 00:17:58.464 killing process with pid 2722958 00:17:58.464 15:57:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 2722958 00:17:58.464 [2024-06-10 15:57:03.726111] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:17:58.464 15:57:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 2722958 00:17:58.464 [2024-06-10 15:57:03.754169] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:17:58.464 15:57:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.hWn7WaVSyC 00:17:58.464 15:57:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:17:58.464 15:57:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:17:58.464 15:57:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.49 00:17:58.464 15:57:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid0 00:17:58.464 15:57:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:17:58.464 
15:57:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:17:58.464 15:57:03 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.49 != \0\.\0\0 ]] 00:17:58.464 00:17:58.464 real 0m7.393s 00:17:58.464 user 0m11.987s 00:17:58.464 sys 0m1.076s 00:17:58.464 15:57:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:17:58.723 15:57:03 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:17:58.723 ************************************ 00:17:58.723 END TEST raid_write_error_test 00:17:58.723 ************************************ 00:17:58.723 15:57:04 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:17:58.723 15:57:04 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test concat 4 false 00:17:58.723 15:57:04 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:17:58.723 15:57:04 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:17:58.723 15:57:04 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:17:58.723 ************************************ 00:17:58.723 START TEST raid_state_function_test 00:17:58.723 ************************************ 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 4 false 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local 
superblock_create_arg 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@231 -- # strip_size=64 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2724319 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2724319' 00:17:58.723 Process raid pid: 2724319 00:17:58.723 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:17:58.724 15:57:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2724319 /var/tmp/spdk-raid.sock 00:17:58.724 15:57:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 2724319 ']' 00:17:58.724 15:57:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:17:58.724 15:57:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:17:58.724 15:57:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:17:58.724 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:17:58.724 15:57:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:17:58.724 15:57:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:17:58.724 [2024-06-10 15:57:04.110875] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:17:58.724 [2024-06-10 15:57:04.110930] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:17:58.724 [2024-06-10 15:57:04.209549] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:17:58.983 [2024-06-10 15:57:04.304249] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:17:58.983 [2024-06-10 15:57:04.359061] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:58.983 [2024-06-10 15:57:04.359087] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:17:59.919 15:57:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:17:59.919 15:57:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:17:59.919 15:57:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:17:59.919 [2024-06-10 15:57:05.301818] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:17:59.919 [2024-06-10 15:57:05.301857] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:17:59.919 [2024-06-10 15:57:05.301866] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:17:59.919 [2024-06-10 15:57:05.301875] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base 
bdev BaseBdev2 doesn't exist now 00:17:59.919 [2024-06-10 15:57:05.301881] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:17:59.919 [2024-06-10 15:57:05.301889] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:17:59.919 [2024-06-10 15:57:05.301896] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:17:59.919 [2024-06-10 15:57:05.301904] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:17:59.919 15:57:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:17:59.919 15:57:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:17:59.919 15:57:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:17:59.919 15:57:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:17:59.919 15:57:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:17:59.919 15:57:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:17:59.919 15:57:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:17:59.919 15:57:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:17:59.919 15:57:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:17:59.919 15:57:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:17:59.919 15:57:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:17:59.919 15:57:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 
-- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:00.177 15:57:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:00.177 "name": "Existed_Raid", 00:18:00.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:00.177 "strip_size_kb": 64, 00:18:00.177 "state": "configuring", 00:18:00.177 "raid_level": "concat", 00:18:00.177 "superblock": false, 00:18:00.177 "num_base_bdevs": 4, 00:18:00.177 "num_base_bdevs_discovered": 0, 00:18:00.177 "num_base_bdevs_operational": 4, 00:18:00.177 "base_bdevs_list": [ 00:18:00.177 { 00:18:00.177 "name": "BaseBdev1", 00:18:00.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:00.177 "is_configured": false, 00:18:00.177 "data_offset": 0, 00:18:00.177 "data_size": 0 00:18:00.177 }, 00:18:00.177 { 00:18:00.177 "name": "BaseBdev2", 00:18:00.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:00.177 "is_configured": false, 00:18:00.177 "data_offset": 0, 00:18:00.177 "data_size": 0 00:18:00.177 }, 00:18:00.177 { 00:18:00.177 "name": "BaseBdev3", 00:18:00.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:00.177 "is_configured": false, 00:18:00.177 "data_offset": 0, 00:18:00.177 "data_size": 0 00:18:00.177 }, 00:18:00.177 { 00:18:00.177 "name": "BaseBdev4", 00:18:00.177 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:00.177 "is_configured": false, 00:18:00.177 "data_offset": 0, 00:18:00.177 "data_size": 0 00:18:00.177 } 00:18:00.177 ] 00:18:00.177 }' 00:18:00.177 15:57:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:00.177 15:57:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:00.745 15:57:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:01.003 [2024-06-10 15:57:06.360494] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 
00:18:01.003 [2024-06-10 15:57:06.360522] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc52140 name Existed_Raid, state configuring 00:18:01.003 15:57:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:01.262 [2024-06-10 15:57:06.613183] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:01.262 [2024-06-10 15:57:06.613207] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:01.262 [2024-06-10 15:57:06.613214] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:01.262 [2024-06-10 15:57:06.613222] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:01.262 [2024-06-10 15:57:06.613229] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:01.262 [2024-06-10 15:57:06.613237] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:01.262 [2024-06-10 15:57:06.613244] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:01.262 [2024-06-10 15:57:06.613252] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:01.262 15:57:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:01.520 [2024-06-10 15:57:06.799347] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:01.520 BaseBdev1 00:18:01.520 15:57:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:01.520 15:57:06 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:18:01.520 15:57:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:01.520 15:57:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:01.520 15:57:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:01.520 15:57:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:01.520 15:57:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:01.520 15:57:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:01.779 [ 00:18:01.779 { 00:18:01.779 "name": "BaseBdev1", 00:18:01.779 "aliases": [ 00:18:01.779 "c3193f3b-490e-43b9-ab5d-def605e7e19f" 00:18:01.779 ], 00:18:01.779 "product_name": "Malloc disk", 00:18:01.779 "block_size": 512, 00:18:01.779 "num_blocks": 65536, 00:18:01.779 "uuid": "c3193f3b-490e-43b9-ab5d-def605e7e19f", 00:18:01.779 "assigned_rate_limits": { 00:18:01.779 "rw_ios_per_sec": 0, 00:18:01.779 "rw_mbytes_per_sec": 0, 00:18:01.779 "r_mbytes_per_sec": 0, 00:18:01.779 "w_mbytes_per_sec": 0 00:18:01.779 }, 00:18:01.779 "claimed": true, 00:18:01.779 "claim_type": "exclusive_write", 00:18:01.779 "zoned": false, 00:18:01.779 "supported_io_types": { 00:18:01.779 "read": true, 00:18:01.779 "write": true, 00:18:01.779 "unmap": true, 00:18:01.780 "write_zeroes": true, 00:18:01.780 "flush": true, 00:18:01.780 "reset": true, 00:18:01.780 "compare": false, 00:18:01.780 "compare_and_write": false, 00:18:01.780 "abort": true, 00:18:01.780 "nvme_admin": false, 00:18:01.780 "nvme_io": false 00:18:01.780 }, 00:18:01.780 
"memory_domains": [ 00:18:01.780 { 00:18:01.780 "dma_device_id": "system", 00:18:01.780 "dma_device_type": 1 00:18:01.780 }, 00:18:01.780 { 00:18:01.780 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:01.780 "dma_device_type": 2 00:18:01.780 } 00:18:01.780 ], 00:18:01.780 "driver_specific": {} 00:18:01.780 } 00:18:01.780 ] 00:18:01.780 15:57:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:01.780 15:57:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:01.780 15:57:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:01.780 15:57:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:01.780 15:57:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:01.780 15:57:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:01.780 15:57:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:01.780 15:57:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:01.780 15:57:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:01.780 15:57:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:01.780 15:57:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:01.780 15:57:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:01.780 15:57:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:02.039 15:57:07 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:02.039 "name": "Existed_Raid", 00:18:02.039 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:02.039 "strip_size_kb": 64, 00:18:02.039 "state": "configuring", 00:18:02.039 "raid_level": "concat", 00:18:02.039 "superblock": false, 00:18:02.039 "num_base_bdevs": 4, 00:18:02.039 "num_base_bdevs_discovered": 1, 00:18:02.039 "num_base_bdevs_operational": 4, 00:18:02.039 "base_bdevs_list": [ 00:18:02.039 { 00:18:02.039 "name": "BaseBdev1", 00:18:02.039 "uuid": "c3193f3b-490e-43b9-ab5d-def605e7e19f", 00:18:02.039 "is_configured": true, 00:18:02.039 "data_offset": 0, 00:18:02.039 "data_size": 65536 00:18:02.039 }, 00:18:02.039 { 00:18:02.039 "name": "BaseBdev2", 00:18:02.039 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:02.039 "is_configured": false, 00:18:02.039 "data_offset": 0, 00:18:02.039 "data_size": 0 00:18:02.039 }, 00:18:02.039 { 00:18:02.039 "name": "BaseBdev3", 00:18:02.039 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:02.039 "is_configured": false, 00:18:02.039 "data_offset": 0, 00:18:02.039 "data_size": 0 00:18:02.039 }, 00:18:02.039 { 00:18:02.039 "name": "BaseBdev4", 00:18:02.039 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:02.039 "is_configured": false, 00:18:02.039 "data_offset": 0, 00:18:02.039 "data_size": 0 00:18:02.039 } 00:18:02.039 ] 00:18:02.039 }' 00:18:02.039 15:57:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:02.039 15:57:07 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:02.606 15:57:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:02.865 [2024-06-10 15:57:08.199083] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:02.865 [2024-06-10 15:57:08.199116] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 
0xc519b0 name Existed_Raid, state configuring 00:18:02.865 15:57:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:03.124 [2024-06-10 15:57:08.379599] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:03.124 [2024-06-10 15:57:08.381096] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:03.124 [2024-06-10 15:57:08.381124] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:03.124 [2024-06-10 15:57:08.381133] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:03.124 [2024-06-10 15:57:08.381142] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:03.124 [2024-06-10 15:57:08.381149] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:03.124 [2024-06-10 15:57:08.381157] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:03.124 15:57:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:03.124 15:57:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:03.124 15:57:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:03.124 15:57:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:03.124 15:57:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:03.124 15:57:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:03.124 15:57:08 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:03.124 15:57:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:03.124 15:57:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:03.124 15:57:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:03.124 15:57:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:03.124 15:57:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:03.124 15:57:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:03.124 15:57:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:03.383 15:57:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:03.383 "name": "Existed_Raid", 00:18:03.383 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:03.383 "strip_size_kb": 64, 00:18:03.383 "state": "configuring", 00:18:03.383 "raid_level": "concat", 00:18:03.383 "superblock": false, 00:18:03.383 "num_base_bdevs": 4, 00:18:03.383 "num_base_bdevs_discovered": 1, 00:18:03.383 "num_base_bdevs_operational": 4, 00:18:03.383 "base_bdevs_list": [ 00:18:03.384 { 00:18:03.384 "name": "BaseBdev1", 00:18:03.384 "uuid": "c3193f3b-490e-43b9-ab5d-def605e7e19f", 00:18:03.384 "is_configured": true, 00:18:03.384 "data_offset": 0, 00:18:03.384 "data_size": 65536 00:18:03.384 }, 00:18:03.384 { 00:18:03.384 "name": "BaseBdev2", 00:18:03.384 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:03.384 "is_configured": false, 00:18:03.384 "data_offset": 0, 00:18:03.384 "data_size": 0 00:18:03.384 }, 00:18:03.384 { 00:18:03.384 "name": "BaseBdev3", 00:18:03.384 "uuid": "00000000-0000-0000-0000-000000000000", 
00:18:03.384 "is_configured": false, 00:18:03.384 "data_offset": 0, 00:18:03.384 "data_size": 0 00:18:03.384 }, 00:18:03.384 { 00:18:03.384 "name": "BaseBdev4", 00:18:03.384 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:03.384 "is_configured": false, 00:18:03.384 "data_offset": 0, 00:18:03.384 "data_size": 0 00:18:03.384 } 00:18:03.384 ] 00:18:03.384 }' 00:18:03.384 15:57:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:03.384 15:57:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:03.954 15:57:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:04.214 [2024-06-10 15:57:09.533943] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:04.214 BaseBdev2 00:18:04.214 15:57:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:04.214 15:57:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:18:04.214 15:57:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:04.214 15:57:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:04.214 15:57:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:04.214 15:57:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:04.214 15:57:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:04.473 15:57:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:04.732 [ 00:18:04.732 { 00:18:04.732 "name": "BaseBdev2", 00:18:04.732 "aliases": [ 00:18:04.732 "c2a249df-14d0-4570-8bbb-723185963338" 00:18:04.732 ], 00:18:04.732 "product_name": "Malloc disk", 00:18:04.732 "block_size": 512, 00:18:04.732 "num_blocks": 65536, 00:18:04.732 "uuid": "c2a249df-14d0-4570-8bbb-723185963338", 00:18:04.732 "assigned_rate_limits": { 00:18:04.732 "rw_ios_per_sec": 0, 00:18:04.732 "rw_mbytes_per_sec": 0, 00:18:04.732 "r_mbytes_per_sec": 0, 00:18:04.732 "w_mbytes_per_sec": 0 00:18:04.732 }, 00:18:04.732 "claimed": true, 00:18:04.732 "claim_type": "exclusive_write", 00:18:04.732 "zoned": false, 00:18:04.732 "supported_io_types": { 00:18:04.732 "read": true, 00:18:04.732 "write": true, 00:18:04.732 "unmap": true, 00:18:04.732 "write_zeroes": true, 00:18:04.732 "flush": true, 00:18:04.732 "reset": true, 00:18:04.732 "compare": false, 00:18:04.732 "compare_and_write": false, 00:18:04.732 "abort": true, 00:18:04.732 "nvme_admin": false, 00:18:04.732 "nvme_io": false 00:18:04.732 }, 00:18:04.732 "memory_domains": [ 00:18:04.732 { 00:18:04.732 "dma_device_id": "system", 00:18:04.732 "dma_device_type": 1 00:18:04.732 }, 00:18:04.732 { 00:18:04.732 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:04.732 "dma_device_type": 2 00:18:04.732 } 00:18:04.732 ], 00:18:04.732 "driver_specific": {} 00:18:04.732 } 00:18:04.732 ] 00:18:04.732 15:57:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:04.732 15:57:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:04.732 15:57:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:04.732 15:57:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:04.732 15:57:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:04.732 15:57:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:04.732 15:57:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:04.732 15:57:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:04.733 15:57:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:04.733 15:57:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:04.733 15:57:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:04.733 15:57:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:04.733 15:57:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:04.733 15:57:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:04.733 15:57:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:04.991 15:57:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:04.991 "name": "Existed_Raid", 00:18:04.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:04.991 "strip_size_kb": 64, 00:18:04.991 "state": "configuring", 00:18:04.991 "raid_level": "concat", 00:18:04.991 "superblock": false, 00:18:04.991 "num_base_bdevs": 4, 00:18:04.991 "num_base_bdevs_discovered": 2, 00:18:04.991 "num_base_bdevs_operational": 4, 00:18:04.991 "base_bdevs_list": [ 00:18:04.991 { 00:18:04.991 "name": "BaseBdev1", 00:18:04.991 "uuid": "c3193f3b-490e-43b9-ab5d-def605e7e19f", 00:18:04.991 "is_configured": true, 00:18:04.991 "data_offset": 0, 00:18:04.991 "data_size": 65536 00:18:04.991 }, 00:18:04.991 { 00:18:04.991 "name": "BaseBdev2", 00:18:04.991 
"uuid": "c2a249df-14d0-4570-8bbb-723185963338", 00:18:04.991 "is_configured": true, 00:18:04.991 "data_offset": 0, 00:18:04.991 "data_size": 65536 00:18:04.991 }, 00:18:04.991 { 00:18:04.991 "name": "BaseBdev3", 00:18:04.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:04.991 "is_configured": false, 00:18:04.991 "data_offset": 0, 00:18:04.991 "data_size": 0 00:18:04.991 }, 00:18:04.991 { 00:18:04.991 "name": "BaseBdev4", 00:18:04.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:04.991 "is_configured": false, 00:18:04.991 "data_offset": 0, 00:18:04.991 "data_size": 0 00:18:04.991 } 00:18:04.991 ] 00:18:04.991 }' 00:18:04.991 15:57:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:04.991 15:57:10 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:05.559 15:57:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:05.818 [2024-06-10 15:57:11.185671] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:05.818 BaseBdev3 00:18:05.818 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:05.818 15:57:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:18:05.818 15:57:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:05.818 15:57:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:05.818 15:57:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:05.818 15:57:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:05.818 15:57:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:06.076 15:57:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:06.335 [ 00:18:06.335 { 00:18:06.335 "name": "BaseBdev3", 00:18:06.335 "aliases": [ 00:18:06.335 "f82ef620-31cc-403f-bafe-fdeba0f103ef" 00:18:06.335 ], 00:18:06.335 "product_name": "Malloc disk", 00:18:06.335 "block_size": 512, 00:18:06.335 "num_blocks": 65536, 00:18:06.335 "uuid": "f82ef620-31cc-403f-bafe-fdeba0f103ef", 00:18:06.335 "assigned_rate_limits": { 00:18:06.335 "rw_ios_per_sec": 0, 00:18:06.335 "rw_mbytes_per_sec": 0, 00:18:06.335 "r_mbytes_per_sec": 0, 00:18:06.335 "w_mbytes_per_sec": 0 00:18:06.335 }, 00:18:06.335 "claimed": true, 00:18:06.335 "claim_type": "exclusive_write", 00:18:06.335 "zoned": false, 00:18:06.335 "supported_io_types": { 00:18:06.335 "read": true, 00:18:06.335 "write": true, 00:18:06.335 "unmap": true, 00:18:06.335 "write_zeroes": true, 00:18:06.335 "flush": true, 00:18:06.335 "reset": true, 00:18:06.335 "compare": false, 00:18:06.335 "compare_and_write": false, 00:18:06.335 "abort": true, 00:18:06.335 "nvme_admin": false, 00:18:06.335 "nvme_io": false 00:18:06.335 }, 00:18:06.335 "memory_domains": [ 00:18:06.335 { 00:18:06.335 "dma_device_id": "system", 00:18:06.335 "dma_device_type": 1 00:18:06.335 }, 00:18:06.335 { 00:18:06.335 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:06.335 "dma_device_type": 2 00:18:06.335 } 00:18:06.335 ], 00:18:06.335 "driver_specific": {} 00:18:06.335 } 00:18:06.335 ] 00:18:06.335 15:57:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:06.335 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:06.335 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < 
num_base_bdevs )) 00:18:06.335 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:06.335 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:06.335 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:06.335 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:06.335 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:06.335 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:06.335 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:06.335 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:06.335 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:06.335 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:06.335 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:06.335 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:06.594 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:06.594 "name": "Existed_Raid", 00:18:06.594 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:06.594 "strip_size_kb": 64, 00:18:06.594 "state": "configuring", 00:18:06.594 "raid_level": "concat", 00:18:06.594 "superblock": false, 00:18:06.594 "num_base_bdevs": 4, 00:18:06.594 "num_base_bdevs_discovered": 3, 00:18:06.594 "num_base_bdevs_operational": 4, 00:18:06.594 
"base_bdevs_list": [ 00:18:06.594 { 00:18:06.594 "name": "BaseBdev1", 00:18:06.594 "uuid": "c3193f3b-490e-43b9-ab5d-def605e7e19f", 00:18:06.594 "is_configured": true, 00:18:06.594 "data_offset": 0, 00:18:06.594 "data_size": 65536 00:18:06.594 }, 00:18:06.594 { 00:18:06.594 "name": "BaseBdev2", 00:18:06.594 "uuid": "c2a249df-14d0-4570-8bbb-723185963338", 00:18:06.594 "is_configured": true, 00:18:06.594 "data_offset": 0, 00:18:06.594 "data_size": 65536 00:18:06.594 }, 00:18:06.594 { 00:18:06.594 "name": "BaseBdev3", 00:18:06.594 "uuid": "f82ef620-31cc-403f-bafe-fdeba0f103ef", 00:18:06.594 "is_configured": true, 00:18:06.594 "data_offset": 0, 00:18:06.594 "data_size": 65536 00:18:06.594 }, 00:18:06.594 { 00:18:06.594 "name": "BaseBdev4", 00:18:06.594 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:06.594 "is_configured": false, 00:18:06.594 "data_offset": 0, 00:18:06.594 "data_size": 0 00:18:06.594 } 00:18:06.594 ] 00:18:06.594 }' 00:18:06.594 15:57:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:06.594 15:57:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:07.174 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:07.433 [2024-06-10 15:57:12.741137] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:07.433 [2024-06-10 15:57:12.741171] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc52ac0 00:18:07.433 [2024-06-10 15:57:12.741178] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, blocklen 512 00:18:07.433 [2024-06-10 15:57:12.741374] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdf7ba0 00:18:07.433 [2024-06-10 15:57:12.741501] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc52ac0 00:18:07.433 
[2024-06-10 15:57:12.741509] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xc52ac0 00:18:07.433 [2024-06-10 15:57:12.741670] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:07.433 BaseBdev4 00:18:07.433 15:57:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:07.433 15:57:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:18:07.433 15:57:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:07.433 15:57:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:07.433 15:57:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:07.433 15:57:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:07.433 15:57:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:07.692 15:57:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:07.950 [ 00:18:07.950 { 00:18:07.950 "name": "BaseBdev4", 00:18:07.950 "aliases": [ 00:18:07.950 "c3c5e114-4dbc-45eb-bd33-d8427a8f0bcf" 00:18:07.950 ], 00:18:07.950 "product_name": "Malloc disk", 00:18:07.950 "block_size": 512, 00:18:07.950 "num_blocks": 65536, 00:18:07.950 "uuid": "c3c5e114-4dbc-45eb-bd33-d8427a8f0bcf", 00:18:07.950 "assigned_rate_limits": { 00:18:07.950 "rw_ios_per_sec": 0, 00:18:07.950 "rw_mbytes_per_sec": 0, 00:18:07.950 "r_mbytes_per_sec": 0, 00:18:07.950 "w_mbytes_per_sec": 0 00:18:07.950 }, 00:18:07.950 "claimed": true, 00:18:07.950 "claim_type": "exclusive_write", 00:18:07.950 "zoned": 
false, 00:18:07.950 "supported_io_types": { 00:18:07.950 "read": true, 00:18:07.950 "write": true, 00:18:07.950 "unmap": true, 00:18:07.950 "write_zeroes": true, 00:18:07.950 "flush": true, 00:18:07.950 "reset": true, 00:18:07.950 "compare": false, 00:18:07.950 "compare_and_write": false, 00:18:07.950 "abort": true, 00:18:07.950 "nvme_admin": false, 00:18:07.950 "nvme_io": false 00:18:07.950 }, 00:18:07.950 "memory_domains": [ 00:18:07.950 { 00:18:07.950 "dma_device_id": "system", 00:18:07.950 "dma_device_type": 1 00:18:07.950 }, 00:18:07.950 { 00:18:07.950 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:07.950 "dma_device_type": 2 00:18:07.950 } 00:18:07.950 ], 00:18:07.950 "driver_specific": {} 00:18:07.950 } 00:18:07.950 ] 00:18:07.951 15:57:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:07.951 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:07.951 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:07.951 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:18:07.951 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:07.951 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:07.951 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:07.951 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:07.951 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:07.951 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:07.951 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:07.951 
15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:07.951 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:07.951 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:07.951 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:08.209 15:57:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:08.209 "name": "Existed_Raid", 00:18:08.209 "uuid": "968e4fc4-2af8-49fc-8d85-dfc020bdeb3d", 00:18:08.209 "strip_size_kb": 64, 00:18:08.209 "state": "online", 00:18:08.209 "raid_level": "concat", 00:18:08.209 "superblock": false, 00:18:08.209 "num_base_bdevs": 4, 00:18:08.209 "num_base_bdevs_discovered": 4, 00:18:08.209 "num_base_bdevs_operational": 4, 00:18:08.209 "base_bdevs_list": [ 00:18:08.209 { 00:18:08.209 "name": "BaseBdev1", 00:18:08.209 "uuid": "c3193f3b-490e-43b9-ab5d-def605e7e19f", 00:18:08.209 "is_configured": true, 00:18:08.209 "data_offset": 0, 00:18:08.209 "data_size": 65536 00:18:08.209 }, 00:18:08.209 { 00:18:08.209 "name": "BaseBdev2", 00:18:08.209 "uuid": "c2a249df-14d0-4570-8bbb-723185963338", 00:18:08.209 "is_configured": true, 00:18:08.209 "data_offset": 0, 00:18:08.209 "data_size": 65536 00:18:08.209 }, 00:18:08.209 { 00:18:08.209 "name": "BaseBdev3", 00:18:08.209 "uuid": "f82ef620-31cc-403f-bafe-fdeba0f103ef", 00:18:08.209 "is_configured": true, 00:18:08.209 "data_offset": 0, 00:18:08.209 "data_size": 65536 00:18:08.209 }, 00:18:08.209 { 00:18:08.209 "name": "BaseBdev4", 00:18:08.209 "uuid": "c3c5e114-4dbc-45eb-bd33-d8427a8f0bcf", 00:18:08.209 "is_configured": true, 00:18:08.209 "data_offset": 0, 00:18:08.209 "data_size": 65536 00:18:08.209 } 00:18:08.209 ] 00:18:08.209 }' 00:18:08.209 15:57:13 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:08.209 15:57:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:08.777 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:08.777 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:08.777 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:08.777 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:08.777 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:08.777 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:08.777 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:08.777 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:09.036 [2024-06-10 15:57:14.309634] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:09.036 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:09.036 "name": "Existed_Raid", 00:18:09.036 "aliases": [ 00:18:09.036 "968e4fc4-2af8-49fc-8d85-dfc020bdeb3d" 00:18:09.036 ], 00:18:09.036 "product_name": "Raid Volume", 00:18:09.036 "block_size": 512, 00:18:09.036 "num_blocks": 262144, 00:18:09.036 "uuid": "968e4fc4-2af8-49fc-8d85-dfc020bdeb3d", 00:18:09.036 "assigned_rate_limits": { 00:18:09.036 "rw_ios_per_sec": 0, 00:18:09.036 "rw_mbytes_per_sec": 0, 00:18:09.036 "r_mbytes_per_sec": 0, 00:18:09.036 "w_mbytes_per_sec": 0 00:18:09.036 }, 00:18:09.036 "claimed": false, 00:18:09.036 "zoned": false, 00:18:09.036 "supported_io_types": { 00:18:09.036 
"read": true, 00:18:09.036 "write": true, 00:18:09.036 "unmap": true, 00:18:09.036 "write_zeroes": true, 00:18:09.036 "flush": true, 00:18:09.036 "reset": true, 00:18:09.036 "compare": false, 00:18:09.036 "compare_and_write": false, 00:18:09.036 "abort": false, 00:18:09.036 "nvme_admin": false, 00:18:09.036 "nvme_io": false 00:18:09.036 }, 00:18:09.036 "memory_domains": [ 00:18:09.036 { 00:18:09.036 "dma_device_id": "system", 00:18:09.036 "dma_device_type": 1 00:18:09.036 }, 00:18:09.036 { 00:18:09.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:09.036 "dma_device_type": 2 00:18:09.036 }, 00:18:09.036 { 00:18:09.036 "dma_device_id": "system", 00:18:09.036 "dma_device_type": 1 00:18:09.036 }, 00:18:09.036 { 00:18:09.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:09.036 "dma_device_type": 2 00:18:09.036 }, 00:18:09.036 { 00:18:09.036 "dma_device_id": "system", 00:18:09.036 "dma_device_type": 1 00:18:09.036 }, 00:18:09.036 { 00:18:09.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:09.036 "dma_device_type": 2 00:18:09.036 }, 00:18:09.036 { 00:18:09.036 "dma_device_id": "system", 00:18:09.036 "dma_device_type": 1 00:18:09.036 }, 00:18:09.036 { 00:18:09.036 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:09.036 "dma_device_type": 2 00:18:09.036 } 00:18:09.036 ], 00:18:09.036 "driver_specific": { 00:18:09.036 "raid": { 00:18:09.036 "uuid": "968e4fc4-2af8-49fc-8d85-dfc020bdeb3d", 00:18:09.036 "strip_size_kb": 64, 00:18:09.036 "state": "online", 00:18:09.036 "raid_level": "concat", 00:18:09.036 "superblock": false, 00:18:09.036 "num_base_bdevs": 4, 00:18:09.036 "num_base_bdevs_discovered": 4, 00:18:09.036 "num_base_bdevs_operational": 4, 00:18:09.036 "base_bdevs_list": [ 00:18:09.036 { 00:18:09.036 "name": "BaseBdev1", 00:18:09.036 "uuid": "c3193f3b-490e-43b9-ab5d-def605e7e19f", 00:18:09.036 "is_configured": true, 00:18:09.036 "data_offset": 0, 00:18:09.036 "data_size": 65536 00:18:09.036 }, 00:18:09.036 { 00:18:09.036 "name": "BaseBdev2", 00:18:09.036 "uuid": 
"c2a249df-14d0-4570-8bbb-723185963338", 00:18:09.036 "is_configured": true, 00:18:09.036 "data_offset": 0, 00:18:09.036 "data_size": 65536 00:18:09.036 }, 00:18:09.036 { 00:18:09.036 "name": "BaseBdev3", 00:18:09.036 "uuid": "f82ef620-31cc-403f-bafe-fdeba0f103ef", 00:18:09.036 "is_configured": true, 00:18:09.036 "data_offset": 0, 00:18:09.036 "data_size": 65536 00:18:09.036 }, 00:18:09.036 { 00:18:09.036 "name": "BaseBdev4", 00:18:09.036 "uuid": "c3c5e114-4dbc-45eb-bd33-d8427a8f0bcf", 00:18:09.036 "is_configured": true, 00:18:09.036 "data_offset": 0, 00:18:09.036 "data_size": 65536 00:18:09.036 } 00:18:09.036 ] 00:18:09.036 } 00:18:09.036 } 00:18:09.036 }' 00:18:09.036 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:09.036 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:09.036 BaseBdev2 00:18:09.036 BaseBdev3 00:18:09.036 BaseBdev4' 00:18:09.036 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:09.036 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:09.036 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:09.334 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:09.334 "name": "BaseBdev1", 00:18:09.334 "aliases": [ 00:18:09.334 "c3193f3b-490e-43b9-ab5d-def605e7e19f" 00:18:09.334 ], 00:18:09.334 "product_name": "Malloc disk", 00:18:09.334 "block_size": 512, 00:18:09.334 "num_blocks": 65536, 00:18:09.334 "uuid": "c3193f3b-490e-43b9-ab5d-def605e7e19f", 00:18:09.334 "assigned_rate_limits": { 00:18:09.334 "rw_ios_per_sec": 0, 00:18:09.334 "rw_mbytes_per_sec": 0, 00:18:09.334 "r_mbytes_per_sec": 0, 
00:18:09.334 "w_mbytes_per_sec": 0 00:18:09.334 }, 00:18:09.334 "claimed": true, 00:18:09.334 "claim_type": "exclusive_write", 00:18:09.334 "zoned": false, 00:18:09.334 "supported_io_types": { 00:18:09.334 "read": true, 00:18:09.334 "write": true, 00:18:09.334 "unmap": true, 00:18:09.334 "write_zeroes": true, 00:18:09.334 "flush": true, 00:18:09.334 "reset": true, 00:18:09.334 "compare": false, 00:18:09.334 "compare_and_write": false, 00:18:09.334 "abort": true, 00:18:09.334 "nvme_admin": false, 00:18:09.334 "nvme_io": false 00:18:09.334 }, 00:18:09.334 "memory_domains": [ 00:18:09.334 { 00:18:09.334 "dma_device_id": "system", 00:18:09.334 "dma_device_type": 1 00:18:09.334 }, 00:18:09.334 { 00:18:09.334 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:09.334 "dma_device_type": 2 00:18:09.334 } 00:18:09.334 ], 00:18:09.334 "driver_specific": {} 00:18:09.334 }' 00:18:09.334 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:09.334 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:09.334 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:09.334 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:09.334 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:09.334 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:09.334 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:09.593 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:09.593 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:09.593 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:09.593 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:18:09.593 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:09.593 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:09.593 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:09.593 15:57:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:09.852 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:09.852 "name": "BaseBdev2", 00:18:09.852 "aliases": [ 00:18:09.852 "c2a249df-14d0-4570-8bbb-723185963338" 00:18:09.852 ], 00:18:09.852 "product_name": "Malloc disk", 00:18:09.852 "block_size": 512, 00:18:09.852 "num_blocks": 65536, 00:18:09.852 "uuid": "c2a249df-14d0-4570-8bbb-723185963338", 00:18:09.852 "assigned_rate_limits": { 00:18:09.852 "rw_ios_per_sec": 0, 00:18:09.852 "rw_mbytes_per_sec": 0, 00:18:09.852 "r_mbytes_per_sec": 0, 00:18:09.852 "w_mbytes_per_sec": 0 00:18:09.852 }, 00:18:09.852 "claimed": true, 00:18:09.852 "claim_type": "exclusive_write", 00:18:09.852 "zoned": false, 00:18:09.852 "supported_io_types": { 00:18:09.852 "read": true, 00:18:09.852 "write": true, 00:18:09.852 "unmap": true, 00:18:09.852 "write_zeroes": true, 00:18:09.852 "flush": true, 00:18:09.852 "reset": true, 00:18:09.852 "compare": false, 00:18:09.852 "compare_and_write": false, 00:18:09.852 "abort": true, 00:18:09.852 "nvme_admin": false, 00:18:09.852 "nvme_io": false 00:18:09.852 }, 00:18:09.852 "memory_domains": [ 00:18:09.852 { 00:18:09.852 "dma_device_id": "system", 00:18:09.852 "dma_device_type": 1 00:18:09.852 }, 00:18:09.852 { 00:18:09.852 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:09.852 "dma_device_type": 2 00:18:09.852 } 00:18:09.852 ], 00:18:09.852 "driver_specific": {} 00:18:09.852 }' 00:18:09.852 15:57:15 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:09.852 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:09.852 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:09.852 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:10.111 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:10.111 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:10.111 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:10.111 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:10.111 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:10.111 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:10.112 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:10.112 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:10.112 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:10.112 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:10.112 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:10.371 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:10.371 "name": "BaseBdev3", 00:18:10.371 "aliases": [ 00:18:10.371 "f82ef620-31cc-403f-bafe-fdeba0f103ef" 00:18:10.371 ], 00:18:10.371 "product_name": "Malloc disk", 00:18:10.371 "block_size": 512, 00:18:10.371 "num_blocks": 65536, 00:18:10.371 "uuid": 
"f82ef620-31cc-403f-bafe-fdeba0f103ef", 00:18:10.371 "assigned_rate_limits": { 00:18:10.371 "rw_ios_per_sec": 0, 00:18:10.371 "rw_mbytes_per_sec": 0, 00:18:10.371 "r_mbytes_per_sec": 0, 00:18:10.371 "w_mbytes_per_sec": 0 00:18:10.371 }, 00:18:10.371 "claimed": true, 00:18:10.371 "claim_type": "exclusive_write", 00:18:10.371 "zoned": false, 00:18:10.371 "supported_io_types": { 00:18:10.371 "read": true, 00:18:10.371 "write": true, 00:18:10.371 "unmap": true, 00:18:10.371 "write_zeroes": true, 00:18:10.371 "flush": true, 00:18:10.371 "reset": true, 00:18:10.371 "compare": false, 00:18:10.371 "compare_and_write": false, 00:18:10.371 "abort": true, 00:18:10.371 "nvme_admin": false, 00:18:10.371 "nvme_io": false 00:18:10.371 }, 00:18:10.371 "memory_domains": [ 00:18:10.371 { 00:18:10.371 "dma_device_id": "system", 00:18:10.371 "dma_device_type": 1 00:18:10.371 }, 00:18:10.371 { 00:18:10.371 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:10.371 "dma_device_type": 2 00:18:10.371 } 00:18:10.371 ], 00:18:10.371 "driver_specific": {} 00:18:10.371 }' 00:18:10.371 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:10.630 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:10.630 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:10.630 15:57:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:10.630 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:10.630 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:10.630 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:10.630 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:10.889 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:10.889 
15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:10.889 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:10.889 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:10.889 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:10.889 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:10.889 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:11.147 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:11.147 "name": "BaseBdev4", 00:18:11.147 "aliases": [ 00:18:11.147 "c3c5e114-4dbc-45eb-bd33-d8427a8f0bcf" 00:18:11.147 ], 00:18:11.147 "product_name": "Malloc disk", 00:18:11.147 "block_size": 512, 00:18:11.147 "num_blocks": 65536, 00:18:11.147 "uuid": "c3c5e114-4dbc-45eb-bd33-d8427a8f0bcf", 00:18:11.147 "assigned_rate_limits": { 00:18:11.147 "rw_ios_per_sec": 0, 00:18:11.147 "rw_mbytes_per_sec": 0, 00:18:11.147 "r_mbytes_per_sec": 0, 00:18:11.147 "w_mbytes_per_sec": 0 00:18:11.147 }, 00:18:11.147 "claimed": true, 00:18:11.147 "claim_type": "exclusive_write", 00:18:11.147 "zoned": false, 00:18:11.147 "supported_io_types": { 00:18:11.147 "read": true, 00:18:11.147 "write": true, 00:18:11.147 "unmap": true, 00:18:11.147 "write_zeroes": true, 00:18:11.147 "flush": true, 00:18:11.147 "reset": true, 00:18:11.147 "compare": false, 00:18:11.147 "compare_and_write": false, 00:18:11.147 "abort": true, 00:18:11.147 "nvme_admin": false, 00:18:11.147 "nvme_io": false 00:18:11.147 }, 00:18:11.147 "memory_domains": [ 00:18:11.147 { 00:18:11.147 "dma_device_id": "system", 00:18:11.147 "dma_device_type": 1 00:18:11.147 }, 00:18:11.147 { 00:18:11.147 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:18:11.147 "dma_device_type": 2 00:18:11.147 } 00:18:11.147 ], 00:18:11.147 "driver_specific": {} 00:18:11.147 }' 00:18:11.147 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:11.147 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:11.147 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:11.147 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:11.147 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:11.405 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:11.405 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:11.405 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:11.405 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:11.405 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:11.405 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:11.405 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:11.405 15:57:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:11.664 [2024-06-10 15:57:17.080832] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:11.664 [2024-06-10 15:57:17.080856] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:11.664 [2024-06-10 15:57:17.080900] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:11.664 15:57:17 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@275 -- # local expected_state 00:18:11.664 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:18:11.664 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:11.664 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@215 -- # return 1 00:18:11.664 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:11.664 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:18:11.664 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:11.664 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:11.664 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:11.664 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:11.664 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:11.664 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:11.664 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:11.664 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:11.664 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:11.664 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:11.664 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:11.923 15:57:17 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:11.923 "name": "Existed_Raid", 00:18:11.923 "uuid": "968e4fc4-2af8-49fc-8d85-dfc020bdeb3d", 00:18:11.923 "strip_size_kb": 64, 00:18:11.923 "state": "offline", 00:18:11.923 "raid_level": "concat", 00:18:11.923 "superblock": false, 00:18:11.923 "num_base_bdevs": 4, 00:18:11.923 "num_base_bdevs_discovered": 3, 00:18:11.923 "num_base_bdevs_operational": 3, 00:18:11.923 "base_bdevs_list": [ 00:18:11.923 { 00:18:11.923 "name": null, 00:18:11.923 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:11.923 "is_configured": false, 00:18:11.923 "data_offset": 0, 00:18:11.923 "data_size": 65536 00:18:11.923 }, 00:18:11.923 { 00:18:11.923 "name": "BaseBdev2", 00:18:11.923 "uuid": "c2a249df-14d0-4570-8bbb-723185963338", 00:18:11.923 "is_configured": true, 00:18:11.923 "data_offset": 0, 00:18:11.923 "data_size": 65536 00:18:11.923 }, 00:18:11.923 { 00:18:11.923 "name": "BaseBdev3", 00:18:11.923 "uuid": "f82ef620-31cc-403f-bafe-fdeba0f103ef", 00:18:11.923 "is_configured": true, 00:18:11.923 "data_offset": 0, 00:18:11.923 "data_size": 65536 00:18:11.923 }, 00:18:11.923 { 00:18:11.923 "name": "BaseBdev4", 00:18:11.923 "uuid": "c3c5e114-4dbc-45eb-bd33-d8427a8f0bcf", 00:18:11.923 "is_configured": true, 00:18:11.923 "data_offset": 0, 00:18:11.923 "data_size": 65536 00:18:11.923 } 00:18:11.923 ] 00:18:11.923 }' 00:18:11.923 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:11.923 15:57:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:12.489 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:12.489 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:12.489 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:18:12.489 15:57:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:12.747 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:12.747 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:12.747 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:13.005 [2024-06-10 15:57:18.481786] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:13.005 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:13.006 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:13.006 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.006 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:13.572 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:13.573 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:13.573 15:57:18 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:13.573 [2024-06-10 15:57:19.013618] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:13.573 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:13.573 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:13.573 15:57:19 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:13.573 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:13.831 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:13.831 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:13.831 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:14.089 [2024-06-10 15:57:19.533553] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:14.089 [2024-06-10 15:57:19.533591] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc52ac0 name Existed_Raid, state offline 00:18:14.089 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:14.089 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:14.089 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:14.089 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:14.346 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:14.346 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:14.346 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:14.346 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:14.346 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- 
# (( i < num_base_bdevs )) 00:18:14.347 15:57:19 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:14.603 BaseBdev2 00:18:14.603 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:14.603 15:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:18:14.603 15:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:14.603 15:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:14.603 15:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:14.603 15:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:14.603 15:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:14.861 15:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:15.119 [ 00:18:15.119 { 00:18:15.119 "name": "BaseBdev2", 00:18:15.119 "aliases": [ 00:18:15.119 "5c79bfc4-34d0-413f-9e7c-68af321593bf" 00:18:15.119 ], 00:18:15.119 "product_name": "Malloc disk", 00:18:15.119 "block_size": 512, 00:18:15.119 "num_blocks": 65536, 00:18:15.119 "uuid": "5c79bfc4-34d0-413f-9e7c-68af321593bf", 00:18:15.119 "assigned_rate_limits": { 00:18:15.119 "rw_ios_per_sec": 0, 00:18:15.119 "rw_mbytes_per_sec": 0, 00:18:15.119 "r_mbytes_per_sec": 0, 00:18:15.119 "w_mbytes_per_sec": 0 00:18:15.119 }, 00:18:15.119 "claimed": false, 00:18:15.119 "zoned": false, 00:18:15.120 "supported_io_types": { 00:18:15.120 
"read": true, 00:18:15.120 "write": true, 00:18:15.120 "unmap": true, 00:18:15.120 "write_zeroes": true, 00:18:15.120 "flush": true, 00:18:15.120 "reset": true, 00:18:15.120 "compare": false, 00:18:15.120 "compare_and_write": false, 00:18:15.120 "abort": true, 00:18:15.120 "nvme_admin": false, 00:18:15.120 "nvme_io": false 00:18:15.120 }, 00:18:15.120 "memory_domains": [ 00:18:15.120 { 00:18:15.120 "dma_device_id": "system", 00:18:15.120 "dma_device_type": 1 00:18:15.120 }, 00:18:15.120 { 00:18:15.120 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.120 "dma_device_type": 2 00:18:15.120 } 00:18:15.120 ], 00:18:15.120 "driver_specific": {} 00:18:15.120 } 00:18:15.120 ] 00:18:15.120 15:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:15.120 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:15.120 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:15.120 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:15.378 BaseBdev3 00:18:15.378 15:57:20 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:18:15.378 15:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:18:15.378 15:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:15.378 15:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:15.378 15:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:15.378 15:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:15.378 15:57:20 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:15.636 15:57:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:15.896 [ 00:18:15.896 { 00:18:15.896 "name": "BaseBdev3", 00:18:15.896 "aliases": [ 00:18:15.896 "4f2eaeb3-e193-408b-bab4-57aeca3efa39" 00:18:15.896 ], 00:18:15.896 "product_name": "Malloc disk", 00:18:15.896 "block_size": 512, 00:18:15.896 "num_blocks": 65536, 00:18:15.896 "uuid": "4f2eaeb3-e193-408b-bab4-57aeca3efa39", 00:18:15.896 "assigned_rate_limits": { 00:18:15.896 "rw_ios_per_sec": 0, 00:18:15.896 "rw_mbytes_per_sec": 0, 00:18:15.896 "r_mbytes_per_sec": 0, 00:18:15.896 "w_mbytes_per_sec": 0 00:18:15.896 }, 00:18:15.896 "claimed": false, 00:18:15.896 "zoned": false, 00:18:15.896 "supported_io_types": { 00:18:15.896 "read": true, 00:18:15.896 "write": true, 00:18:15.896 "unmap": true, 00:18:15.896 "write_zeroes": true, 00:18:15.896 "flush": true, 00:18:15.896 "reset": true, 00:18:15.896 "compare": false, 00:18:15.896 "compare_and_write": false, 00:18:15.896 "abort": true, 00:18:15.896 "nvme_admin": false, 00:18:15.896 "nvme_io": false 00:18:15.896 }, 00:18:15.896 "memory_domains": [ 00:18:15.896 { 00:18:15.896 "dma_device_id": "system", 00:18:15.896 "dma_device_type": 1 00:18:15.896 }, 00:18:15.896 { 00:18:15.896 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:15.896 "dma_device_type": 2 00:18:15.896 } 00:18:15.896 ], 00:18:15.896 "driver_specific": {} 00:18:15.896 } 00:18:15.896 ] 00:18:15.896 15:57:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:15.896 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:15.896 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:15.896 15:57:21 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:16.155 BaseBdev4 00:18:16.155 15:57:21 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:16.155 15:57:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:18:16.155 15:57:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:16.155 15:57:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:16.155 15:57:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:16.155 15:57:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:16.155 15:57:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:16.414 15:57:21 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:16.672 [ 00:18:16.672 { 00:18:16.672 "name": "BaseBdev4", 00:18:16.672 "aliases": [ 00:18:16.672 "5c27ee7d-82ce-4de8-b593-29505223651a" 00:18:16.672 ], 00:18:16.672 "product_name": "Malloc disk", 00:18:16.672 "block_size": 512, 00:18:16.672 "num_blocks": 65536, 00:18:16.672 "uuid": "5c27ee7d-82ce-4de8-b593-29505223651a", 00:18:16.672 "assigned_rate_limits": { 00:18:16.672 "rw_ios_per_sec": 0, 00:18:16.672 "rw_mbytes_per_sec": 0, 00:18:16.672 "r_mbytes_per_sec": 0, 00:18:16.672 "w_mbytes_per_sec": 0 00:18:16.672 }, 00:18:16.672 "claimed": false, 00:18:16.672 "zoned": false, 00:18:16.672 "supported_io_types": { 00:18:16.672 "read": true, 00:18:16.672 "write": true, 
00:18:16.672 "unmap": true, 00:18:16.672 "write_zeroes": true, 00:18:16.672 "flush": true, 00:18:16.672 "reset": true, 00:18:16.672 "compare": false, 00:18:16.672 "compare_and_write": false, 00:18:16.672 "abort": true, 00:18:16.672 "nvme_admin": false, 00:18:16.672 "nvme_io": false 00:18:16.672 }, 00:18:16.672 "memory_domains": [ 00:18:16.672 { 00:18:16.672 "dma_device_id": "system", 00:18:16.672 "dma_device_type": 1 00:18:16.672 }, 00:18:16.672 { 00:18:16.672 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:16.672 "dma_device_type": 2 00:18:16.672 } 00:18:16.672 ], 00:18:16.672 "driver_specific": {} 00:18:16.672 } 00:18:16.672 ] 00:18:16.672 15:57:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:16.672 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:16.672 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:16.672 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:16.931 [2024-06-10 15:57:22.336673] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:16.931 [2024-06-10 15:57:22.336714] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:16.931 [2024-06-10 15:57:22.336731] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:16.931 [2024-06-10 15:57:22.338117] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:16.931 [2024-06-10 15:57:22.338158] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:16.931 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 
64 4 00:18:16.931 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:16.931 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:16.931 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:16.931 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:16.931 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:16.931 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:16.931 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:16.931 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:16.931 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:16.931 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:16.931 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:17.189 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:17.189 "name": "Existed_Raid", 00:18:17.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:17.189 "strip_size_kb": 64, 00:18:17.189 "state": "configuring", 00:18:17.189 "raid_level": "concat", 00:18:17.189 "superblock": false, 00:18:17.189 "num_base_bdevs": 4, 00:18:17.189 "num_base_bdevs_discovered": 3, 00:18:17.189 "num_base_bdevs_operational": 4, 00:18:17.189 "base_bdevs_list": [ 00:18:17.189 { 00:18:17.189 "name": "BaseBdev1", 00:18:17.189 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:17.189 "is_configured": 
false, 00:18:17.189 "data_offset": 0, 00:18:17.189 "data_size": 0 00:18:17.189 }, 00:18:17.189 { 00:18:17.189 "name": "BaseBdev2", 00:18:17.189 "uuid": "5c79bfc4-34d0-413f-9e7c-68af321593bf", 00:18:17.189 "is_configured": true, 00:18:17.189 "data_offset": 0, 00:18:17.189 "data_size": 65536 00:18:17.189 }, 00:18:17.189 { 00:18:17.189 "name": "BaseBdev3", 00:18:17.189 "uuid": "4f2eaeb3-e193-408b-bab4-57aeca3efa39", 00:18:17.189 "is_configured": true, 00:18:17.189 "data_offset": 0, 00:18:17.189 "data_size": 65536 00:18:17.189 }, 00:18:17.189 { 00:18:17.189 "name": "BaseBdev4", 00:18:17.189 "uuid": "5c27ee7d-82ce-4de8-b593-29505223651a", 00:18:17.189 "is_configured": true, 00:18:17.189 "data_offset": 0, 00:18:17.189 "data_size": 65536 00:18:17.189 } 00:18:17.189 ] 00:18:17.189 }' 00:18:17.189 15:57:22 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:17.189 15:57:22 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:17.755 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:18:18.014 [2024-06-10 15:57:23.355387] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:18.014 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:18.014 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:18.014 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:18.014 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:18.014 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:18.014 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 
-- # local num_base_bdevs_operational=4 00:18:18.014 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:18.014 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:18.014 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:18.014 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:18.014 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:18.014 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:18.272 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:18.272 "name": "Existed_Raid", 00:18:18.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:18.272 "strip_size_kb": 64, 00:18:18.272 "state": "configuring", 00:18:18.272 "raid_level": "concat", 00:18:18.272 "superblock": false, 00:18:18.272 "num_base_bdevs": 4, 00:18:18.272 "num_base_bdevs_discovered": 2, 00:18:18.272 "num_base_bdevs_operational": 4, 00:18:18.272 "base_bdevs_list": [ 00:18:18.272 { 00:18:18.272 "name": "BaseBdev1", 00:18:18.272 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:18.272 "is_configured": false, 00:18:18.272 "data_offset": 0, 00:18:18.272 "data_size": 0 00:18:18.272 }, 00:18:18.272 { 00:18:18.272 "name": null, 00:18:18.272 "uuid": "5c79bfc4-34d0-413f-9e7c-68af321593bf", 00:18:18.272 "is_configured": false, 00:18:18.272 "data_offset": 0, 00:18:18.272 "data_size": 65536 00:18:18.272 }, 00:18:18.272 { 00:18:18.272 "name": "BaseBdev3", 00:18:18.272 "uuid": "4f2eaeb3-e193-408b-bab4-57aeca3efa39", 00:18:18.272 "is_configured": true, 00:18:18.272 "data_offset": 0, 00:18:18.272 "data_size": 65536 00:18:18.272 }, 00:18:18.272 { 
00:18:18.272 "name": "BaseBdev4", 00:18:18.272 "uuid": "5c27ee7d-82ce-4de8-b593-29505223651a", 00:18:18.272 "is_configured": true, 00:18:18.272 "data_offset": 0, 00:18:18.272 "data_size": 65536 00:18:18.272 } 00:18:18.272 ] 00:18:18.272 }' 00:18:18.272 15:57:23 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:18.272 15:57:23 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:18.840 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:18.840 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:19.098 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:19.098 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:19.356 [2024-06-10 15:57:24.686264] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:19.356 BaseBdev1 00:18:19.356 15:57:24 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:19.356 15:57:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:18:19.356 15:57:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:19.356 15:57:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:19.356 15:57:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:19.356 15:57:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:19.356 15:57:24 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:19.615 15:57:24 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:19.874 [ 00:18:19.874 { 00:18:19.874 "name": "BaseBdev1", 00:18:19.874 "aliases": [ 00:18:19.874 "8f22e44d-8f18-4b37-a149-e1c7cbd605d2" 00:18:19.874 ], 00:18:19.874 "product_name": "Malloc disk", 00:18:19.874 "block_size": 512, 00:18:19.874 "num_blocks": 65536, 00:18:19.874 "uuid": "8f22e44d-8f18-4b37-a149-e1c7cbd605d2", 00:18:19.874 "assigned_rate_limits": { 00:18:19.874 "rw_ios_per_sec": 0, 00:18:19.874 "rw_mbytes_per_sec": 0, 00:18:19.874 "r_mbytes_per_sec": 0, 00:18:19.874 "w_mbytes_per_sec": 0 00:18:19.874 }, 00:18:19.874 "claimed": true, 00:18:19.874 "claim_type": "exclusive_write", 00:18:19.874 "zoned": false, 00:18:19.874 "supported_io_types": { 00:18:19.874 "read": true, 00:18:19.874 "write": true, 00:18:19.874 "unmap": true, 00:18:19.874 "write_zeroes": true, 00:18:19.874 "flush": true, 00:18:19.874 "reset": true, 00:18:19.874 "compare": false, 00:18:19.874 "compare_and_write": false, 00:18:19.874 "abort": true, 00:18:19.874 "nvme_admin": false, 00:18:19.874 "nvme_io": false 00:18:19.874 }, 00:18:19.874 "memory_domains": [ 00:18:19.874 { 00:18:19.874 "dma_device_id": "system", 00:18:19.874 "dma_device_type": 1 00:18:19.874 }, 00:18:19.874 { 00:18:19.874 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:19.874 "dma_device_type": 2 00:18:19.874 } 00:18:19.874 ], 00:18:19.874 "driver_specific": {} 00:18:19.874 } 00:18:19.874 ] 00:18:19.874 15:57:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:19.874 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:19.874 
15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:19.874 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:19.874 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:19.874 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:19.874 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:19.874 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:19.874 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:19.874 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:19.874 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:19.874 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:19.874 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:20.133 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:20.133 "name": "Existed_Raid", 00:18:20.133 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:20.133 "strip_size_kb": 64, 00:18:20.133 "state": "configuring", 00:18:20.133 "raid_level": "concat", 00:18:20.133 "superblock": false, 00:18:20.133 "num_base_bdevs": 4, 00:18:20.133 "num_base_bdevs_discovered": 3, 00:18:20.133 "num_base_bdevs_operational": 4, 00:18:20.133 "base_bdevs_list": [ 00:18:20.133 { 00:18:20.133 "name": "BaseBdev1", 00:18:20.133 "uuid": "8f22e44d-8f18-4b37-a149-e1c7cbd605d2", 00:18:20.133 "is_configured": true, 00:18:20.133 
"data_offset": 0, 00:18:20.133 "data_size": 65536 00:18:20.133 }, 00:18:20.133 { 00:18:20.133 "name": null, 00:18:20.133 "uuid": "5c79bfc4-34d0-413f-9e7c-68af321593bf", 00:18:20.133 "is_configured": false, 00:18:20.133 "data_offset": 0, 00:18:20.133 "data_size": 65536 00:18:20.133 }, 00:18:20.133 { 00:18:20.133 "name": "BaseBdev3", 00:18:20.133 "uuid": "4f2eaeb3-e193-408b-bab4-57aeca3efa39", 00:18:20.133 "is_configured": true, 00:18:20.133 "data_offset": 0, 00:18:20.133 "data_size": 65536 00:18:20.133 }, 00:18:20.133 { 00:18:20.133 "name": "BaseBdev4", 00:18:20.133 "uuid": "5c27ee7d-82ce-4de8-b593-29505223651a", 00:18:20.133 "is_configured": true, 00:18:20.133 "data_offset": 0, 00:18:20.133 "data_size": 65536 00:18:20.133 } 00:18:20.133 ] 00:18:20.133 }' 00:18:20.133 15:57:25 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:20.133 15:57:25 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:20.700 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:20.700 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:20.959 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:20.959 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:21.218 [2024-06-10 15:57:26.583412] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:21.218 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:21.218 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 
00:18:21.218 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:21.218 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:21.218 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:21.218 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:21.218 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:21.218 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:21.218 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:21.218 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:21.218 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:21.218 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:21.477 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:21.477 "name": "Existed_Raid", 00:18:21.477 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:21.477 "strip_size_kb": 64, 00:18:21.477 "state": "configuring", 00:18:21.477 "raid_level": "concat", 00:18:21.477 "superblock": false, 00:18:21.477 "num_base_bdevs": 4, 00:18:21.477 "num_base_bdevs_discovered": 2, 00:18:21.477 "num_base_bdevs_operational": 4, 00:18:21.477 "base_bdevs_list": [ 00:18:21.477 { 00:18:21.477 "name": "BaseBdev1", 00:18:21.477 "uuid": "8f22e44d-8f18-4b37-a149-e1c7cbd605d2", 00:18:21.477 "is_configured": true, 00:18:21.477 "data_offset": 0, 00:18:21.477 "data_size": 65536 00:18:21.477 }, 00:18:21.477 { 00:18:21.477 "name": null, 
00:18:21.477 "uuid": "5c79bfc4-34d0-413f-9e7c-68af321593bf", 00:18:21.477 "is_configured": false, 00:18:21.477 "data_offset": 0, 00:18:21.477 "data_size": 65536 00:18:21.477 }, 00:18:21.477 { 00:18:21.477 "name": null, 00:18:21.477 "uuid": "4f2eaeb3-e193-408b-bab4-57aeca3efa39", 00:18:21.477 "is_configured": false, 00:18:21.477 "data_offset": 0, 00:18:21.477 "data_size": 65536 00:18:21.477 }, 00:18:21.477 { 00:18:21.477 "name": "BaseBdev4", 00:18:21.477 "uuid": "5c27ee7d-82ce-4de8-b593-29505223651a", 00:18:21.477 "is_configured": true, 00:18:21.477 "data_offset": 0, 00:18:21.477 "data_size": 65536 00:18:21.477 } 00:18:21.477 ] 00:18:21.477 }' 00:18:21.477 15:57:26 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:21.477 15:57:26 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:22.045 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.045 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:22.303 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:22.303 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:22.562 [2024-06-10 15:57:27.975162] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:22.562 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:22.562 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:22.563 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- 
# local expected_state=configuring 00:18:22.563 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:22.563 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:22.563 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:22.563 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:22.563 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:22.563 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:22.563 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:22.563 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:22.563 15:57:27 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:22.821 15:57:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:22.821 "name": "Existed_Raid", 00:18:22.821 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:22.821 "strip_size_kb": 64, 00:18:22.821 "state": "configuring", 00:18:22.821 "raid_level": "concat", 00:18:22.821 "superblock": false, 00:18:22.821 "num_base_bdevs": 4, 00:18:22.821 "num_base_bdevs_discovered": 3, 00:18:22.821 "num_base_bdevs_operational": 4, 00:18:22.821 "base_bdevs_list": [ 00:18:22.821 { 00:18:22.821 "name": "BaseBdev1", 00:18:22.821 "uuid": "8f22e44d-8f18-4b37-a149-e1c7cbd605d2", 00:18:22.821 "is_configured": true, 00:18:22.821 "data_offset": 0, 00:18:22.821 "data_size": 65536 00:18:22.821 }, 00:18:22.821 { 00:18:22.821 "name": null, 00:18:22.821 "uuid": "5c79bfc4-34d0-413f-9e7c-68af321593bf", 00:18:22.821 
"is_configured": false, 00:18:22.821 "data_offset": 0, 00:18:22.821 "data_size": 65536 00:18:22.821 }, 00:18:22.821 { 00:18:22.821 "name": "BaseBdev3", 00:18:22.821 "uuid": "4f2eaeb3-e193-408b-bab4-57aeca3efa39", 00:18:22.821 "is_configured": true, 00:18:22.821 "data_offset": 0, 00:18:22.821 "data_size": 65536 00:18:22.821 }, 00:18:22.821 { 00:18:22.821 "name": "BaseBdev4", 00:18:22.821 "uuid": "5c27ee7d-82ce-4de8-b593-29505223651a", 00:18:22.822 "is_configured": true, 00:18:22.822 "data_offset": 0, 00:18:22.822 "data_size": 65536 00:18:22.822 } 00:18:22.822 ] 00:18:22.822 }' 00:18:22.822 15:57:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:22.822 15:57:28 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:23.390 15:57:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:23.390 15:57:28 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:23.649 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:23.649 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:23.908 [2024-06-10 15:57:29.378967] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:23.908 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:23.908 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:23.908 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:23.908 15:57:29 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:23.908 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:23.908 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:23.908 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:23.908 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:23.908 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:23.908 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:23.908 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:23.908 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:24.167 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:24.167 "name": "Existed_Raid", 00:18:24.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:24.167 "strip_size_kb": 64, 00:18:24.167 "state": "configuring", 00:18:24.167 "raid_level": "concat", 00:18:24.167 "superblock": false, 00:18:24.167 "num_base_bdevs": 4, 00:18:24.167 "num_base_bdevs_discovered": 2, 00:18:24.167 "num_base_bdevs_operational": 4, 00:18:24.167 "base_bdevs_list": [ 00:18:24.167 { 00:18:24.167 "name": null, 00:18:24.167 "uuid": "8f22e44d-8f18-4b37-a149-e1c7cbd605d2", 00:18:24.167 "is_configured": false, 00:18:24.167 "data_offset": 0, 00:18:24.167 "data_size": 65536 00:18:24.167 }, 00:18:24.167 { 00:18:24.167 "name": null, 00:18:24.167 "uuid": "5c79bfc4-34d0-413f-9e7c-68af321593bf", 00:18:24.167 "is_configured": false, 00:18:24.167 "data_offset": 0, 00:18:24.167 "data_size": 65536 00:18:24.167 }, 
00:18:24.167 { 00:18:24.167 "name": "BaseBdev3", 00:18:24.167 "uuid": "4f2eaeb3-e193-408b-bab4-57aeca3efa39", 00:18:24.167 "is_configured": true, 00:18:24.167 "data_offset": 0, 00:18:24.167 "data_size": 65536 00:18:24.167 }, 00:18:24.167 { 00:18:24.167 "name": "BaseBdev4", 00:18:24.167 "uuid": "5c27ee7d-82ce-4de8-b593-29505223651a", 00:18:24.167 "is_configured": true, 00:18:24.167 "data_offset": 0, 00:18:24.167 "data_size": 65536 00:18:24.167 } 00:18:24.167 ] 00:18:24.167 }' 00:18:24.167 15:57:29 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:24.167 15:57:29 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:25.105 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.105 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:25.105 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:25.105 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:25.364 [2024-06-10 15:57:30.785138] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:25.364 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:25.364 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:25.364 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:25.364 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:25.364 15:57:30 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:25.364 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:25.364 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:25.364 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:25.364 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:25.364 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:25.364 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:25.364 15:57:30 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:25.623 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:25.623 "name": "Existed_Raid", 00:18:25.623 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:25.623 "strip_size_kb": 64, 00:18:25.623 "state": "configuring", 00:18:25.623 "raid_level": "concat", 00:18:25.623 "superblock": false, 00:18:25.623 "num_base_bdevs": 4, 00:18:25.623 "num_base_bdevs_discovered": 3, 00:18:25.623 "num_base_bdevs_operational": 4, 00:18:25.623 "base_bdevs_list": [ 00:18:25.623 { 00:18:25.623 "name": null, 00:18:25.623 "uuid": "8f22e44d-8f18-4b37-a149-e1c7cbd605d2", 00:18:25.623 "is_configured": false, 00:18:25.623 "data_offset": 0, 00:18:25.623 "data_size": 65536 00:18:25.623 }, 00:18:25.623 { 00:18:25.623 "name": "BaseBdev2", 00:18:25.623 "uuid": "5c79bfc4-34d0-413f-9e7c-68af321593bf", 00:18:25.623 "is_configured": true, 00:18:25.623 "data_offset": 0, 00:18:25.623 "data_size": 65536 00:18:25.623 }, 00:18:25.623 { 00:18:25.623 "name": "BaseBdev3", 00:18:25.623 "uuid": 
"4f2eaeb3-e193-408b-bab4-57aeca3efa39", 00:18:25.623 "is_configured": true, 00:18:25.623 "data_offset": 0, 00:18:25.623 "data_size": 65536 00:18:25.623 }, 00:18:25.623 { 00:18:25.623 "name": "BaseBdev4", 00:18:25.623 "uuid": "5c27ee7d-82ce-4de8-b593-29505223651a", 00:18:25.623 "is_configured": true, 00:18:25.623 "data_offset": 0, 00:18:25.623 "data_size": 65536 00:18:25.623 } 00:18:25.623 ] 00:18:25.623 }' 00:18:25.623 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:25.623 15:57:31 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:26.191 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.191 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:26.450 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:26.450 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:26.450 15:57:31 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:18:26.713 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 8f22e44d-8f18-4b37-a149-e1c7cbd605d2 00:18:26.972 [2024-06-10 15:57:32.440946] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:18:26.972 [2024-06-10 15:57:32.440990] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdf9b60 00:18:26.972 [2024-06-10 15:57:32.440998] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 262144, 
blocklen 512 00:18:26.972 [2024-06-10 15:57:32.441199] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc52980 00:18:26.972 [2024-06-10 15:57:32.441322] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdf9b60 00:18:26.972 [2024-06-10 15:57:32.441330] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xdf9b60 00:18:26.972 [2024-06-10 15:57:32.441492] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:26.972 NewBaseBdev 00:18:26.972 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:18:26.972 15:57:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:18:26.972 15:57:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:26.972 15:57:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:18:26.972 15:57:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:26.972 15:57:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:26.972 15:57:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:27.231 15:57:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:18:27.490 [ 00:18:27.490 { 00:18:27.490 "name": "NewBaseBdev", 00:18:27.490 "aliases": [ 00:18:27.490 "8f22e44d-8f18-4b37-a149-e1c7cbd605d2" 00:18:27.490 ], 00:18:27.490 "product_name": "Malloc disk", 00:18:27.490 "block_size": 512, 00:18:27.490 "num_blocks": 65536, 00:18:27.490 "uuid": "8f22e44d-8f18-4b37-a149-e1c7cbd605d2", 00:18:27.490 
"assigned_rate_limits": { 00:18:27.490 "rw_ios_per_sec": 0, 00:18:27.490 "rw_mbytes_per_sec": 0, 00:18:27.490 "r_mbytes_per_sec": 0, 00:18:27.490 "w_mbytes_per_sec": 0 00:18:27.490 }, 00:18:27.490 "claimed": true, 00:18:27.490 "claim_type": "exclusive_write", 00:18:27.490 "zoned": false, 00:18:27.490 "supported_io_types": { 00:18:27.490 "read": true, 00:18:27.490 "write": true, 00:18:27.490 "unmap": true, 00:18:27.490 "write_zeroes": true, 00:18:27.490 "flush": true, 00:18:27.491 "reset": true, 00:18:27.491 "compare": false, 00:18:27.491 "compare_and_write": false, 00:18:27.491 "abort": true, 00:18:27.491 "nvme_admin": false, 00:18:27.491 "nvme_io": false 00:18:27.491 }, 00:18:27.491 "memory_domains": [ 00:18:27.491 { 00:18:27.491 "dma_device_id": "system", 00:18:27.491 "dma_device_type": 1 00:18:27.491 }, 00:18:27.491 { 00:18:27.491 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:27.491 "dma_device_type": 2 00:18:27.491 } 00:18:27.491 ], 00:18:27.491 "driver_specific": {} 00:18:27.491 } 00:18:27.491 ] 00:18:27.491 15:57:32 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:18:27.491 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:18:27.491 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:27.491 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:27.491 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:27.491 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:27.491 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:27.491 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:27.491 15:57:32 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:27.491 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:27.491 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:27.491 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:27.491 15:57:32 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:27.779 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:27.779 "name": "Existed_Raid", 00:18:27.779 "uuid": "976945e6-ae79-45f1-aea5-6531627b91aa", 00:18:27.779 "strip_size_kb": 64, 00:18:27.779 "state": "online", 00:18:27.779 "raid_level": "concat", 00:18:27.779 "superblock": false, 00:18:27.779 "num_base_bdevs": 4, 00:18:27.779 "num_base_bdevs_discovered": 4, 00:18:27.779 "num_base_bdevs_operational": 4, 00:18:27.779 "base_bdevs_list": [ 00:18:27.779 { 00:18:27.779 "name": "NewBaseBdev", 00:18:27.779 "uuid": "8f22e44d-8f18-4b37-a149-e1c7cbd605d2", 00:18:27.779 "is_configured": true, 00:18:27.779 "data_offset": 0, 00:18:27.779 "data_size": 65536 00:18:27.779 }, 00:18:27.779 { 00:18:27.779 "name": "BaseBdev2", 00:18:27.779 "uuid": "5c79bfc4-34d0-413f-9e7c-68af321593bf", 00:18:27.779 "is_configured": true, 00:18:27.779 "data_offset": 0, 00:18:27.779 "data_size": 65536 00:18:27.779 }, 00:18:27.779 { 00:18:27.779 "name": "BaseBdev3", 00:18:27.779 "uuid": "4f2eaeb3-e193-408b-bab4-57aeca3efa39", 00:18:27.779 "is_configured": true, 00:18:27.779 "data_offset": 0, 00:18:27.779 "data_size": 65536 00:18:27.779 }, 00:18:27.779 { 00:18:27.779 "name": "BaseBdev4", 00:18:27.779 "uuid": "5c27ee7d-82ce-4de8-b593-29505223651a", 00:18:27.779 "is_configured": true, 00:18:27.779 "data_offset": 0, 
00:18:27.779 "data_size": 65536 00:18:27.779 } 00:18:27.779 ] 00:18:27.779 }' 00:18:27.779 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:27.779 15:57:33 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:28.347 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:18:28.347 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:28.347 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:28.347 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:28.347 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:28.347 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:18:28.347 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:28.347 15:57:33 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:28.606 [2024-06-10 15:57:34.089697] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:28.606 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:28.606 "name": "Existed_Raid", 00:18:28.606 "aliases": [ 00:18:28.606 "976945e6-ae79-45f1-aea5-6531627b91aa" 00:18:28.606 ], 00:18:28.606 "product_name": "Raid Volume", 00:18:28.606 "block_size": 512, 00:18:28.606 "num_blocks": 262144, 00:18:28.606 "uuid": "976945e6-ae79-45f1-aea5-6531627b91aa", 00:18:28.606 "assigned_rate_limits": { 00:18:28.606 "rw_ios_per_sec": 0, 00:18:28.606 "rw_mbytes_per_sec": 0, 00:18:28.606 "r_mbytes_per_sec": 0, 00:18:28.606 "w_mbytes_per_sec": 0 00:18:28.606 }, 00:18:28.606 
"claimed": false, 00:18:28.606 "zoned": false, 00:18:28.606 "supported_io_types": { 00:18:28.606 "read": true, 00:18:28.606 "write": true, 00:18:28.606 "unmap": true, 00:18:28.606 "write_zeroes": true, 00:18:28.606 "flush": true, 00:18:28.606 "reset": true, 00:18:28.606 "compare": false, 00:18:28.606 "compare_and_write": false, 00:18:28.606 "abort": false, 00:18:28.606 "nvme_admin": false, 00:18:28.606 "nvme_io": false 00:18:28.606 }, 00:18:28.606 "memory_domains": [ 00:18:28.607 { 00:18:28.607 "dma_device_id": "system", 00:18:28.607 "dma_device_type": 1 00:18:28.607 }, 00:18:28.607 { 00:18:28.607 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.607 "dma_device_type": 2 00:18:28.607 }, 00:18:28.607 { 00:18:28.607 "dma_device_id": "system", 00:18:28.607 "dma_device_type": 1 00:18:28.607 }, 00:18:28.607 { 00:18:28.607 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.607 "dma_device_type": 2 00:18:28.607 }, 00:18:28.607 { 00:18:28.607 "dma_device_id": "system", 00:18:28.607 "dma_device_type": 1 00:18:28.607 }, 00:18:28.607 { 00:18:28.607 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.607 "dma_device_type": 2 00:18:28.607 }, 00:18:28.607 { 00:18:28.607 "dma_device_id": "system", 00:18:28.607 "dma_device_type": 1 00:18:28.607 }, 00:18:28.607 { 00:18:28.607 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:28.607 "dma_device_type": 2 00:18:28.607 } 00:18:28.607 ], 00:18:28.607 "driver_specific": { 00:18:28.607 "raid": { 00:18:28.607 "uuid": "976945e6-ae79-45f1-aea5-6531627b91aa", 00:18:28.607 "strip_size_kb": 64, 00:18:28.607 "state": "online", 00:18:28.607 "raid_level": "concat", 00:18:28.607 "superblock": false, 00:18:28.607 "num_base_bdevs": 4, 00:18:28.607 "num_base_bdevs_discovered": 4, 00:18:28.607 "num_base_bdevs_operational": 4, 00:18:28.607 "base_bdevs_list": [ 00:18:28.607 { 00:18:28.607 "name": "NewBaseBdev", 00:18:28.607 "uuid": "8f22e44d-8f18-4b37-a149-e1c7cbd605d2", 00:18:28.607 "is_configured": true, 00:18:28.607 "data_offset": 0, 00:18:28.607 
"data_size": 65536 00:18:28.607 }, 00:18:28.607 { 00:18:28.607 "name": "BaseBdev2", 00:18:28.607 "uuid": "5c79bfc4-34d0-413f-9e7c-68af321593bf", 00:18:28.607 "is_configured": true, 00:18:28.607 "data_offset": 0, 00:18:28.607 "data_size": 65536 00:18:28.607 }, 00:18:28.607 { 00:18:28.607 "name": "BaseBdev3", 00:18:28.607 "uuid": "4f2eaeb3-e193-408b-bab4-57aeca3efa39", 00:18:28.607 "is_configured": true, 00:18:28.607 "data_offset": 0, 00:18:28.607 "data_size": 65536 00:18:28.607 }, 00:18:28.607 { 00:18:28.607 "name": "BaseBdev4", 00:18:28.607 "uuid": "5c27ee7d-82ce-4de8-b593-29505223651a", 00:18:28.607 "is_configured": true, 00:18:28.607 "data_offset": 0, 00:18:28.607 "data_size": 65536 00:18:28.607 } 00:18:28.607 ] 00:18:28.607 } 00:18:28.607 } 00:18:28.607 }' 00:18:28.607 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:28.867 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:18:28.867 BaseBdev2 00:18:28.867 BaseBdev3 00:18:28.867 BaseBdev4' 00:18:28.867 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:28.867 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:18:28.867 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:29.126 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:29.126 "name": "NewBaseBdev", 00:18:29.126 "aliases": [ 00:18:29.126 "8f22e44d-8f18-4b37-a149-e1c7cbd605d2" 00:18:29.126 ], 00:18:29.126 "product_name": "Malloc disk", 00:18:29.126 "block_size": 512, 00:18:29.126 "num_blocks": 65536, 00:18:29.126 "uuid": "8f22e44d-8f18-4b37-a149-e1c7cbd605d2", 00:18:29.126 "assigned_rate_limits": { 
00:18:29.126 "rw_ios_per_sec": 0, 00:18:29.126 "rw_mbytes_per_sec": 0, 00:18:29.126 "r_mbytes_per_sec": 0, 00:18:29.126 "w_mbytes_per_sec": 0 00:18:29.126 }, 00:18:29.126 "claimed": true, 00:18:29.126 "claim_type": "exclusive_write", 00:18:29.126 "zoned": false, 00:18:29.126 "supported_io_types": { 00:18:29.126 "read": true, 00:18:29.126 "write": true, 00:18:29.126 "unmap": true, 00:18:29.126 "write_zeroes": true, 00:18:29.126 "flush": true, 00:18:29.126 "reset": true, 00:18:29.126 "compare": false, 00:18:29.126 "compare_and_write": false, 00:18:29.126 "abort": true, 00:18:29.126 "nvme_admin": false, 00:18:29.126 "nvme_io": false 00:18:29.126 }, 00:18:29.126 "memory_domains": [ 00:18:29.126 { 00:18:29.126 "dma_device_id": "system", 00:18:29.126 "dma_device_type": 1 00:18:29.126 }, 00:18:29.126 { 00:18:29.126 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:29.126 "dma_device_type": 2 00:18:29.126 } 00:18:29.126 ], 00:18:29.126 "driver_specific": {} 00:18:29.126 }' 00:18:29.126 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:29.126 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:29.126 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:29.126 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:29.126 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:29.126 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:29.126 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:29.385 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:29.385 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:29.385 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq 
.dif_type 00:18:29.385 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:29.385 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:29.385 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:29.385 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:29.385 15:57:34 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:29.644 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:29.644 "name": "BaseBdev2", 00:18:29.644 "aliases": [ 00:18:29.644 "5c79bfc4-34d0-413f-9e7c-68af321593bf" 00:18:29.644 ], 00:18:29.644 "product_name": "Malloc disk", 00:18:29.644 "block_size": 512, 00:18:29.644 "num_blocks": 65536, 00:18:29.644 "uuid": "5c79bfc4-34d0-413f-9e7c-68af321593bf", 00:18:29.644 "assigned_rate_limits": { 00:18:29.644 "rw_ios_per_sec": 0, 00:18:29.644 "rw_mbytes_per_sec": 0, 00:18:29.644 "r_mbytes_per_sec": 0, 00:18:29.644 "w_mbytes_per_sec": 0 00:18:29.644 }, 00:18:29.644 "claimed": true, 00:18:29.644 "claim_type": "exclusive_write", 00:18:29.644 "zoned": false, 00:18:29.644 "supported_io_types": { 00:18:29.644 "read": true, 00:18:29.644 "write": true, 00:18:29.644 "unmap": true, 00:18:29.644 "write_zeroes": true, 00:18:29.644 "flush": true, 00:18:29.644 "reset": true, 00:18:29.644 "compare": false, 00:18:29.644 "compare_and_write": false, 00:18:29.644 "abort": true, 00:18:29.644 "nvme_admin": false, 00:18:29.644 "nvme_io": false 00:18:29.644 }, 00:18:29.644 "memory_domains": [ 00:18:29.644 { 00:18:29.644 "dma_device_id": "system", 00:18:29.644 "dma_device_type": 1 00:18:29.644 }, 00:18:29.644 { 00:18:29.644 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:29.644 "dma_device_type": 2 00:18:29.644 } 00:18:29.644 
], 00:18:29.644 "driver_specific": {} 00:18:29.644 }' 00:18:29.644 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:29.644 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:29.644 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:29.644 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:29.903 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:29.903 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:29.903 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:29.903 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:29.903 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:29.903 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:29.903 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:29.903 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:29.903 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:29.904 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:29.904 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:30.163 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:30.163 "name": "BaseBdev3", 00:18:30.163 "aliases": [ 00:18:30.163 "4f2eaeb3-e193-408b-bab4-57aeca3efa39" 00:18:30.163 ], 00:18:30.163 "product_name": "Malloc disk", 00:18:30.163 
"block_size": 512, 00:18:30.163 "num_blocks": 65536, 00:18:30.163 "uuid": "4f2eaeb3-e193-408b-bab4-57aeca3efa39", 00:18:30.163 "assigned_rate_limits": { 00:18:30.163 "rw_ios_per_sec": 0, 00:18:30.163 "rw_mbytes_per_sec": 0, 00:18:30.163 "r_mbytes_per_sec": 0, 00:18:30.163 "w_mbytes_per_sec": 0 00:18:30.163 }, 00:18:30.163 "claimed": true, 00:18:30.163 "claim_type": "exclusive_write", 00:18:30.163 "zoned": false, 00:18:30.163 "supported_io_types": { 00:18:30.163 "read": true, 00:18:30.163 "write": true, 00:18:30.163 "unmap": true, 00:18:30.163 "write_zeroes": true, 00:18:30.163 "flush": true, 00:18:30.163 "reset": true, 00:18:30.163 "compare": false, 00:18:30.163 "compare_and_write": false, 00:18:30.163 "abort": true, 00:18:30.163 "nvme_admin": false, 00:18:30.163 "nvme_io": false 00:18:30.163 }, 00:18:30.163 "memory_domains": [ 00:18:30.163 { 00:18:30.163 "dma_device_id": "system", 00:18:30.163 "dma_device_type": 1 00:18:30.163 }, 00:18:30.164 { 00:18:30.164 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.164 "dma_device_type": 2 00:18:30.164 } 00:18:30.164 ], 00:18:30.164 "driver_specific": {} 00:18:30.164 }' 00:18:30.164 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:30.423 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:30.423 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:30.423 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:30.423 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:30.423 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:30.423 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:30.423 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:30.681 15:57:35 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:30.681 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:30.681 15:57:35 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:30.681 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:30.681 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:30.681 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:30.681 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:30.940 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:30.940 "name": "BaseBdev4", 00:18:30.940 "aliases": [ 00:18:30.940 "5c27ee7d-82ce-4de8-b593-29505223651a" 00:18:30.940 ], 00:18:30.940 "product_name": "Malloc disk", 00:18:30.940 "block_size": 512, 00:18:30.940 "num_blocks": 65536, 00:18:30.940 "uuid": "5c27ee7d-82ce-4de8-b593-29505223651a", 00:18:30.940 "assigned_rate_limits": { 00:18:30.940 "rw_ios_per_sec": 0, 00:18:30.940 "rw_mbytes_per_sec": 0, 00:18:30.940 "r_mbytes_per_sec": 0, 00:18:30.940 "w_mbytes_per_sec": 0 00:18:30.940 }, 00:18:30.940 "claimed": true, 00:18:30.940 "claim_type": "exclusive_write", 00:18:30.940 "zoned": false, 00:18:30.940 "supported_io_types": { 00:18:30.940 "read": true, 00:18:30.940 "write": true, 00:18:30.940 "unmap": true, 00:18:30.940 "write_zeroes": true, 00:18:30.940 "flush": true, 00:18:30.940 "reset": true, 00:18:30.940 "compare": false, 00:18:30.940 "compare_and_write": false, 00:18:30.940 "abort": true, 00:18:30.940 "nvme_admin": false, 00:18:30.940 "nvme_io": false 00:18:30.940 }, 00:18:30.940 "memory_domains": [ 00:18:30.940 { 00:18:30.940 "dma_device_id": "system", 
00:18:30.940 "dma_device_type": 1 00:18:30.940 }, 00:18:30.940 { 00:18:30.940 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:30.940 "dma_device_type": 2 00:18:30.940 } 00:18:30.940 ], 00:18:30.940 "driver_specific": {} 00:18:30.940 }' 00:18:30.940 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:30.940 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:30.940 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:30.940 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:30.940 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:31.200 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:31.200 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:31.200 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:31.200 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:31.200 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:31.200 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:31.200 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:31.200 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:31.459 [2024-06-10 15:57:36.896960] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:31.459 [2024-06-10 15:57:36.896986] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:31.459 [2024-06-10 15:57:36.897039] bdev_raid.c: 
474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:31.459 [2024-06-10 15:57:36.897099] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:18:31.459 [2024-06-10 15:57:36.897108] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdf9b60 name Existed_Raid, state offline 00:18:31.459 15:57:36 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2724319 00:18:31.459 15:57:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 2724319 ']' 00:18:31.459 15:57:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 2724319 00:18:31.459 15:57:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:18:31.459 15:57:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:18:31.459 15:57:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2724319 00:18:31.459 15:57:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:18:31.459 15:57:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:18:31.459 15:57:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2724319' 00:18:31.459 killing process with pid 2724319 00:18:31.459 15:57:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 2724319 00:18:31.459 [2024-06-10 15:57:36.963714] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:18:31.459 15:57:36 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 2724319 00:18:31.718 [2024-06-10 15:57:36.997615] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:18:31.718 15:57:37 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:18:31.718 
00:18:31.719 real 0m33.153s 00:18:31.719 user 1m2.159s 00:18:31.719 sys 0m4.736s 00:18:31.719 15:57:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:18:31.719 15:57:37 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:18:31.719 ************************************ 00:18:31.719 END TEST raid_state_function_test 00:18:31.719 ************************************ 00:18:31.978 15:57:37 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test concat 4 true 00:18:31.978 15:57:37 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:18:31.978 15:57:37 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:18:31.978 15:57:37 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:18:31.978 ************************************ 00:18:31.978 START TEST raid_state_function_test_sb 00:18:31.978 ************************************ 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test concat 4 true 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=concat 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:31.978 
15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' concat '!=' raid1 ']' 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@231 -- # 
strip_size=64 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@232 -- # strip_size_create_arg='-z 64' 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2730830 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2730830' 00:18:31.978 Process raid pid: 2730830 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2730830 /var/tmp/spdk-raid.sock 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 2730830 ']' 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:18:31.978 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:18:31.978 15:57:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:31.978 [2024-06-10 15:57:37.328107] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:18:31.978 [2024-06-10 15:57:37.328159] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:18:31.978 [2024-06-10 15:57:37.426617] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:18:32.238 [2024-06-10 15:57:37.522082] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:18:32.238 [2024-06-10 15:57:37.578712] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:32.238 [2024-06-10 15:57:37.578742] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:18:32.803 15:57:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:18:32.803 15:57:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:18:32.803 15:57:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:33.062 [2024-06-10 15:57:38.517977] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:33.062 [2024-06-10 15:57:38.518017] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:33.062 [2024-06-10 15:57:38.518026] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:33.062 [2024-06-10 15:57:38.518035] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:33.062 [2024-06-10 15:57:38.518042] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:33.062 [2024-06-10 15:57:38.518049] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:33.062 
[2024-06-10 15:57:38.518056] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:33.062 [2024-06-10 15:57:38.518064] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:33.062 15:57:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:33.062 15:57:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:33.062 15:57:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:33.062 15:57:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:33.062 15:57:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:33.062 15:57:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:33.062 15:57:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:33.062 15:57:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:33.062 15:57:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:33.062 15:57:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:33.062 15:57:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:33.062 15:57:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:33.321 15:57:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:33.321 "name": "Existed_Raid", 00:18:33.321 "uuid": "e454ef41-3117-4def-882f-65ea848739dd", 00:18:33.321 
"strip_size_kb": 64, 00:18:33.321 "state": "configuring", 00:18:33.321 "raid_level": "concat", 00:18:33.321 "superblock": true, 00:18:33.321 "num_base_bdevs": 4, 00:18:33.321 "num_base_bdevs_discovered": 0, 00:18:33.321 "num_base_bdevs_operational": 4, 00:18:33.321 "base_bdevs_list": [ 00:18:33.321 { 00:18:33.321 "name": "BaseBdev1", 00:18:33.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:33.321 "is_configured": false, 00:18:33.321 "data_offset": 0, 00:18:33.321 "data_size": 0 00:18:33.321 }, 00:18:33.321 { 00:18:33.321 "name": "BaseBdev2", 00:18:33.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:33.321 "is_configured": false, 00:18:33.321 "data_offset": 0, 00:18:33.321 "data_size": 0 00:18:33.321 }, 00:18:33.321 { 00:18:33.321 "name": "BaseBdev3", 00:18:33.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:33.321 "is_configured": false, 00:18:33.321 "data_offset": 0, 00:18:33.321 "data_size": 0 00:18:33.321 }, 00:18:33.321 { 00:18:33.321 "name": "BaseBdev4", 00:18:33.321 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:33.321 "is_configured": false, 00:18:33.321 "data_offset": 0, 00:18:33.321 "data_size": 0 00:18:33.321 } 00:18:33.321 ] 00:18:33.321 }' 00:18:33.321 15:57:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:33.321 15:57:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:34.258 15:57:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:34.258 [2024-06-10 15:57:39.644836] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:34.258 [2024-06-10 15:57:39.644870] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ad8140 name Existed_Raid, state configuring 00:18:34.258 15:57:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:34.518 [2024-06-10 15:57:39.897531] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:34.518 [2024-06-10 15:57:39.897560] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:34.518 [2024-06-10 15:57:39.897568] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:34.518 [2024-06-10 15:57:39.897577] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:34.518 [2024-06-10 15:57:39.897584] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:34.518 [2024-06-10 15:57:39.897592] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:34.518 [2024-06-10 15:57:39.897599] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:34.518 [2024-06-10 15:57:39.897607] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:34.518 15:57:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:34.777 [2024-06-10 15:57:40.163764] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:34.777 BaseBdev1 00:18:34.777 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:18:34.777 15:57:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:18:34.777 15:57:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:34.777 15:57:40 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:34.777 15:57:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:34.777 15:57:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:34.777 15:57:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:35.036 15:57:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:35.295 [ 00:18:35.295 { 00:18:35.295 "name": "BaseBdev1", 00:18:35.295 "aliases": [ 00:18:35.295 "399f2dab-6d4e-4e23-802f-ff25afaab908" 00:18:35.295 ], 00:18:35.295 "product_name": "Malloc disk", 00:18:35.295 "block_size": 512, 00:18:35.295 "num_blocks": 65536, 00:18:35.295 "uuid": "399f2dab-6d4e-4e23-802f-ff25afaab908", 00:18:35.295 "assigned_rate_limits": { 00:18:35.295 "rw_ios_per_sec": 0, 00:18:35.295 "rw_mbytes_per_sec": 0, 00:18:35.295 "r_mbytes_per_sec": 0, 00:18:35.295 "w_mbytes_per_sec": 0 00:18:35.295 }, 00:18:35.295 "claimed": true, 00:18:35.295 "claim_type": "exclusive_write", 00:18:35.295 "zoned": false, 00:18:35.295 "supported_io_types": { 00:18:35.295 "read": true, 00:18:35.295 "write": true, 00:18:35.295 "unmap": true, 00:18:35.295 "write_zeroes": true, 00:18:35.295 "flush": true, 00:18:35.295 "reset": true, 00:18:35.295 "compare": false, 00:18:35.295 "compare_and_write": false, 00:18:35.295 "abort": true, 00:18:35.295 "nvme_admin": false, 00:18:35.295 "nvme_io": false 00:18:35.295 }, 00:18:35.295 "memory_domains": [ 00:18:35.295 { 00:18:35.295 "dma_device_id": "system", 00:18:35.295 "dma_device_type": 1 00:18:35.295 }, 00:18:35.295 { 00:18:35.295 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:35.295 
"dma_device_type": 2 00:18:35.295 } 00:18:35.295 ], 00:18:35.295 "driver_specific": {} 00:18:35.295 } 00:18:35.295 ] 00:18:35.295 15:57:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:35.295 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:35.295 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:35.295 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:35.295 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:35.295 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:35.295 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:35.295 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:35.295 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:35.295 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:35.295 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:35.295 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:35.295 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:35.555 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:35.555 "name": "Existed_Raid", 00:18:35.555 "uuid": "f87d9d7a-f8c2-4d5b-b56a-53fd31fd8412", 00:18:35.555 "strip_size_kb": 64, 
00:18:35.555 "state": "configuring", 00:18:35.555 "raid_level": "concat", 00:18:35.555 "superblock": true, 00:18:35.555 "num_base_bdevs": 4, 00:18:35.555 "num_base_bdevs_discovered": 1, 00:18:35.555 "num_base_bdevs_operational": 4, 00:18:35.555 "base_bdevs_list": [ 00:18:35.555 { 00:18:35.555 "name": "BaseBdev1", 00:18:35.555 "uuid": "399f2dab-6d4e-4e23-802f-ff25afaab908", 00:18:35.555 "is_configured": true, 00:18:35.555 "data_offset": 2048, 00:18:35.555 "data_size": 63488 00:18:35.555 }, 00:18:35.555 { 00:18:35.555 "name": "BaseBdev2", 00:18:35.555 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:35.555 "is_configured": false, 00:18:35.555 "data_offset": 0, 00:18:35.555 "data_size": 0 00:18:35.555 }, 00:18:35.555 { 00:18:35.555 "name": "BaseBdev3", 00:18:35.555 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:35.555 "is_configured": false, 00:18:35.555 "data_offset": 0, 00:18:35.555 "data_size": 0 00:18:35.555 }, 00:18:35.555 { 00:18:35.555 "name": "BaseBdev4", 00:18:35.555 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:35.555 "is_configured": false, 00:18:35.555 "data_offset": 0, 00:18:35.555 "data_size": 0 00:18:35.555 } 00:18:35.555 ] 00:18:35.555 }' 00:18:35.555 15:57:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:35.555 15:57:40 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:36.122 15:57:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:18:36.381 [2024-06-10 15:57:41.800157] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:18:36.381 [2024-06-10 15:57:41.800194] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ad79b0 name Existed_Raid, state configuring 00:18:36.381 15:57:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
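The `verify_raid_bdev_state` helper above pipes `bdev_raid_get_bdevs all` through `jq -r '.[] | select(.name == "Existed_Raid")'` and then compares fields against the expected values. A minimal Python sketch of that comparison, using a condensed copy of the JSON captured in the log (field names are taken verbatim from the RPC output; the helper itself is a hypothetical stand-in, not SPDK code):

```python
import json

# Condensed copy of the bdev_raid_get_bdevs output logged above.
raid_bdev_info = json.loads("""
{
  "name": "Existed_Raid",
  "strip_size_kb": 64,
  "state": "configuring",
  "raid_level": "concat",
  "superblock": true,
  "num_base_bdevs": 4,
  "num_base_bdevs_discovered": 1,
  "num_base_bdevs_operational": 4,
  "base_bdevs_list": [
    {"name": "BaseBdev1", "is_configured": true},
    {"name": "BaseBdev2", "is_configured": false},
    {"name": "BaseBdev3", "is_configured": false},
    {"name": "BaseBdev4", "is_configured": false}
  ]
}
""")

def verify_raid_bdev_state(info, expected_state, raid_level, strip_size, operational):
    # Mirrors the checks the shell helper performs on the RPC output.
    assert info["state"] == expected_state
    assert info["raid_level"] == raid_level
    assert info["strip_size_kb"] == strip_size
    assert info["num_base_bdevs_operational"] == operational
    # "discovered" counts base bdevs whose is_configured flag is set.
    return sum(1 for b in info["base_bdevs_list"] if b["is_configured"])

print(verify_raid_bdev_state(raid_bdev_info, "configuring", "concat", 64, 4))  # prints 1
```

At this point in the run only `BaseBdev1` exists, which is why the log shows `"num_base_bdevs_discovered": 1` against four operational slots.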
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:36.640 [2024-06-10 15:57:42.052877] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:36.640 [2024-06-10 15:57:42.054443] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:18:36.640 [2024-06-10 15:57:42.054474] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:18:36.640 [2024-06-10 15:57:42.054482] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:18:36.640 [2024-06-10 15:57:42.054490] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:18:36.640 [2024-06-10 15:57:42.054497] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:18:36.640 [2024-06-10 15:57:42.054506] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:18:36.640 15:57:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:18:36.640 15:57:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:36.640 15:57:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:36.640 15:57:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:36.640 15:57:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:36.640 15:57:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:36.640 15:57:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:36.640 15:57:42 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:36.640 15:57:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:36.640 15:57:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:36.640 15:57:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:36.640 15:57:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:36.640 15:57:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:36.640 15:57:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:36.899 15:57:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:36.899 "name": "Existed_Raid", 00:18:36.899 "uuid": "7ecbf028-8534-4bb5-b80a-adb60517ef95", 00:18:36.899 "strip_size_kb": 64, 00:18:36.899 "state": "configuring", 00:18:36.899 "raid_level": "concat", 00:18:36.899 "superblock": true, 00:18:36.899 "num_base_bdevs": 4, 00:18:36.899 "num_base_bdevs_discovered": 1, 00:18:36.899 "num_base_bdevs_operational": 4, 00:18:36.899 "base_bdevs_list": [ 00:18:36.899 { 00:18:36.899 "name": "BaseBdev1", 00:18:36.899 "uuid": "399f2dab-6d4e-4e23-802f-ff25afaab908", 00:18:36.899 "is_configured": true, 00:18:36.899 "data_offset": 2048, 00:18:36.899 "data_size": 63488 00:18:36.899 }, 00:18:36.899 { 00:18:36.899 "name": "BaseBdev2", 00:18:36.899 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:36.899 "is_configured": false, 00:18:36.899 "data_offset": 0, 00:18:36.899 "data_size": 0 00:18:36.899 }, 00:18:36.899 { 00:18:36.899 "name": "BaseBdev3", 00:18:36.899 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:36.899 "is_configured": false, 00:18:36.899 "data_offset": 0, 00:18:36.899 
"data_size": 0 00:18:36.899 }, 00:18:36.899 { 00:18:36.899 "name": "BaseBdev4", 00:18:36.899 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:36.899 "is_configured": false, 00:18:36.899 "data_offset": 0, 00:18:36.899 "data_size": 0 00:18:36.899 } 00:18:36.899 ] 00:18:36.899 }' 00:18:36.899 15:57:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:36.899 15:57:42 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:37.467 15:57:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:37.725 [2024-06-10 15:57:43.014588] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:37.725 BaseBdev2 00:18:37.725 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:18:37.725 15:57:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:18:37.725 15:57:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:37.725 15:57:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:37.725 15:57:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:37.725 15:57:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:37.725 15:57:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:37.984 15:57:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:18:38.242 [ 
00:18:38.242 { 00:18:38.242 "name": "BaseBdev2", 00:18:38.242 "aliases": [ 00:18:38.242 "34d3a591-a94c-4799-a3d5-373f74bfac41" 00:18:38.242 ], 00:18:38.242 "product_name": "Malloc disk", 00:18:38.242 "block_size": 512, 00:18:38.242 "num_blocks": 65536, 00:18:38.242 "uuid": "34d3a591-a94c-4799-a3d5-373f74bfac41", 00:18:38.242 "assigned_rate_limits": { 00:18:38.242 "rw_ios_per_sec": 0, 00:18:38.242 "rw_mbytes_per_sec": 0, 00:18:38.242 "r_mbytes_per_sec": 0, 00:18:38.242 "w_mbytes_per_sec": 0 00:18:38.242 }, 00:18:38.242 "claimed": true, 00:18:38.242 "claim_type": "exclusive_write", 00:18:38.242 "zoned": false, 00:18:38.242 "supported_io_types": { 00:18:38.242 "read": true, 00:18:38.242 "write": true, 00:18:38.242 "unmap": true, 00:18:38.242 "write_zeroes": true, 00:18:38.243 "flush": true, 00:18:38.243 "reset": true, 00:18:38.243 "compare": false, 00:18:38.243 "compare_and_write": false, 00:18:38.243 "abort": true, 00:18:38.243 "nvme_admin": false, 00:18:38.243 "nvme_io": false 00:18:38.243 }, 00:18:38.243 "memory_domains": [ 00:18:38.243 { 00:18:38.243 "dma_device_id": "system", 00:18:38.243 "dma_device_type": 1 00:18:38.243 }, 00:18:38.243 { 00:18:38.243 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:38.243 "dma_device_type": 2 00:18:38.243 } 00:18:38.243 ], 00:18:38.243 "driver_specific": {} 00:18:38.243 } 00:18:38.243 ] 00:18:38.243 15:57:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:38.243 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:38.243 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:38.243 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:38.243 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:38.243 15:57:43 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:38.243 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:38.243 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:38.243 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:38.243 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:38.243 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:38.243 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:38.243 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:38.243 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:38.243 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:38.502 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:38.502 "name": "Existed_Raid", 00:18:38.502 "uuid": "7ecbf028-8534-4bb5-b80a-adb60517ef95", 00:18:38.502 "strip_size_kb": 64, 00:18:38.502 "state": "configuring", 00:18:38.502 "raid_level": "concat", 00:18:38.502 "superblock": true, 00:18:38.502 "num_base_bdevs": 4, 00:18:38.502 "num_base_bdevs_discovered": 2, 00:18:38.502 "num_base_bdevs_operational": 4, 00:18:38.502 "base_bdevs_list": [ 00:18:38.502 { 00:18:38.502 "name": "BaseBdev1", 00:18:38.502 "uuid": "399f2dab-6d4e-4e23-802f-ff25afaab908", 00:18:38.502 "is_configured": true, 00:18:38.502 "data_offset": 2048, 00:18:38.502 "data_size": 63488 00:18:38.502 }, 00:18:38.502 { 00:18:38.502 
"name": "BaseBdev2", 00:18:38.502 "uuid": "34d3a591-a94c-4799-a3d5-373f74bfac41", 00:18:38.502 "is_configured": true, 00:18:38.502 "data_offset": 2048, 00:18:38.502 "data_size": 63488 00:18:38.502 }, 00:18:38.502 { 00:18:38.502 "name": "BaseBdev3", 00:18:38.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:38.502 "is_configured": false, 00:18:38.502 "data_offset": 0, 00:18:38.502 "data_size": 0 00:18:38.502 }, 00:18:38.502 { 00:18:38.502 "name": "BaseBdev4", 00:18:38.502 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:38.502 "is_configured": false, 00:18:38.502 "data_offset": 0, 00:18:38.502 "data_size": 0 00:18:38.502 } 00:18:38.502 ] 00:18:38.502 }' 00:18:38.502 15:57:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:38.502 15:57:43 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:39.070 15:57:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:39.328 [2024-06-10 15:57:44.638269] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:39.328 BaseBdev3 00:18:39.328 15:57:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:18:39.328 15:57:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:18:39.328 15:57:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:39.328 15:57:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:39.328 15:57:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:39.328 15:57:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:39.328 15:57:44 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:39.587 15:57:44 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:39.846 [ 00:18:39.846 { 00:18:39.846 "name": "BaseBdev3", 00:18:39.846 "aliases": [ 00:18:39.846 "b65a4f8c-7400-499d-808a-011e8e89a878" 00:18:39.846 ], 00:18:39.846 "product_name": "Malloc disk", 00:18:39.846 "block_size": 512, 00:18:39.846 "num_blocks": 65536, 00:18:39.846 "uuid": "b65a4f8c-7400-499d-808a-011e8e89a878", 00:18:39.846 "assigned_rate_limits": { 00:18:39.846 "rw_ios_per_sec": 0, 00:18:39.846 "rw_mbytes_per_sec": 0, 00:18:39.846 "r_mbytes_per_sec": 0, 00:18:39.846 "w_mbytes_per_sec": 0 00:18:39.846 }, 00:18:39.846 "claimed": true, 00:18:39.846 "claim_type": "exclusive_write", 00:18:39.846 "zoned": false, 00:18:39.846 "supported_io_types": { 00:18:39.846 "read": true, 00:18:39.846 "write": true, 00:18:39.846 "unmap": true, 00:18:39.846 "write_zeroes": true, 00:18:39.846 "flush": true, 00:18:39.846 "reset": true, 00:18:39.846 "compare": false, 00:18:39.846 "compare_and_write": false, 00:18:39.846 "abort": true, 00:18:39.846 "nvme_admin": false, 00:18:39.846 "nvme_io": false 00:18:39.846 }, 00:18:39.846 "memory_domains": [ 00:18:39.846 { 00:18:39.846 "dma_device_id": "system", 00:18:39.846 "dma_device_type": 1 00:18:39.846 }, 00:18:39.846 { 00:18:39.846 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:39.846 "dma_device_type": 2 00:18:39.846 } 00:18:39.846 ], 00:18:39.846 "driver_specific": {} 00:18:39.846 } 00:18:39.846 ] 00:18:39.846 15:57:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:39.846 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:39.846 15:57:45 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:39.846 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:39.846 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:39.846 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:39.846 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:39.846 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:39.846 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:39.846 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:39.846 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:39.846 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:39.846 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:39.846 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:39.846 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:40.106 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:40.106 "name": "Existed_Raid", 00:18:40.106 "uuid": "7ecbf028-8534-4bb5-b80a-adb60517ef95", 00:18:40.106 "strip_size_kb": 64, 00:18:40.106 "state": "configuring", 00:18:40.106 "raid_level": "concat", 00:18:40.106 "superblock": true, 00:18:40.106 "num_base_bdevs": 4, 00:18:40.106 
"num_base_bdevs_discovered": 3, 00:18:40.106 "num_base_bdevs_operational": 4, 00:18:40.106 "base_bdevs_list": [ 00:18:40.106 { 00:18:40.106 "name": "BaseBdev1", 00:18:40.106 "uuid": "399f2dab-6d4e-4e23-802f-ff25afaab908", 00:18:40.106 "is_configured": true, 00:18:40.106 "data_offset": 2048, 00:18:40.106 "data_size": 63488 00:18:40.106 }, 00:18:40.106 { 00:18:40.106 "name": "BaseBdev2", 00:18:40.106 "uuid": "34d3a591-a94c-4799-a3d5-373f74bfac41", 00:18:40.106 "is_configured": true, 00:18:40.106 "data_offset": 2048, 00:18:40.106 "data_size": 63488 00:18:40.106 }, 00:18:40.106 { 00:18:40.106 "name": "BaseBdev3", 00:18:40.106 "uuid": "b65a4f8c-7400-499d-808a-011e8e89a878", 00:18:40.106 "is_configured": true, 00:18:40.106 "data_offset": 2048, 00:18:40.106 "data_size": 63488 00:18:40.106 }, 00:18:40.106 { 00:18:40.106 "name": "BaseBdev4", 00:18:40.106 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:40.106 "is_configured": false, 00:18:40.106 "data_offset": 0, 00:18:40.106 "data_size": 0 00:18:40.106 } 00:18:40.106 ] 00:18:40.106 }' 00:18:40.106 15:57:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:40.106 15:57:45 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:40.674 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:40.933 [2024-06-10 15:57:46.249823] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:40.933 [2024-06-10 15:57:46.249990] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ad8ac0 00:18:40.933 [2024-06-10 15:57:46.250002] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:18:40.933 [2024-06-10 15:57:46.250186] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c7dba0 00:18:40.933 [2024-06-10 
15:57:46.250314] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ad8ac0 00:18:40.933 [2024-06-10 15:57:46.250323] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1ad8ac0 00:18:40.933 [2024-06-10 15:57:46.250419] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:18:40.933 BaseBdev4 00:18:40.933 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:18:40.933 15:57:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:18:40.933 15:57:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:40.933 15:57:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:40.933 15:57:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:40.933 15:57:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:40.933 15:57:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:41.193 15:57:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:18:41.453 [ 00:18:41.453 { 00:18:41.453 "name": "BaseBdev4", 00:18:41.453 "aliases": [ 00:18:41.453 "a88ec11d-c8e3-4fac-9fb8-b43bc27eda4c" 00:18:41.453 ], 00:18:41.453 "product_name": "Malloc disk", 00:18:41.453 "block_size": 512, 00:18:41.453 "num_blocks": 65536, 00:18:41.453 "uuid": "a88ec11d-c8e3-4fac-9fb8-b43bc27eda4c", 00:18:41.453 "assigned_rate_limits": { 00:18:41.453 "rw_ios_per_sec": 0, 00:18:41.453 "rw_mbytes_per_sec": 0, 00:18:41.453 "r_mbytes_per_sec": 0, 00:18:41.453 
"w_mbytes_per_sec": 0 00:18:41.453 }, 00:18:41.453 "claimed": true, 00:18:41.453 "claim_type": "exclusive_write", 00:18:41.453 "zoned": false, 00:18:41.453 "supported_io_types": { 00:18:41.453 "read": true, 00:18:41.453 "write": true, 00:18:41.453 "unmap": true, 00:18:41.453 "write_zeroes": true, 00:18:41.453 "flush": true, 00:18:41.453 "reset": true, 00:18:41.453 "compare": false, 00:18:41.453 "compare_and_write": false, 00:18:41.453 "abort": true, 00:18:41.453 "nvme_admin": false, 00:18:41.453 "nvme_io": false 00:18:41.453 }, 00:18:41.453 "memory_domains": [ 00:18:41.453 { 00:18:41.453 "dma_device_id": "system", 00:18:41.453 "dma_device_type": 1 00:18:41.453 }, 00:18:41.453 { 00:18:41.453 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:41.453 "dma_device_type": 2 00:18:41.453 } 00:18:41.453 ], 00:18:41.453 "driver_specific": {} 00:18:41.453 } 00:18:41.453 ] 00:18:41.453 15:57:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:41.453 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:18:41.453 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:18:41.453 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:18:41.453 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:41.453 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:18:41.453 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:41.453 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:41.453 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:41.453 15:57:46 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:41.453 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:41.453 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:41.453 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:41.453 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:41.453 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:41.712 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:41.712 "name": "Existed_Raid", 00:18:41.712 "uuid": "7ecbf028-8534-4bb5-b80a-adb60517ef95", 00:18:41.712 "strip_size_kb": 64, 00:18:41.713 "state": "online", 00:18:41.713 "raid_level": "concat", 00:18:41.713 "superblock": true, 00:18:41.713 "num_base_bdevs": 4, 00:18:41.713 "num_base_bdevs_discovered": 4, 00:18:41.713 "num_base_bdevs_operational": 4, 00:18:41.713 "base_bdevs_list": [ 00:18:41.713 { 00:18:41.713 "name": "BaseBdev1", 00:18:41.713 "uuid": "399f2dab-6d4e-4e23-802f-ff25afaab908", 00:18:41.713 "is_configured": true, 00:18:41.713 "data_offset": 2048, 00:18:41.713 "data_size": 63488 00:18:41.713 }, 00:18:41.713 { 00:18:41.713 "name": "BaseBdev2", 00:18:41.713 "uuid": "34d3a591-a94c-4799-a3d5-373f74bfac41", 00:18:41.713 "is_configured": true, 00:18:41.713 "data_offset": 2048, 00:18:41.713 "data_size": 63488 00:18:41.713 }, 00:18:41.713 { 00:18:41.713 "name": "BaseBdev3", 00:18:41.713 "uuid": "b65a4f8c-7400-499d-808a-011e8e89a878", 00:18:41.713 "is_configured": true, 00:18:41.713 "data_offset": 2048, 00:18:41.713 "data_size": 63488 00:18:41.713 }, 00:18:41.713 { 00:18:41.713 "name": "BaseBdev4", 00:18:41.713 "uuid": 
"a88ec11d-c8e3-4fac-9fb8-b43bc27eda4c", 00:18:41.713 "is_configured": true, 00:18:41.713 "data_offset": 2048, 00:18:41.713 "data_size": 63488 00:18:41.713 } 00:18:41.713 ] 00:18:41.713 }' 00:18:41.713 15:57:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:41.713 15:57:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:42.280 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:18:42.280 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:18:42.280 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:18:42.280 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:18:42.280 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:18:42.280 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:18:42.280 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:18:42.280 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:18:42.280 [2024-06-10 15:57:47.778255] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:18:42.539 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:18:42.539 "name": "Existed_Raid", 00:18:42.539 "aliases": [ 00:18:42.539 "7ecbf028-8534-4bb5-b80a-adb60517ef95" 00:18:42.539 ], 00:18:42.539 "product_name": "Raid Volume", 00:18:42.539 "block_size": 512, 00:18:42.539 "num_blocks": 253952, 00:18:42.539 "uuid": "7ecbf028-8534-4bb5-b80a-adb60517ef95", 00:18:42.539 "assigned_rate_limits": { 00:18:42.539 "rw_ios_per_sec": 
0, 00:18:42.539 "rw_mbytes_per_sec": 0, 00:18:42.539 "r_mbytes_per_sec": 0, 00:18:42.539 "w_mbytes_per_sec": 0 00:18:42.539 }, 00:18:42.539 "claimed": false, 00:18:42.539 "zoned": false, 00:18:42.539 "supported_io_types": { 00:18:42.539 "read": true, 00:18:42.539 "write": true, 00:18:42.539 "unmap": true, 00:18:42.539 "write_zeroes": true, 00:18:42.539 "flush": true, 00:18:42.539 "reset": true, 00:18:42.539 "compare": false, 00:18:42.539 "compare_and_write": false, 00:18:42.539 "abort": false, 00:18:42.539 "nvme_admin": false, 00:18:42.539 "nvme_io": false 00:18:42.539 }, 00:18:42.539 "memory_domains": [ 00:18:42.539 { 00:18:42.539 "dma_device_id": "system", 00:18:42.539 "dma_device_type": 1 00:18:42.539 }, 00:18:42.539 { 00:18:42.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:42.539 "dma_device_type": 2 00:18:42.539 }, 00:18:42.539 { 00:18:42.539 "dma_device_id": "system", 00:18:42.539 "dma_device_type": 1 00:18:42.539 }, 00:18:42.539 { 00:18:42.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:42.539 "dma_device_type": 2 00:18:42.539 }, 00:18:42.539 { 00:18:42.539 "dma_device_id": "system", 00:18:42.539 "dma_device_type": 1 00:18:42.539 }, 00:18:42.539 { 00:18:42.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:42.539 "dma_device_type": 2 00:18:42.539 }, 00:18:42.539 { 00:18:42.539 "dma_device_id": "system", 00:18:42.539 "dma_device_type": 1 00:18:42.539 }, 00:18:42.539 { 00:18:42.539 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:42.539 "dma_device_type": 2 00:18:42.539 } 00:18:42.539 ], 00:18:42.539 "driver_specific": { 00:18:42.539 "raid": { 00:18:42.539 "uuid": "7ecbf028-8534-4bb5-b80a-adb60517ef95", 00:18:42.539 "strip_size_kb": 64, 00:18:42.539 "state": "online", 00:18:42.539 "raid_level": "concat", 00:18:42.539 "superblock": true, 00:18:42.539 "num_base_bdevs": 4, 00:18:42.539 "num_base_bdevs_discovered": 4, 00:18:42.539 "num_base_bdevs_operational": 4, 00:18:42.539 "base_bdevs_list": [ 00:18:42.539 { 00:18:42.539 "name": "BaseBdev1", 
00:18:42.539 "uuid": "399f2dab-6d4e-4e23-802f-ff25afaab908", 00:18:42.539 "is_configured": true, 00:18:42.539 "data_offset": 2048, 00:18:42.539 "data_size": 63488 00:18:42.539 }, 00:18:42.539 { 00:18:42.539 "name": "BaseBdev2", 00:18:42.539 "uuid": "34d3a591-a94c-4799-a3d5-373f74bfac41", 00:18:42.539 "is_configured": true, 00:18:42.539 "data_offset": 2048, 00:18:42.539 "data_size": 63488 00:18:42.539 }, 00:18:42.539 { 00:18:42.539 "name": "BaseBdev3", 00:18:42.539 "uuid": "b65a4f8c-7400-499d-808a-011e8e89a878", 00:18:42.539 "is_configured": true, 00:18:42.539 "data_offset": 2048, 00:18:42.539 "data_size": 63488 00:18:42.539 }, 00:18:42.539 { 00:18:42.539 "name": "BaseBdev4", 00:18:42.539 "uuid": "a88ec11d-c8e3-4fac-9fb8-b43bc27eda4c", 00:18:42.539 "is_configured": true, 00:18:42.539 "data_offset": 2048, 00:18:42.539 "data_size": 63488 00:18:42.539 } 00:18:42.539 ] 00:18:42.539 } 00:18:42.539 } 00:18:42.539 }' 00:18:42.539 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:18:42.539 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:18:42.539 BaseBdev2 00:18:42.539 BaseBdev3 00:18:42.539 BaseBdev4' 00:18:42.539 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:42.539 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:18:42.539 15:57:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:42.798 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:42.798 "name": "BaseBdev1", 00:18:42.798 "aliases": [ 00:18:42.798 "399f2dab-6d4e-4e23-802f-ff25afaab908" 00:18:42.798 ], 00:18:42.798 "product_name": "Malloc disk", 
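The `base_bdev_names` extraction above uses `jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name'` on the Raid Volume dump. A small Python equivalent of that selection, against a skeleton of the `bdev_get_bdevs -b Existed_Raid` JSON from the log (only the fields used here are kept):

```python
import json

# Skeleton of the Raid Volume JSON dumped above; values copied from the log.
existed_raid = json.loads("""
{
  "name": "Existed_Raid",
  "product_name": "Raid Volume",
  "block_size": 512,
  "num_blocks": 253952,
  "driver_specific": {
    "raid": {
      "base_bdevs_list": [
        {"name": "BaseBdev1", "is_configured": true},
        {"name": "BaseBdev2", "is_configured": true},
        {"name": "BaseBdev3", "is_configured": true},
        {"name": "BaseBdev4", "is_configured": true}
      ]
    }
  }
}
""")

# Same selection as the jq filter: keep names of configured base bdevs.
names = [b["name"]
         for b in existed_raid["driver_specific"]["raid"]["base_bdevs_list"]
         if b["is_configured"]]
print(" ".join(names))  # prints BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4

# Sanity check consistent with the log: four base bdevs contribute
# 63488 data blocks each to the concat volume (4 * 63488 == 253952).
assert existed_raid["num_blocks"] == 4 * 63488
```

This matches the `blockcnt 253952, blocklen 512` debug line emitted when the raid bdev came online.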
00:18:42.798 "block_size": 512, 00:18:42.798 "num_blocks": 65536, 00:18:42.798 "uuid": "399f2dab-6d4e-4e23-802f-ff25afaab908", 00:18:42.798 "assigned_rate_limits": { 00:18:42.798 "rw_ios_per_sec": 0, 00:18:42.798 "rw_mbytes_per_sec": 0, 00:18:42.798 "r_mbytes_per_sec": 0, 00:18:42.798 "w_mbytes_per_sec": 0 00:18:42.798 }, 00:18:42.798 "claimed": true, 00:18:42.798 "claim_type": "exclusive_write", 00:18:42.799 "zoned": false, 00:18:42.799 "supported_io_types": { 00:18:42.799 "read": true, 00:18:42.799 "write": true, 00:18:42.799 "unmap": true, 00:18:42.799 "write_zeroes": true, 00:18:42.799 "flush": true, 00:18:42.799 "reset": true, 00:18:42.799 "compare": false, 00:18:42.799 "compare_and_write": false, 00:18:42.799 "abort": true, 00:18:42.799 "nvme_admin": false, 00:18:42.799 "nvme_io": false 00:18:42.799 }, 00:18:42.799 "memory_domains": [ 00:18:42.799 { 00:18:42.799 "dma_device_id": "system", 00:18:42.799 "dma_device_type": 1 00:18:42.799 }, 00:18:42.799 { 00:18:42.799 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:42.799 "dma_device_type": 2 00:18:42.799 } 00:18:42.799 ], 00:18:42.799 "driver_specific": {} 00:18:42.799 }' 00:18:42.799 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:42.799 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:42.799 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:42.799 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:42.799 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:42.799 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:42.799 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:43.058 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:18:43.058 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:43.058 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:43.058 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:43.058 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:43.058 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:43.058 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:18:43.058 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:43.317 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:43.317 "name": "BaseBdev2", 00:18:43.317 "aliases": [ 00:18:43.317 "34d3a591-a94c-4799-a3d5-373f74bfac41" 00:18:43.317 ], 00:18:43.317 "product_name": "Malloc disk", 00:18:43.317 "block_size": 512, 00:18:43.317 "num_blocks": 65536, 00:18:43.317 "uuid": "34d3a591-a94c-4799-a3d5-373f74bfac41", 00:18:43.317 "assigned_rate_limits": { 00:18:43.317 "rw_ios_per_sec": 0, 00:18:43.317 "rw_mbytes_per_sec": 0, 00:18:43.317 "r_mbytes_per_sec": 0, 00:18:43.317 "w_mbytes_per_sec": 0 00:18:43.317 }, 00:18:43.317 "claimed": true, 00:18:43.317 "claim_type": "exclusive_write", 00:18:43.317 "zoned": false, 00:18:43.317 "supported_io_types": { 00:18:43.317 "read": true, 00:18:43.317 "write": true, 00:18:43.317 "unmap": true, 00:18:43.317 "write_zeroes": true, 00:18:43.318 "flush": true, 00:18:43.318 "reset": true, 00:18:43.318 "compare": false, 00:18:43.318 "compare_and_write": false, 00:18:43.318 "abort": true, 00:18:43.318 "nvme_admin": false, 00:18:43.318 "nvme_io": false 00:18:43.318 }, 00:18:43.318 "memory_domains": [ 00:18:43.318 { 
00:18:43.318 "dma_device_id": "system", 00:18:43.318 "dma_device_type": 1 00:18:43.318 }, 00:18:43.318 { 00:18:43.318 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:43.318 "dma_device_type": 2 00:18:43.318 } 00:18:43.318 ], 00:18:43.318 "driver_specific": {} 00:18:43.318 }' 00:18:43.318 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:43.318 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:43.318 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:43.318 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:43.581 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:43.581 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:43.581 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:43.581 15:57:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:43.581 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:43.581 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:43.581 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:43.907 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:43.907 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:43.907 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:18:43.907 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:43.907 15:57:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:43.907 "name": "BaseBdev3", 00:18:43.907 "aliases": [ 00:18:43.907 "b65a4f8c-7400-499d-808a-011e8e89a878" 00:18:43.907 ], 00:18:43.907 "product_name": "Malloc disk", 00:18:43.907 "block_size": 512, 00:18:43.907 "num_blocks": 65536, 00:18:43.907 "uuid": "b65a4f8c-7400-499d-808a-011e8e89a878", 00:18:43.907 "assigned_rate_limits": { 00:18:43.907 "rw_ios_per_sec": 0, 00:18:43.907 "rw_mbytes_per_sec": 0, 00:18:43.907 "r_mbytes_per_sec": 0, 00:18:43.907 "w_mbytes_per_sec": 0 00:18:43.907 }, 00:18:43.907 "claimed": true, 00:18:43.907 "claim_type": "exclusive_write", 00:18:43.907 "zoned": false, 00:18:43.907 "supported_io_types": { 00:18:43.907 "read": true, 00:18:43.907 "write": true, 00:18:43.907 "unmap": true, 00:18:43.907 "write_zeroes": true, 00:18:43.907 "flush": true, 00:18:43.907 "reset": true, 00:18:43.907 "compare": false, 00:18:43.907 "compare_and_write": false, 00:18:43.907 "abort": true, 00:18:43.907 "nvme_admin": false, 00:18:43.907 "nvme_io": false 00:18:43.907 }, 00:18:43.907 "memory_domains": [ 00:18:43.907 { 00:18:43.907 "dma_device_id": "system", 00:18:43.907 "dma_device_type": 1 00:18:43.907 }, 00:18:43.907 { 00:18:43.907 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:43.907 "dma_device_type": 2 00:18:43.907 } 00:18:43.907 ], 00:18:43.907 "driver_specific": {} 00:18:43.907 }' 00:18:43.907 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:43.907 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:44.166 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:44.166 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:44.166 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:44.166 15:57:49 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:44.166 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:44.166 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:44.166 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:44.166 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:44.425 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:44.425 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:44.425 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:18:44.425 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:18:44.425 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:18:44.684 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:18:44.684 "name": "BaseBdev4", 00:18:44.684 "aliases": [ 00:18:44.684 "a88ec11d-c8e3-4fac-9fb8-b43bc27eda4c" 00:18:44.684 ], 00:18:44.684 "product_name": "Malloc disk", 00:18:44.684 "block_size": 512, 00:18:44.684 "num_blocks": 65536, 00:18:44.684 "uuid": "a88ec11d-c8e3-4fac-9fb8-b43bc27eda4c", 00:18:44.684 "assigned_rate_limits": { 00:18:44.684 "rw_ios_per_sec": 0, 00:18:44.684 "rw_mbytes_per_sec": 0, 00:18:44.684 "r_mbytes_per_sec": 0, 00:18:44.684 "w_mbytes_per_sec": 0 00:18:44.684 }, 00:18:44.684 "claimed": true, 00:18:44.684 "claim_type": "exclusive_write", 00:18:44.684 "zoned": false, 00:18:44.684 "supported_io_types": { 00:18:44.684 "read": true, 00:18:44.684 "write": true, 00:18:44.684 "unmap": true, 00:18:44.684 "write_zeroes": true, 00:18:44.684 "flush": 
true, 00:18:44.684 "reset": true, 00:18:44.684 "compare": false, 00:18:44.684 "compare_and_write": false, 00:18:44.684 "abort": true, 00:18:44.684 "nvme_admin": false, 00:18:44.684 "nvme_io": false 00:18:44.684 }, 00:18:44.684 "memory_domains": [ 00:18:44.684 { 00:18:44.684 "dma_device_id": "system", 00:18:44.684 "dma_device_type": 1 00:18:44.684 }, 00:18:44.684 { 00:18:44.684 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:44.684 "dma_device_type": 2 00:18:44.684 } 00:18:44.684 ], 00:18:44.684 "driver_specific": {} 00:18:44.684 }' 00:18:44.684 15:57:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:44.684 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:18:44.684 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:18:44.684 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:44.684 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:18:44.684 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:18:44.684 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:44.942 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:18:44.942 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:18:44.942 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:44.942 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:18:44.942 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:18:44.942 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_delete BaseBdev1 00:18:45.202 [2024-06-10 15:57:50.601565] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:45.202 [2024-06-10 15:57:50.601589] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:18:45.202 [2024-06-10 15:57:50.601634] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:18:45.202 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:18:45.202 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy concat 00:18:45.202 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:18:45.202 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@215 -- # return 1 00:18:45.202 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@277 -- # expected_state=offline 00:18:45.202 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid offline concat 64 3 00:18:45.202 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:45.202 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=offline 00:18:45.202 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:45.202 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:45.202 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:18:45.202 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:45.202 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:45.202 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:18:45.202 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:45.202 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:45.202 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:45.461 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:45.461 "name": "Existed_Raid", 00:18:45.461 "uuid": "7ecbf028-8534-4bb5-b80a-adb60517ef95", 00:18:45.461 "strip_size_kb": 64, 00:18:45.461 "state": "offline", 00:18:45.461 "raid_level": "concat", 00:18:45.461 "superblock": true, 00:18:45.461 "num_base_bdevs": 4, 00:18:45.461 "num_base_bdevs_discovered": 3, 00:18:45.461 "num_base_bdevs_operational": 3, 00:18:45.461 "base_bdevs_list": [ 00:18:45.461 { 00:18:45.461 "name": null, 00:18:45.461 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:45.461 "is_configured": false, 00:18:45.461 "data_offset": 2048, 00:18:45.461 "data_size": 63488 00:18:45.461 }, 00:18:45.461 { 00:18:45.461 "name": "BaseBdev2", 00:18:45.461 "uuid": "34d3a591-a94c-4799-a3d5-373f74bfac41", 00:18:45.461 "is_configured": true, 00:18:45.461 "data_offset": 2048, 00:18:45.461 "data_size": 63488 00:18:45.461 }, 00:18:45.461 { 00:18:45.461 "name": "BaseBdev3", 00:18:45.461 "uuid": "b65a4f8c-7400-499d-808a-011e8e89a878", 00:18:45.461 "is_configured": true, 00:18:45.461 "data_offset": 2048, 00:18:45.461 "data_size": 63488 00:18:45.461 }, 00:18:45.461 { 00:18:45.461 "name": "BaseBdev4", 00:18:45.461 "uuid": "a88ec11d-c8e3-4fac-9fb8-b43bc27eda4c", 00:18:45.461 "is_configured": true, 00:18:45.461 "data_offset": 2048, 00:18:45.461 "data_size": 63488 00:18:45.461 } 00:18:45.461 ] 00:18:45.461 }' 00:18:45.461 15:57:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # 
xtrace_disable 00:18:45.461 15:57:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:46.029 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:18:46.029 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:46.029 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.029 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:46.288 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:46.288 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:46.288 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:18:46.288 [2024-06-10 15:57:51.797881] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:46.546 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:46.546 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:46.546 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:46.547 15:57:51 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:46.804 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:46.804 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:46.804 
15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:18:47.061 [2024-06-10 15:57:52.329541] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:47.061 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:47.061 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:47.062 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.062 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:18:47.319 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:18:47.319 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:18:47.319 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:18:47.577 [2024-06-10 15:57:52.849330] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:18:47.577 [2024-06-10 15:57:52.849370] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ad8ac0 name Existed_Raid, state offline 00:18:47.577 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:18:47.577 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:18:47.577 15:57:52 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:47.577 15:57:52 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:18:47.836 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:18:47.836 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:18:47.836 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:18:47.836 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:18:47.836 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:47.836 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:18:48.095 BaseBdev2 00:18:48.095 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:18:48.095 15:57:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:18:48.095 15:57:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:48.095 15:57:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:48.095 15:57:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:48.095 15:57:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:48.095 15:57:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:48.354 15:57:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 
00:18:48.612 [ 00:18:48.612 { 00:18:48.612 "name": "BaseBdev2", 00:18:48.612 "aliases": [ 00:18:48.612 "1402f676-db33-4b05-bf07-bfe96bf121d0" 00:18:48.612 ], 00:18:48.612 "product_name": "Malloc disk", 00:18:48.612 "block_size": 512, 00:18:48.612 "num_blocks": 65536, 00:18:48.612 "uuid": "1402f676-db33-4b05-bf07-bfe96bf121d0", 00:18:48.612 "assigned_rate_limits": { 00:18:48.612 "rw_ios_per_sec": 0, 00:18:48.612 "rw_mbytes_per_sec": 0, 00:18:48.612 "r_mbytes_per_sec": 0, 00:18:48.612 "w_mbytes_per_sec": 0 00:18:48.612 }, 00:18:48.612 "claimed": false, 00:18:48.612 "zoned": false, 00:18:48.612 "supported_io_types": { 00:18:48.612 "read": true, 00:18:48.612 "write": true, 00:18:48.612 "unmap": true, 00:18:48.612 "write_zeroes": true, 00:18:48.612 "flush": true, 00:18:48.612 "reset": true, 00:18:48.612 "compare": false, 00:18:48.612 "compare_and_write": false, 00:18:48.612 "abort": true, 00:18:48.612 "nvme_admin": false, 00:18:48.612 "nvme_io": false 00:18:48.612 }, 00:18:48.612 "memory_domains": [ 00:18:48.612 { 00:18:48.612 "dma_device_id": "system", 00:18:48.612 "dma_device_type": 1 00:18:48.612 }, 00:18:48.612 { 00:18:48.612 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:48.612 "dma_device_type": 2 00:18:48.612 } 00:18:48.612 ], 00:18:48.612 "driver_specific": {} 00:18:48.612 } 00:18:48.612 ] 00:18:48.612 15:57:53 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:48.612 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:48.612 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:48.612 15:57:53 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:18:48.870 BaseBdev3 00:18:48.870 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 
00:18:48.870 15:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:18:48.871 15:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:48.871 15:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:48.871 15:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:48.871 15:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:48.871 15:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:49.129 15:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:18:49.387 [ 00:18:49.387 { 00:18:49.387 "name": "BaseBdev3", 00:18:49.387 "aliases": [ 00:18:49.387 "4acce782-eac1-4f10-bf5d-25f8edc86e87" 00:18:49.387 ], 00:18:49.387 "product_name": "Malloc disk", 00:18:49.387 "block_size": 512, 00:18:49.387 "num_blocks": 65536, 00:18:49.387 "uuid": "4acce782-eac1-4f10-bf5d-25f8edc86e87", 00:18:49.387 "assigned_rate_limits": { 00:18:49.387 "rw_ios_per_sec": 0, 00:18:49.387 "rw_mbytes_per_sec": 0, 00:18:49.387 "r_mbytes_per_sec": 0, 00:18:49.387 "w_mbytes_per_sec": 0 00:18:49.387 }, 00:18:49.387 "claimed": false, 00:18:49.387 "zoned": false, 00:18:49.387 "supported_io_types": { 00:18:49.387 "read": true, 00:18:49.387 "write": true, 00:18:49.387 "unmap": true, 00:18:49.387 "write_zeroes": true, 00:18:49.387 "flush": true, 00:18:49.387 "reset": true, 00:18:49.387 "compare": false, 00:18:49.387 "compare_and_write": false, 00:18:49.387 "abort": true, 00:18:49.387 "nvme_admin": false, 00:18:49.387 "nvme_io": false 00:18:49.387 }, 00:18:49.387 
"memory_domains": [ 00:18:49.387 { 00:18:49.387 "dma_device_id": "system", 00:18:49.387 "dma_device_type": 1 00:18:49.387 }, 00:18:49.387 { 00:18:49.387 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:49.387 "dma_device_type": 2 00:18:49.387 } 00:18:49.387 ], 00:18:49.387 "driver_specific": {} 00:18:49.387 } 00:18:49.387 ] 00:18:49.387 15:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:49.387 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:49.387 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:49.387 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:18:49.646 BaseBdev4 00:18:49.646 15:57:54 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:18:49.646 15:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:18:49.646 15:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:49.646 15:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:49.646 15:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:49.646 15:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:49.646 15:57:54 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:49.904 15:57:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev4 -t 2000 00:18:50.162 [ 00:18:50.162 { 00:18:50.162 "name": "BaseBdev4", 00:18:50.162 "aliases": [ 00:18:50.162 "060ab9a3-8402-43a4-818b-d5ccd40a54e3" 00:18:50.162 ], 00:18:50.162 "product_name": "Malloc disk", 00:18:50.162 "block_size": 512, 00:18:50.162 "num_blocks": 65536, 00:18:50.162 "uuid": "060ab9a3-8402-43a4-818b-d5ccd40a54e3", 00:18:50.163 "assigned_rate_limits": { 00:18:50.163 "rw_ios_per_sec": 0, 00:18:50.163 "rw_mbytes_per_sec": 0, 00:18:50.163 "r_mbytes_per_sec": 0, 00:18:50.163 "w_mbytes_per_sec": 0 00:18:50.163 }, 00:18:50.163 "claimed": false, 00:18:50.163 "zoned": false, 00:18:50.163 "supported_io_types": { 00:18:50.163 "read": true, 00:18:50.163 "write": true, 00:18:50.163 "unmap": true, 00:18:50.163 "write_zeroes": true, 00:18:50.163 "flush": true, 00:18:50.163 "reset": true, 00:18:50.163 "compare": false, 00:18:50.163 "compare_and_write": false, 00:18:50.163 "abort": true, 00:18:50.163 "nvme_admin": false, 00:18:50.163 "nvme_io": false 00:18:50.163 }, 00:18:50.163 "memory_domains": [ 00:18:50.163 { 00:18:50.163 "dma_device_id": "system", 00:18:50.163 "dma_device_type": 1 00:18:50.163 }, 00:18:50.163 { 00:18:50.163 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:50.163 "dma_device_type": 2 00:18:50.163 } 00:18:50.163 ], 00:18:50.163 "driver_specific": {} 00:18:50.163 } 00:18:50.163 ] 00:18:50.163 15:57:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:50.163 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:18:50.163 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:18:50.163 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -s -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:18:50.163 [2024-06-10 15:57:55.668528] 
bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:18:50.163 [2024-06-10 15:57:55.668566] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:18:50.163 [2024-06-10 15:57:55.668583] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:50.163 [2024-06-10 15:57:55.670000] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:50.163 [2024-06-10 15:57:55.670047] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:18:50.421 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:50.421 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:50.421 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:50.421 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:50.421 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:50.422 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:50.422 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:50.422 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:50.422 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:50.422 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:50.422 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:50.422 15:57:55 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:50.680 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:50.680 "name": "Existed_Raid", 00:18:50.680 "uuid": "6ea51e85-0874-460b-95da-55f89139b0b4", 00:18:50.680 "strip_size_kb": 64, 00:18:50.680 "state": "configuring", 00:18:50.680 "raid_level": "concat", 00:18:50.680 "superblock": true, 00:18:50.680 "num_base_bdevs": 4, 00:18:50.680 "num_base_bdevs_discovered": 3, 00:18:50.680 "num_base_bdevs_operational": 4, 00:18:50.680 "base_bdevs_list": [ 00:18:50.680 { 00:18:50.680 "name": "BaseBdev1", 00:18:50.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:50.680 "is_configured": false, 00:18:50.680 "data_offset": 0, 00:18:50.680 "data_size": 0 00:18:50.680 }, 00:18:50.680 { 00:18:50.680 "name": "BaseBdev2", 00:18:50.680 "uuid": "1402f676-db33-4b05-bf07-bfe96bf121d0", 00:18:50.680 "is_configured": true, 00:18:50.680 "data_offset": 2048, 00:18:50.680 "data_size": 63488 00:18:50.680 }, 00:18:50.680 { 00:18:50.680 "name": "BaseBdev3", 00:18:50.680 "uuid": "4acce782-eac1-4f10-bf5d-25f8edc86e87", 00:18:50.680 "is_configured": true, 00:18:50.680 "data_offset": 2048, 00:18:50.680 "data_size": 63488 00:18:50.680 }, 00:18:50.680 { 00:18:50.680 "name": "BaseBdev4", 00:18:50.680 "uuid": "060ab9a3-8402-43a4-818b-d5ccd40a54e3", 00:18:50.680 "is_configured": true, 00:18:50.680 "data_offset": 2048, 00:18:50.680 "data_size": 63488 00:18:50.680 } 00:18:50.680 ] 00:18:50.680 }' 00:18:50.680 15:57:55 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:50.680 15:57:55 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:51.247 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev 
BaseBdev2 00:18:51.247 [2024-06-10 15:57:56.723313] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:18:51.247 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:51.247 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:51.247 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:51.247 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:51.247 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:51.247 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:51.247 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:51.247 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:51.247 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:51.247 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:51.247 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:51.247 15:57:56 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:51.506 15:57:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:51.506 "name": "Existed_Raid", 00:18:51.506 "uuid": "6ea51e85-0874-460b-95da-55f89139b0b4", 00:18:51.506 "strip_size_kb": 64, 00:18:51.506 "state": "configuring", 00:18:51.506 "raid_level": "concat", 00:18:51.506 "superblock": true, 
00:18:51.506 "num_base_bdevs": 4, 00:18:51.506 "num_base_bdevs_discovered": 2, 00:18:51.506 "num_base_bdevs_operational": 4, 00:18:51.506 "base_bdevs_list": [ 00:18:51.506 { 00:18:51.506 "name": "BaseBdev1", 00:18:51.506 "uuid": "00000000-0000-0000-0000-000000000000", 00:18:51.506 "is_configured": false, 00:18:51.506 "data_offset": 0, 00:18:51.506 "data_size": 0 00:18:51.506 }, 00:18:51.506 { 00:18:51.506 "name": null, 00:18:51.506 "uuid": "1402f676-db33-4b05-bf07-bfe96bf121d0", 00:18:51.506 "is_configured": false, 00:18:51.506 "data_offset": 2048, 00:18:51.506 "data_size": 63488 00:18:51.506 }, 00:18:51.506 { 00:18:51.506 "name": "BaseBdev3", 00:18:51.506 "uuid": "4acce782-eac1-4f10-bf5d-25f8edc86e87", 00:18:51.506 "is_configured": true, 00:18:51.506 "data_offset": 2048, 00:18:51.506 "data_size": 63488 00:18:51.506 }, 00:18:51.506 { 00:18:51.506 "name": "BaseBdev4", 00:18:51.506 "uuid": "060ab9a3-8402-43a4-818b-d5ccd40a54e3", 00:18:51.506 "is_configured": true, 00:18:51.506 "data_offset": 2048, 00:18:51.506 "data_size": 63488 00:18:51.506 } 00:18:51.506 ] 00:18:51.506 }' 00:18:51.506 15:57:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:51.506 15:57:57 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:52.441 15:57:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:52.441 15:57:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:52.441 15:57:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:18:52.441 15:57:57 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:18:52.700 [2024-06-10 
15:57:58.122354] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:18:52.700 BaseBdev1 00:18:52.700 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:18:52.700 15:57:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:18:52.700 15:57:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:18:52.700 15:57:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:18:52.700 15:57:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:18:52.700 15:57:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:18:52.700 15:57:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:18:52.958 15:57:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:18:53.216 [ 00:18:53.216 { 00:18:53.216 "name": "BaseBdev1", 00:18:53.216 "aliases": [ 00:18:53.216 "dcfded11-411c-421e-859f-15aa5362b23f" 00:18:53.216 ], 00:18:53.216 "product_name": "Malloc disk", 00:18:53.216 "block_size": 512, 00:18:53.216 "num_blocks": 65536, 00:18:53.216 "uuid": "dcfded11-411c-421e-859f-15aa5362b23f", 00:18:53.216 "assigned_rate_limits": { 00:18:53.216 "rw_ios_per_sec": 0, 00:18:53.216 "rw_mbytes_per_sec": 0, 00:18:53.216 "r_mbytes_per_sec": 0, 00:18:53.216 "w_mbytes_per_sec": 0 00:18:53.216 }, 00:18:53.216 "claimed": true, 00:18:53.216 "claim_type": "exclusive_write", 00:18:53.216 "zoned": false, 00:18:53.216 "supported_io_types": { 00:18:53.216 "read": true, 00:18:53.216 "write": true, 00:18:53.216 "unmap": true, 
00:18:53.216 "write_zeroes": true, 00:18:53.216 "flush": true, 00:18:53.216 "reset": true, 00:18:53.216 "compare": false, 00:18:53.216 "compare_and_write": false, 00:18:53.216 "abort": true, 00:18:53.216 "nvme_admin": false, 00:18:53.216 "nvme_io": false 00:18:53.216 }, 00:18:53.216 "memory_domains": [ 00:18:53.216 { 00:18:53.216 "dma_device_id": "system", 00:18:53.216 "dma_device_type": 1 00:18:53.216 }, 00:18:53.216 { 00:18:53.216 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:18:53.216 "dma_device_type": 2 00:18:53.216 } 00:18:53.216 ], 00:18:53.216 "driver_specific": {} 00:18:53.217 } 00:18:53.217 ] 00:18:53.217 15:57:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:18:53.217 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:53.217 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:53.217 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:53.217 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:53.217 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:53.217 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:53.217 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:53.217 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:53.217 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:53.217 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:53.217 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:53.217 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:53.476 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:53.476 "name": "Existed_Raid", 00:18:53.476 "uuid": "6ea51e85-0874-460b-95da-55f89139b0b4", 00:18:53.476 "strip_size_kb": 64, 00:18:53.476 "state": "configuring", 00:18:53.476 "raid_level": "concat", 00:18:53.476 "superblock": true, 00:18:53.476 "num_base_bdevs": 4, 00:18:53.476 "num_base_bdevs_discovered": 3, 00:18:53.476 "num_base_bdevs_operational": 4, 00:18:53.476 "base_bdevs_list": [ 00:18:53.476 { 00:18:53.476 "name": "BaseBdev1", 00:18:53.476 "uuid": "dcfded11-411c-421e-859f-15aa5362b23f", 00:18:53.476 "is_configured": true, 00:18:53.476 "data_offset": 2048, 00:18:53.476 "data_size": 63488 00:18:53.476 }, 00:18:53.476 { 00:18:53.476 "name": null, 00:18:53.476 "uuid": "1402f676-db33-4b05-bf07-bfe96bf121d0", 00:18:53.476 "is_configured": false, 00:18:53.476 "data_offset": 2048, 00:18:53.476 "data_size": 63488 00:18:53.476 }, 00:18:53.476 { 00:18:53.476 "name": "BaseBdev3", 00:18:53.476 "uuid": "4acce782-eac1-4f10-bf5d-25f8edc86e87", 00:18:53.476 "is_configured": true, 00:18:53.476 "data_offset": 2048, 00:18:53.476 "data_size": 63488 00:18:53.476 }, 00:18:53.476 { 00:18:53.476 "name": "BaseBdev4", 00:18:53.476 "uuid": "060ab9a3-8402-43a4-818b-d5ccd40a54e3", 00:18:53.476 "is_configured": true, 00:18:53.476 "data_offset": 2048, 00:18:53.476 "data_size": 63488 00:18:53.476 } 00:18:53.476 ] 00:18:53.476 }' 00:18:53.476 15:57:58 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:53.476 15:57:58 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:54.043 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.043 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:54.302 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:18:54.302 15:57:59 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:18:54.560 [2024-06-10 15:58:00.019458] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:18:54.560 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:54.560 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:54.560 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:54.560 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:54.560 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:54.560 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:54.560 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:54.560 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:54.560 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:54.560 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:54.560 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:54.560 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:54.819 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:54.819 "name": "Existed_Raid", 00:18:54.819 "uuid": "6ea51e85-0874-460b-95da-55f89139b0b4", 00:18:54.819 "strip_size_kb": 64, 00:18:54.819 "state": "configuring", 00:18:54.819 "raid_level": "concat", 00:18:54.819 "superblock": true, 00:18:54.819 "num_base_bdevs": 4, 00:18:54.819 "num_base_bdevs_discovered": 2, 00:18:54.819 "num_base_bdevs_operational": 4, 00:18:54.819 "base_bdevs_list": [ 00:18:54.819 { 00:18:54.819 "name": "BaseBdev1", 00:18:54.819 "uuid": "dcfded11-411c-421e-859f-15aa5362b23f", 00:18:54.819 "is_configured": true, 00:18:54.819 "data_offset": 2048, 00:18:54.819 "data_size": 63488 00:18:54.819 }, 00:18:54.819 { 00:18:54.819 "name": null, 00:18:54.819 "uuid": "1402f676-db33-4b05-bf07-bfe96bf121d0", 00:18:54.819 "is_configured": false, 00:18:54.819 "data_offset": 2048, 00:18:54.819 "data_size": 63488 00:18:54.819 }, 00:18:54.819 { 00:18:54.819 "name": null, 00:18:54.819 "uuid": "4acce782-eac1-4f10-bf5d-25f8edc86e87", 00:18:54.819 "is_configured": false, 00:18:54.819 "data_offset": 2048, 00:18:54.819 "data_size": 63488 00:18:54.819 }, 00:18:54.819 { 00:18:54.819 "name": "BaseBdev4", 00:18:54.819 "uuid": "060ab9a3-8402-43a4-818b-d5ccd40a54e3", 00:18:54.819 "is_configured": true, 00:18:54.819 "data_offset": 2048, 00:18:54.819 "data_size": 63488 00:18:54.819 } 00:18:54.819 ] 00:18:54.819 }' 00:18:54.819 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:54.819 15:58:00 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:55.755 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:55.755 15:58:00 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:55.755 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:18:55.755 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:18:56.014 [2024-06-10 15:58:01.419220] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:18:56.014 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:56.014 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:56.014 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:56.014 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:56.014 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:56.014 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:56.014 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:56.014 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:56.014 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:56.014 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:56.014 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.014 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:56.273 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:56.273 "name": "Existed_Raid", 00:18:56.273 "uuid": "6ea51e85-0874-460b-95da-55f89139b0b4", 00:18:56.273 "strip_size_kb": 64, 00:18:56.273 "state": "configuring", 00:18:56.273 "raid_level": "concat", 00:18:56.273 "superblock": true, 00:18:56.273 "num_base_bdevs": 4, 00:18:56.273 "num_base_bdevs_discovered": 3, 00:18:56.273 "num_base_bdevs_operational": 4, 00:18:56.273 "base_bdevs_list": [ 00:18:56.273 { 00:18:56.273 "name": "BaseBdev1", 00:18:56.273 "uuid": "dcfded11-411c-421e-859f-15aa5362b23f", 00:18:56.273 "is_configured": true, 00:18:56.273 "data_offset": 2048, 00:18:56.273 "data_size": 63488 00:18:56.273 }, 00:18:56.273 { 00:18:56.273 "name": null, 00:18:56.273 "uuid": "1402f676-db33-4b05-bf07-bfe96bf121d0", 00:18:56.273 "is_configured": false, 00:18:56.273 "data_offset": 2048, 00:18:56.273 "data_size": 63488 00:18:56.273 }, 00:18:56.273 { 00:18:56.273 "name": "BaseBdev3", 00:18:56.273 "uuid": "4acce782-eac1-4f10-bf5d-25f8edc86e87", 00:18:56.273 "is_configured": true, 00:18:56.273 "data_offset": 2048, 00:18:56.273 "data_size": 63488 00:18:56.273 }, 00:18:56.273 { 00:18:56.273 "name": "BaseBdev4", 00:18:56.273 "uuid": "060ab9a3-8402-43a4-818b-d5ccd40a54e3", 00:18:56.273 "is_configured": true, 00:18:56.273 "data_offset": 2048, 00:18:56.273 "data_size": 63488 00:18:56.273 } 00:18:56.273 ] 00:18:56.273 }' 00:18:56.273 15:58:01 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:56.273 15:58:01 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:56.840 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:56.840 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:18:57.098 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:18:57.098 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:18:57.356 [2024-06-10 15:58:02.822993] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:18:57.356 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:57.356 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:57.356 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:57.356 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:57.356 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:57.356 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:57.357 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:57.357 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:57.357 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:57.357 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:57.357 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:57.357 15:58:02 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:57.615 15:58:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:57.615 "name": "Existed_Raid", 00:18:57.615 "uuid": "6ea51e85-0874-460b-95da-55f89139b0b4", 00:18:57.615 "strip_size_kb": 64, 00:18:57.615 "state": "configuring", 00:18:57.615 "raid_level": "concat", 00:18:57.615 "superblock": true, 00:18:57.615 "num_base_bdevs": 4, 00:18:57.615 "num_base_bdevs_discovered": 2, 00:18:57.615 "num_base_bdevs_operational": 4, 00:18:57.615 "base_bdevs_list": [ 00:18:57.615 { 00:18:57.615 "name": null, 00:18:57.615 "uuid": "dcfded11-411c-421e-859f-15aa5362b23f", 00:18:57.615 "is_configured": false, 00:18:57.615 "data_offset": 2048, 00:18:57.615 "data_size": 63488 00:18:57.615 }, 00:18:57.615 { 00:18:57.615 "name": null, 00:18:57.615 "uuid": "1402f676-db33-4b05-bf07-bfe96bf121d0", 00:18:57.615 "is_configured": false, 00:18:57.615 "data_offset": 2048, 00:18:57.615 "data_size": 63488 00:18:57.615 }, 00:18:57.615 { 00:18:57.615 "name": "BaseBdev3", 00:18:57.615 "uuid": "4acce782-eac1-4f10-bf5d-25f8edc86e87", 00:18:57.615 "is_configured": true, 00:18:57.615 "data_offset": 2048, 00:18:57.615 "data_size": 63488 00:18:57.615 }, 00:18:57.615 { 00:18:57.615 "name": "BaseBdev4", 00:18:57.615 "uuid": "060ab9a3-8402-43a4-818b-d5ccd40a54e3", 00:18:57.615 "is_configured": true, 00:18:57.615 "data_offset": 2048, 00:18:57.615 "data_size": 63488 00:18:57.615 } 00:18:57.615 ] 00:18:57.615 }' 00:18:57.615 15:58:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:57.615 15:58:03 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:58.553 15:58:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:58.553 15:58:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:18:58.553 15:58:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:18:58.553 15:58:03 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:18:58.812 [2024-06-10 15:58:04.156704] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:18:58.812 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring concat 64 4 00:18:58.812 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:18:58.812 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:18:58.812 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:18:58.812 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:18:58.812 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:18:58.812 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:18:58.812 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:18:58.812 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:18:58.812 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:18:58.812 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:58.812 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:18:59.072 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:18:59.072 "name": "Existed_Raid", 00:18:59.072 "uuid": "6ea51e85-0874-460b-95da-55f89139b0b4", 00:18:59.072 "strip_size_kb": 64, 00:18:59.072 "state": "configuring", 00:18:59.072 "raid_level": "concat", 00:18:59.072 "superblock": true, 00:18:59.072 "num_base_bdevs": 4, 00:18:59.072 "num_base_bdevs_discovered": 3, 00:18:59.072 "num_base_bdevs_operational": 4, 00:18:59.072 "base_bdevs_list": [ 00:18:59.072 { 00:18:59.072 "name": null, 00:18:59.072 "uuid": "dcfded11-411c-421e-859f-15aa5362b23f", 00:18:59.072 "is_configured": false, 00:18:59.072 "data_offset": 2048, 00:18:59.072 "data_size": 63488 00:18:59.072 }, 00:18:59.072 { 00:18:59.072 "name": "BaseBdev2", 00:18:59.072 "uuid": "1402f676-db33-4b05-bf07-bfe96bf121d0", 00:18:59.072 "is_configured": true, 00:18:59.072 "data_offset": 2048, 00:18:59.072 "data_size": 63488 00:18:59.072 }, 00:18:59.072 { 00:18:59.072 "name": "BaseBdev3", 00:18:59.072 "uuid": "4acce782-eac1-4f10-bf5d-25f8edc86e87", 00:18:59.072 "is_configured": true, 00:18:59.072 "data_offset": 2048, 00:18:59.072 "data_size": 63488 00:18:59.072 }, 00:18:59.072 { 00:18:59.072 "name": "BaseBdev4", 00:18:59.072 "uuid": "060ab9a3-8402-43a4-818b-d5ccd40a54e3", 00:18:59.072 "is_configured": true, 00:18:59.072 "data_offset": 2048, 00:18:59.072 "data_size": 63488 00:18:59.072 } 00:18:59.072 ] 00:18:59.072 }' 00:18:59.072 15:58:04 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:18:59.072 15:58:04 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:18:59.640 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:59.640 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:18:59.898 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:18:59.898 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:18:59.898 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:19:00.157 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u dcfded11-411c-421e-859f-15aa5362b23f 00:19:00.415 [2024-06-10 15:58:05.804472] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:19:00.415 [2024-06-10 15:58:05.804626] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c7e3a0 00:19:00.415 [2024-06-10 15:58:05.804638] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:00.415 [2024-06-10 15:58:05.804817] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c7f180 00:19:00.415 [2024-06-10 15:58:05.804934] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c7e3a0 00:19:00.415 [2024-06-10 15:58:05.804942] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x1c7e3a0 00:19:00.415 [2024-06-10 15:58:05.805045] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:00.415 NewBaseBdev 00:19:00.415 15:58:05 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:19:00.415 15:58:05 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:19:00.415 15:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:00.415 15:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:19:00.415 15:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:00.415 15:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:00.415 15:58:05 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:00.675 15:58:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:19:00.994 [ 00:19:00.994 { 00:19:00.994 "name": "NewBaseBdev", 00:19:00.994 "aliases": [ 00:19:00.994 "dcfded11-411c-421e-859f-15aa5362b23f" 00:19:00.994 ], 00:19:00.994 "product_name": "Malloc disk", 00:19:00.994 "block_size": 512, 00:19:00.994 "num_blocks": 65536, 00:19:00.994 "uuid": "dcfded11-411c-421e-859f-15aa5362b23f", 00:19:00.994 "assigned_rate_limits": { 00:19:00.994 "rw_ios_per_sec": 0, 00:19:00.994 "rw_mbytes_per_sec": 0, 00:19:00.994 "r_mbytes_per_sec": 0, 00:19:00.994 "w_mbytes_per_sec": 0 00:19:00.994 }, 00:19:00.994 "claimed": true, 00:19:00.994 "claim_type": "exclusive_write", 00:19:00.994 "zoned": false, 00:19:00.994 "supported_io_types": { 00:19:00.994 "read": true, 00:19:00.994 "write": true, 00:19:00.994 "unmap": true, 00:19:00.994 "write_zeroes": true, 00:19:00.994 "flush": true, 00:19:00.994 "reset": true, 00:19:00.994 "compare": false, 00:19:00.994 "compare_and_write": false, 00:19:00.994 "abort": true, 00:19:00.994 "nvme_admin": false, 00:19:00.994 "nvme_io": false 
00:19:00.994 }, 00:19:00.994 "memory_domains": [ 00:19:00.994 { 00:19:00.994 "dma_device_id": "system", 00:19:00.994 "dma_device_type": 1 00:19:00.994 }, 00:19:00.994 { 00:19:00.994 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:00.994 "dma_device_type": 2 00:19:00.994 } 00:19:00.994 ], 00:19:00.994 "driver_specific": {} 00:19:00.994 } 00:19:00.994 ] 00:19:00.994 15:58:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:19:00.994 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online concat 64 4 00:19:00.994 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:00.994 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:00.994 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:00.994 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:00.994 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:00.994 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:00.994 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:00.994 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:00.994 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:00.994 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:00.994 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:01.253 
15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:01.253 "name": "Existed_Raid", 00:19:01.253 "uuid": "6ea51e85-0874-460b-95da-55f89139b0b4", 00:19:01.253 "strip_size_kb": 64, 00:19:01.253 "state": "online", 00:19:01.253 "raid_level": "concat", 00:19:01.253 "superblock": true, 00:19:01.253 "num_base_bdevs": 4, 00:19:01.253 "num_base_bdevs_discovered": 4, 00:19:01.253 "num_base_bdevs_operational": 4, 00:19:01.253 "base_bdevs_list": [ 00:19:01.253 { 00:19:01.253 "name": "NewBaseBdev", 00:19:01.253 "uuid": "dcfded11-411c-421e-859f-15aa5362b23f", 00:19:01.253 "is_configured": true, 00:19:01.253 "data_offset": 2048, 00:19:01.253 "data_size": 63488 00:19:01.253 }, 00:19:01.253 { 00:19:01.253 "name": "BaseBdev2", 00:19:01.253 "uuid": "1402f676-db33-4b05-bf07-bfe96bf121d0", 00:19:01.253 "is_configured": true, 00:19:01.253 "data_offset": 2048, 00:19:01.253 "data_size": 63488 00:19:01.253 }, 00:19:01.253 { 00:19:01.253 "name": "BaseBdev3", 00:19:01.253 "uuid": "4acce782-eac1-4f10-bf5d-25f8edc86e87", 00:19:01.253 "is_configured": true, 00:19:01.253 "data_offset": 2048, 00:19:01.253 "data_size": 63488 00:19:01.253 }, 00:19:01.253 { 00:19:01.253 "name": "BaseBdev4", 00:19:01.253 "uuid": "060ab9a3-8402-43a4-818b-d5ccd40a54e3", 00:19:01.253 "is_configured": true, 00:19:01.253 "data_offset": 2048, 00:19:01.253 "data_size": 63488 00:19:01.253 } 00:19:01.253 ] 00:19:01.253 }' 00:19:01.253 15:58:06 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:01.253 15:58:06 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:01.820 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:19:01.820 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:01.820 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local 
raid_bdev_info 00:19:01.821 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:01.821 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:01.821 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:19:01.821 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:01.821 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:02.080 [2024-06-10 15:58:07.405068] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:02.080 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:02.080 "name": "Existed_Raid", 00:19:02.080 "aliases": [ 00:19:02.080 "6ea51e85-0874-460b-95da-55f89139b0b4" 00:19:02.080 ], 00:19:02.080 "product_name": "Raid Volume", 00:19:02.080 "block_size": 512, 00:19:02.080 "num_blocks": 253952, 00:19:02.080 "uuid": "6ea51e85-0874-460b-95da-55f89139b0b4", 00:19:02.080 "assigned_rate_limits": { 00:19:02.080 "rw_ios_per_sec": 0, 00:19:02.080 "rw_mbytes_per_sec": 0, 00:19:02.080 "r_mbytes_per_sec": 0, 00:19:02.080 "w_mbytes_per_sec": 0 00:19:02.080 }, 00:19:02.080 "claimed": false, 00:19:02.080 "zoned": false, 00:19:02.080 "supported_io_types": { 00:19:02.080 "read": true, 00:19:02.080 "write": true, 00:19:02.080 "unmap": true, 00:19:02.080 "write_zeroes": true, 00:19:02.080 "flush": true, 00:19:02.080 "reset": true, 00:19:02.080 "compare": false, 00:19:02.080 "compare_and_write": false, 00:19:02.080 "abort": false, 00:19:02.080 "nvme_admin": false, 00:19:02.080 "nvme_io": false 00:19:02.080 }, 00:19:02.080 "memory_domains": [ 00:19:02.080 { 00:19:02.080 "dma_device_id": "system", 00:19:02.080 "dma_device_type": 1 00:19:02.080 }, 00:19:02.080 { 00:19:02.080 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:02.080 "dma_device_type": 2 00:19:02.080 }, 00:19:02.080 { 00:19:02.080 "dma_device_id": "system", 00:19:02.080 "dma_device_type": 1 00:19:02.080 }, 00:19:02.080 { 00:19:02.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:02.080 "dma_device_type": 2 00:19:02.080 }, 00:19:02.080 { 00:19:02.080 "dma_device_id": "system", 00:19:02.080 "dma_device_type": 1 00:19:02.080 }, 00:19:02.080 { 00:19:02.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:02.080 "dma_device_type": 2 00:19:02.080 }, 00:19:02.080 { 00:19:02.080 "dma_device_id": "system", 00:19:02.080 "dma_device_type": 1 00:19:02.080 }, 00:19:02.080 { 00:19:02.080 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:02.080 "dma_device_type": 2 00:19:02.080 } 00:19:02.080 ], 00:19:02.080 "driver_specific": { 00:19:02.080 "raid": { 00:19:02.080 "uuid": "6ea51e85-0874-460b-95da-55f89139b0b4", 00:19:02.080 "strip_size_kb": 64, 00:19:02.080 "state": "online", 00:19:02.080 "raid_level": "concat", 00:19:02.080 "superblock": true, 00:19:02.080 "num_base_bdevs": 4, 00:19:02.080 "num_base_bdevs_discovered": 4, 00:19:02.080 "num_base_bdevs_operational": 4, 00:19:02.080 "base_bdevs_list": [ 00:19:02.080 { 00:19:02.080 "name": "NewBaseBdev", 00:19:02.080 "uuid": "dcfded11-411c-421e-859f-15aa5362b23f", 00:19:02.080 "is_configured": true, 00:19:02.080 "data_offset": 2048, 00:19:02.080 "data_size": 63488 00:19:02.080 }, 00:19:02.080 { 00:19:02.080 "name": "BaseBdev2", 00:19:02.080 "uuid": "1402f676-db33-4b05-bf07-bfe96bf121d0", 00:19:02.080 "is_configured": true, 00:19:02.080 "data_offset": 2048, 00:19:02.080 "data_size": 63488 00:19:02.080 }, 00:19:02.080 { 00:19:02.080 "name": "BaseBdev3", 00:19:02.080 "uuid": "4acce782-eac1-4f10-bf5d-25f8edc86e87", 00:19:02.080 "is_configured": true, 00:19:02.080 "data_offset": 2048, 00:19:02.080 "data_size": 63488 00:19:02.080 }, 00:19:02.080 { 00:19:02.080 "name": "BaseBdev4", 00:19:02.080 "uuid": "060ab9a3-8402-43a4-818b-d5ccd40a54e3", 
00:19:02.080 "is_configured": true, 00:19:02.080 "data_offset": 2048, 00:19:02.080 "data_size": 63488 00:19:02.080 } 00:19:02.080 ] 00:19:02.080 } 00:19:02.080 } 00:19:02.080 }' 00:19:02.080 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:02.080 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:19:02.080 BaseBdev2 00:19:02.080 BaseBdev3 00:19:02.080 BaseBdev4' 00:19:02.080 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:02.080 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:19:02.080 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:02.339 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:02.339 "name": "NewBaseBdev", 00:19:02.339 "aliases": [ 00:19:02.339 "dcfded11-411c-421e-859f-15aa5362b23f" 00:19:02.339 ], 00:19:02.339 "product_name": "Malloc disk", 00:19:02.339 "block_size": 512, 00:19:02.339 "num_blocks": 65536, 00:19:02.339 "uuid": "dcfded11-411c-421e-859f-15aa5362b23f", 00:19:02.339 "assigned_rate_limits": { 00:19:02.339 "rw_ios_per_sec": 0, 00:19:02.339 "rw_mbytes_per_sec": 0, 00:19:02.339 "r_mbytes_per_sec": 0, 00:19:02.339 "w_mbytes_per_sec": 0 00:19:02.339 }, 00:19:02.339 "claimed": true, 00:19:02.339 "claim_type": "exclusive_write", 00:19:02.339 "zoned": false, 00:19:02.339 "supported_io_types": { 00:19:02.339 "read": true, 00:19:02.339 "write": true, 00:19:02.339 "unmap": true, 00:19:02.339 "write_zeroes": true, 00:19:02.339 "flush": true, 00:19:02.339 "reset": true, 00:19:02.339 "compare": false, 00:19:02.339 "compare_and_write": false, 00:19:02.339 "abort": true, 
00:19:02.339 "nvme_admin": false, 00:19:02.339 "nvme_io": false 00:19:02.339 }, 00:19:02.339 "memory_domains": [ 00:19:02.339 { 00:19:02.339 "dma_device_id": "system", 00:19:02.339 "dma_device_type": 1 00:19:02.339 }, 00:19:02.339 { 00:19:02.339 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:02.339 "dma_device_type": 2 00:19:02.339 } 00:19:02.339 ], 00:19:02.339 "driver_specific": {} 00:19:02.339 }' 00:19:02.339 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:02.339 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:02.339 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:02.339 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:02.598 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:02.598 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:02.598 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:02.598 15:58:07 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:02.598 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:02.598 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:02.598 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:02.598 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:02.598 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:02.598 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b 
BaseBdev2 00:19:02.598 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:02.857 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:02.857 "name": "BaseBdev2", 00:19:02.857 "aliases": [ 00:19:02.857 "1402f676-db33-4b05-bf07-bfe96bf121d0" 00:19:02.857 ], 00:19:02.857 "product_name": "Malloc disk", 00:19:02.857 "block_size": 512, 00:19:02.857 "num_blocks": 65536, 00:19:02.857 "uuid": "1402f676-db33-4b05-bf07-bfe96bf121d0", 00:19:02.857 "assigned_rate_limits": { 00:19:02.857 "rw_ios_per_sec": 0, 00:19:02.857 "rw_mbytes_per_sec": 0, 00:19:02.857 "r_mbytes_per_sec": 0, 00:19:02.857 "w_mbytes_per_sec": 0 00:19:02.857 }, 00:19:02.857 "claimed": true, 00:19:02.857 "claim_type": "exclusive_write", 00:19:02.857 "zoned": false, 00:19:02.857 "supported_io_types": { 00:19:02.857 "read": true, 00:19:02.857 "write": true, 00:19:02.857 "unmap": true, 00:19:02.857 "write_zeroes": true, 00:19:02.857 "flush": true, 00:19:02.857 "reset": true, 00:19:02.857 "compare": false, 00:19:02.857 "compare_and_write": false, 00:19:02.857 "abort": true, 00:19:02.857 "nvme_admin": false, 00:19:02.857 "nvme_io": false 00:19:02.857 }, 00:19:02.857 "memory_domains": [ 00:19:02.857 { 00:19:02.857 "dma_device_id": "system", 00:19:02.857 "dma_device_type": 1 00:19:02.857 }, 00:19:02.857 { 00:19:02.857 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:02.857 "dma_device_type": 2 00:19:02.857 } 00:19:02.857 ], 00:19:02.857 "driver_specific": {} 00:19:02.857 }' 00:19:02.857 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:03.117 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:03.117 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:03.117 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:03.117 15:58:08 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:03.117 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:03.117 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:03.117 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:03.117 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:03.117 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:03.376 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:03.376 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:03.376 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:03.376 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:03.376 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:03.635 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:03.635 "name": "BaseBdev3", 00:19:03.635 "aliases": [ 00:19:03.635 "4acce782-eac1-4f10-bf5d-25f8edc86e87" 00:19:03.635 ], 00:19:03.635 "product_name": "Malloc disk", 00:19:03.635 "block_size": 512, 00:19:03.635 "num_blocks": 65536, 00:19:03.635 "uuid": "4acce782-eac1-4f10-bf5d-25f8edc86e87", 00:19:03.635 "assigned_rate_limits": { 00:19:03.635 "rw_ios_per_sec": 0, 00:19:03.635 "rw_mbytes_per_sec": 0, 00:19:03.635 "r_mbytes_per_sec": 0, 00:19:03.635 "w_mbytes_per_sec": 0 00:19:03.635 }, 00:19:03.635 "claimed": true, 00:19:03.635 "claim_type": "exclusive_write", 00:19:03.635 "zoned": false, 00:19:03.635 "supported_io_types": 
{ 00:19:03.635 "read": true, 00:19:03.635 "write": true, 00:19:03.635 "unmap": true, 00:19:03.635 "write_zeroes": true, 00:19:03.635 "flush": true, 00:19:03.635 "reset": true, 00:19:03.635 "compare": false, 00:19:03.635 "compare_and_write": false, 00:19:03.635 "abort": true, 00:19:03.635 "nvme_admin": false, 00:19:03.635 "nvme_io": false 00:19:03.635 }, 00:19:03.635 "memory_domains": [ 00:19:03.635 { 00:19:03.635 "dma_device_id": "system", 00:19:03.635 "dma_device_type": 1 00:19:03.635 }, 00:19:03.635 { 00:19:03.635 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:03.635 "dma_device_type": 2 00:19:03.635 } 00:19:03.635 ], 00:19:03.635 "driver_specific": {} 00:19:03.635 }' 00:19:03.635 15:58:08 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:03.635 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:03.635 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:03.635 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:03.635 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:03.894 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:03.894 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:03.894 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:03.894 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:03.894 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:03.894 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:03.894 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:03.894 15:58:09 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:03.894 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:03.894 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:04.153 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:04.153 "name": "BaseBdev4", 00:19:04.153 "aliases": [ 00:19:04.153 "060ab9a3-8402-43a4-818b-d5ccd40a54e3" 00:19:04.153 ], 00:19:04.153 "product_name": "Malloc disk", 00:19:04.153 "block_size": 512, 00:19:04.153 "num_blocks": 65536, 00:19:04.153 "uuid": "060ab9a3-8402-43a4-818b-d5ccd40a54e3", 00:19:04.153 "assigned_rate_limits": { 00:19:04.153 "rw_ios_per_sec": 0, 00:19:04.153 "rw_mbytes_per_sec": 0, 00:19:04.153 "r_mbytes_per_sec": 0, 00:19:04.153 "w_mbytes_per_sec": 0 00:19:04.153 }, 00:19:04.153 "claimed": true, 00:19:04.153 "claim_type": "exclusive_write", 00:19:04.153 "zoned": false, 00:19:04.153 "supported_io_types": { 00:19:04.153 "read": true, 00:19:04.153 "write": true, 00:19:04.153 "unmap": true, 00:19:04.153 "write_zeroes": true, 00:19:04.153 "flush": true, 00:19:04.153 "reset": true, 00:19:04.153 "compare": false, 00:19:04.153 "compare_and_write": false, 00:19:04.153 "abort": true, 00:19:04.153 "nvme_admin": false, 00:19:04.153 "nvme_io": false 00:19:04.153 }, 00:19:04.153 "memory_domains": [ 00:19:04.153 { 00:19:04.153 "dma_device_id": "system", 00:19:04.153 "dma_device_type": 1 00:19:04.153 }, 00:19:04.153 { 00:19:04.153 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:04.153 "dma_device_type": 2 00:19:04.153 } 00:19:04.153 ], 00:19:04.153 "driver_specific": {} 00:19:04.153 }' 00:19:04.153 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:04.153 15:58:09 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:04.412 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:04.412 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:04.412 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:04.412 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:04.412 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:04.412 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:04.412 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:04.412 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:04.670 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:04.671 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:04.671 15:58:09 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:04.929 [2024-06-10 15:58:10.192240] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:04.929 [2024-06-10 15:58:10.192265] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:04.929 [2024-06-10 15:58:10.192314] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:04.929 [2024-06-10 15:58:10.192376] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:04.929 [2024-06-10 15:58:10.192385] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c7e3a0 name Existed_Raid, state offline 00:19:04.929 
15:58:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2730830 00:19:04.929 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 2730830 ']' 00:19:04.929 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 2730830 00:19:04.929 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:19:04.929 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:19:04.929 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2730830 00:19:04.929 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:19:04.929 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:19:04.929 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2730830' 00:19:04.929 killing process with pid 2730830 00:19:04.929 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 2730830 00:19:04.929 [2024-06-10 15:58:10.249365] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:04.929 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 2730830 00:19:04.930 [2024-06-10 15:58:10.284106] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:05.189 15:58:10 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:19:05.189 00:19:05.189 real 0m33.219s 00:19:05.189 user 1m2.289s 00:19:05.189 sys 0m4.696s 00:19:05.189 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:05.189 15:58:10 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:19:05.189 
************************************ 00:19:05.189 END TEST raid_state_function_test_sb 00:19:05.189 ************************************ 00:19:05.189 15:58:10 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test concat 4 00:19:05.189 15:58:10 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:19:05.189 15:58:10 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:05.189 15:58:10 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:05.189 ************************************ 00:19:05.189 START TEST raid_superblock_test 00:19:05.189 ************************************ 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test concat 4 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=concat 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' concat '!=' raid1 ']' 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@404 -- # strip_size=64 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@405 -- # strip_size_create_arg='-z 64' 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2736946 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2736946 /var/tmp/spdk-raid.sock 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 2736946 ']' 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:05.189 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:19:05.189 15:58:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:05.189 [2024-06-10 15:58:10.609008] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:19:05.189 [2024-06-10 15:58:10.609061] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2736946 ] 00:19:05.448 [2024-06-10 15:58:10.706157] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:05.448 [2024-06-10 15:58:10.800266] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:05.448 [2024-06-10 15:58:10.862339] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:05.448 [2024-06-10 15:58:10.862371] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:06.384 15:58:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:19:06.385 15:58:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:19:06.385 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:19:06.385 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:06.385 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:19:06.385 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:19:06.385 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:19:06.385 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:06.385 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:06.385 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:06.385 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:19:06.385 malloc1 00:19:06.385 15:58:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:06.644 [2024-06-10 15:58:12.060690] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:06.644 [2024-06-10 15:58:12.060733] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:06.644 [2024-06-10 15:58:12.060750] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe8e0f0 00:19:06.644 [2024-06-10 15:58:12.060759] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:06.644 [2024-06-10 15:58:12.062460] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:06.644 [2024-06-10 15:58:12.062488] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:06.644 pt1 00:19:06.644 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:06.644 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:06.644 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:19:06.644 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:19:06.644 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:19:06.644 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:06.644 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:06.644 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:06.644 15:58:12 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:19:06.902 malloc2 00:19:06.902 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:07.160 [2024-06-10 15:58:12.570846] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:07.160 [2024-06-10 15:58:12.570890] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:07.160 [2024-06-10 15:58:12.570904] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe8f400 00:19:07.160 [2024-06-10 15:58:12.570913] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:07.160 [2024-06-10 15:58:12.572467] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:07.160 [2024-06-10 15:58:12.572493] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:07.160 pt2 00:19:07.160 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:07.160 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:07.160 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:19:07.160 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:19:07.160 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:19:07.160 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:07.160 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:07.160 15:58:12 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:07.160 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:19:07.419 malloc3 00:19:07.419 15:58:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:07.678 [2024-06-10 15:58:13.080815] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:07.678 [2024-06-10 15:58:13.080859] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:07.678 [2024-06-10 15:58:13.080879] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x103b200 00:19:07.678 [2024-06-10 15:58:13.080889] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:07.678 [2024-06-10 15:58:13.082523] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:07.678 [2024-06-10 15:58:13.082548] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:07.678 pt3 00:19:07.678 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:07.679 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:07.679 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:19:07.679 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:19:07.679 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:19:07.679 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:19:07.679 15:58:13 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:19:07.679 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:19:07.679 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:19:07.937 malloc4 00:19:07.937 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:08.196 [2024-06-10 15:58:13.594626] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:08.196 [2024-06-10 15:58:13.594668] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:08.196 [2024-06-10 15:58:13.594682] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x103d320 00:19:08.196 [2024-06-10 15:58:13.594691] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:08.196 [2024-06-10 15:58:13.596224] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:08.196 [2024-06-10 15:58:13.596250] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:08.196 pt4 00:19:08.196 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:19:08.196 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:19:08.196 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:19:08.455 [2024-06-10 15:58:13.843306] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:19:08.455 [2024-06-10 15:58:13.844633] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:08.455 [2024-06-10 15:58:13.844689] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:08.455 [2024-06-10 15:58:13.844735] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:08.455 [2024-06-10 15:58:13.844918] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x103f660 00:19:08.455 [2024-06-10 15:58:13.844929] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:08.455 [2024-06-10 15:58:13.845140] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xea4fe0 00:19:08.455 [2024-06-10 15:58:13.845288] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x103f660 00:19:08.455 [2024-06-10 15:58:13.845296] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x103f660 00:19:08.455 [2024-06-10 15:58:13.845391] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:08.455 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:19:08.455 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:08.455 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:08.455 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:08.455 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:08.455 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:08.455 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:08.455 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:19:08.455 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:08.455 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:08.455 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:08.455 15:58:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:08.714 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:08.714 "name": "raid_bdev1", 00:19:08.714 "uuid": "1cf715be-221d-40aa-b20c-fd087fc4424c", 00:19:08.714 "strip_size_kb": 64, 00:19:08.714 "state": "online", 00:19:08.714 "raid_level": "concat", 00:19:08.714 "superblock": true, 00:19:08.714 "num_base_bdevs": 4, 00:19:08.714 "num_base_bdevs_discovered": 4, 00:19:08.714 "num_base_bdevs_operational": 4, 00:19:08.714 "base_bdevs_list": [ 00:19:08.714 { 00:19:08.714 "name": "pt1", 00:19:08.714 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:08.714 "is_configured": true, 00:19:08.714 "data_offset": 2048, 00:19:08.714 "data_size": 63488 00:19:08.714 }, 00:19:08.714 { 00:19:08.714 "name": "pt2", 00:19:08.714 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:08.714 "is_configured": true, 00:19:08.714 "data_offset": 2048, 00:19:08.714 "data_size": 63488 00:19:08.714 }, 00:19:08.714 { 00:19:08.714 "name": "pt3", 00:19:08.714 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:08.714 "is_configured": true, 00:19:08.714 "data_offset": 2048, 00:19:08.714 "data_size": 63488 00:19:08.714 }, 00:19:08.714 { 00:19:08.714 "name": "pt4", 00:19:08.714 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:08.714 "is_configured": true, 00:19:08.714 "data_offset": 2048, 00:19:08.714 "data_size": 63488 00:19:08.714 } 00:19:08.714 ] 00:19:08.714 }' 00:19:08.714 15:58:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:08.714 15:58:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:09.283 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:19:09.283 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:09.283 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:09.283 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:09.283 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:09.283 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:09.283 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:09.283 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:09.542 [2024-06-10 15:58:14.978612] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:09.542 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:09.542 "name": "raid_bdev1", 00:19:09.542 "aliases": [ 00:19:09.542 "1cf715be-221d-40aa-b20c-fd087fc4424c" 00:19:09.542 ], 00:19:09.542 "product_name": "Raid Volume", 00:19:09.542 "block_size": 512, 00:19:09.542 "num_blocks": 253952, 00:19:09.542 "uuid": "1cf715be-221d-40aa-b20c-fd087fc4424c", 00:19:09.542 "assigned_rate_limits": { 00:19:09.542 "rw_ios_per_sec": 0, 00:19:09.542 "rw_mbytes_per_sec": 0, 00:19:09.542 "r_mbytes_per_sec": 0, 00:19:09.542 "w_mbytes_per_sec": 0 00:19:09.542 }, 00:19:09.542 "claimed": false, 00:19:09.542 "zoned": false, 00:19:09.542 "supported_io_types": { 00:19:09.542 "read": true, 00:19:09.542 "write": true, 00:19:09.542 
"unmap": true, 00:19:09.542 "write_zeroes": true, 00:19:09.542 "flush": true, 00:19:09.542 "reset": true, 00:19:09.542 "compare": false, 00:19:09.542 "compare_and_write": false, 00:19:09.542 "abort": false, 00:19:09.542 "nvme_admin": false, 00:19:09.542 "nvme_io": false 00:19:09.542 }, 00:19:09.542 "memory_domains": [ 00:19:09.542 { 00:19:09.542 "dma_device_id": "system", 00:19:09.542 "dma_device_type": 1 00:19:09.542 }, 00:19:09.542 { 00:19:09.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.542 "dma_device_type": 2 00:19:09.542 }, 00:19:09.542 { 00:19:09.542 "dma_device_id": "system", 00:19:09.542 "dma_device_type": 1 00:19:09.542 }, 00:19:09.542 { 00:19:09.542 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.542 "dma_device_type": 2 00:19:09.542 }, 00:19:09.542 { 00:19:09.542 "dma_device_id": "system", 00:19:09.542 "dma_device_type": 1 00:19:09.543 }, 00:19:09.543 { 00:19:09.543 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.543 "dma_device_type": 2 00:19:09.543 }, 00:19:09.543 { 00:19:09.543 "dma_device_id": "system", 00:19:09.543 "dma_device_type": 1 00:19:09.543 }, 00:19:09.543 { 00:19:09.543 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.543 "dma_device_type": 2 00:19:09.543 } 00:19:09.543 ], 00:19:09.543 "driver_specific": { 00:19:09.543 "raid": { 00:19:09.543 "uuid": "1cf715be-221d-40aa-b20c-fd087fc4424c", 00:19:09.543 "strip_size_kb": 64, 00:19:09.543 "state": "online", 00:19:09.543 "raid_level": "concat", 00:19:09.543 "superblock": true, 00:19:09.543 "num_base_bdevs": 4, 00:19:09.543 "num_base_bdevs_discovered": 4, 00:19:09.543 "num_base_bdevs_operational": 4, 00:19:09.543 "base_bdevs_list": [ 00:19:09.543 { 00:19:09.543 "name": "pt1", 00:19:09.543 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:09.543 "is_configured": true, 00:19:09.543 "data_offset": 2048, 00:19:09.543 "data_size": 63488 00:19:09.543 }, 00:19:09.543 { 00:19:09.543 "name": "pt2", 00:19:09.543 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:09.543 
"is_configured": true, 00:19:09.543 "data_offset": 2048, 00:19:09.543 "data_size": 63488 00:19:09.543 }, 00:19:09.543 { 00:19:09.543 "name": "pt3", 00:19:09.543 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:09.543 "is_configured": true, 00:19:09.543 "data_offset": 2048, 00:19:09.543 "data_size": 63488 00:19:09.543 }, 00:19:09.543 { 00:19:09.543 "name": "pt4", 00:19:09.543 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:09.543 "is_configured": true, 00:19:09.543 "data_offset": 2048, 00:19:09.543 "data_size": 63488 00:19:09.543 } 00:19:09.543 ] 00:19:09.543 } 00:19:09.543 } 00:19:09.543 }' 00:19:09.543 15:58:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:09.543 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:09.543 pt2 00:19:09.543 pt3 00:19:09.543 pt4' 00:19:09.543 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:09.543 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:09.543 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:09.802 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:09.802 "name": "pt1", 00:19:09.802 "aliases": [ 00:19:09.802 "00000000-0000-0000-0000-000000000001" 00:19:09.802 ], 00:19:09.802 "product_name": "passthru", 00:19:09.802 "block_size": 512, 00:19:09.802 "num_blocks": 65536, 00:19:09.802 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:09.802 "assigned_rate_limits": { 00:19:09.802 "rw_ios_per_sec": 0, 00:19:09.802 "rw_mbytes_per_sec": 0, 00:19:09.802 "r_mbytes_per_sec": 0, 00:19:09.802 "w_mbytes_per_sec": 0 00:19:09.802 }, 00:19:09.802 "claimed": true, 00:19:09.802 "claim_type": "exclusive_write", 
00:19:09.802 "zoned": false, 00:19:09.802 "supported_io_types": { 00:19:09.802 "read": true, 00:19:09.802 "write": true, 00:19:09.802 "unmap": true, 00:19:09.802 "write_zeroes": true, 00:19:09.802 "flush": true, 00:19:09.802 "reset": true, 00:19:09.802 "compare": false, 00:19:09.802 "compare_and_write": false, 00:19:09.802 "abort": true, 00:19:09.802 "nvme_admin": false, 00:19:09.802 "nvme_io": false 00:19:09.802 }, 00:19:09.802 "memory_domains": [ 00:19:09.802 { 00:19:09.802 "dma_device_id": "system", 00:19:09.802 "dma_device_type": 1 00:19:09.802 }, 00:19:09.802 { 00:19:09.802 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:09.802 "dma_device_type": 2 00:19:09.802 } 00:19:09.802 ], 00:19:09.803 "driver_specific": { 00:19:09.803 "passthru": { 00:19:09.803 "name": "pt1", 00:19:09.803 "base_bdev_name": "malloc1" 00:19:09.803 } 00:19:09.803 } 00:19:09.803 }' 00:19:09.803 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:10.061 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:10.061 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:10.061 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:10.061 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:10.061 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:10.061 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:10.061 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:10.321 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:10.321 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:10.321 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:10.321 15:58:15 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:10.321 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:10.321 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:10.321 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:10.580 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:10.580 "name": "pt2", 00:19:10.580 "aliases": [ 00:19:10.580 "00000000-0000-0000-0000-000000000002" 00:19:10.580 ], 00:19:10.580 "product_name": "passthru", 00:19:10.580 "block_size": 512, 00:19:10.580 "num_blocks": 65536, 00:19:10.580 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:10.580 "assigned_rate_limits": { 00:19:10.580 "rw_ios_per_sec": 0, 00:19:10.580 "rw_mbytes_per_sec": 0, 00:19:10.580 "r_mbytes_per_sec": 0, 00:19:10.580 "w_mbytes_per_sec": 0 00:19:10.580 }, 00:19:10.580 "claimed": true, 00:19:10.580 "claim_type": "exclusive_write", 00:19:10.580 "zoned": false, 00:19:10.580 "supported_io_types": { 00:19:10.580 "read": true, 00:19:10.580 "write": true, 00:19:10.580 "unmap": true, 00:19:10.580 "write_zeroes": true, 00:19:10.580 "flush": true, 00:19:10.580 "reset": true, 00:19:10.580 "compare": false, 00:19:10.580 "compare_and_write": false, 00:19:10.580 "abort": true, 00:19:10.580 "nvme_admin": false, 00:19:10.580 "nvme_io": false 00:19:10.580 }, 00:19:10.580 "memory_domains": [ 00:19:10.580 { 00:19:10.580 "dma_device_id": "system", 00:19:10.580 "dma_device_type": 1 00:19:10.580 }, 00:19:10.580 { 00:19:10.580 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:10.580 "dma_device_type": 2 00:19:10.580 } 00:19:10.580 ], 00:19:10.580 "driver_specific": { 00:19:10.580 "passthru": { 00:19:10.580 "name": "pt2", 00:19:10.580 "base_bdev_name": "malloc2" 00:19:10.580 } 00:19:10.580 } 
00:19:10.580 }' 00:19:10.580 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:10.580 15:58:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:10.580 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:10.580 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:10.580 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:10.839 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:10.839 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:10.839 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:10.839 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:10.839 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:10.839 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:10.839 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:10.839 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:10.839 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:10.839 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:11.098 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:11.098 "name": "pt3", 00:19:11.098 "aliases": [ 00:19:11.098 "00000000-0000-0000-0000-000000000003" 00:19:11.098 ], 00:19:11.098 "product_name": "passthru", 00:19:11.098 "block_size": 512, 00:19:11.098 "num_blocks": 65536, 00:19:11.098 "uuid": "00000000-0000-0000-0000-000000000003", 
00:19:11.098 "assigned_rate_limits": { 00:19:11.098 "rw_ios_per_sec": 0, 00:19:11.098 "rw_mbytes_per_sec": 0, 00:19:11.098 "r_mbytes_per_sec": 0, 00:19:11.098 "w_mbytes_per_sec": 0 00:19:11.098 }, 00:19:11.098 "claimed": true, 00:19:11.098 "claim_type": "exclusive_write", 00:19:11.098 "zoned": false, 00:19:11.098 "supported_io_types": { 00:19:11.098 "read": true, 00:19:11.098 "write": true, 00:19:11.098 "unmap": true, 00:19:11.098 "write_zeroes": true, 00:19:11.098 "flush": true, 00:19:11.098 "reset": true, 00:19:11.098 "compare": false, 00:19:11.098 "compare_and_write": false, 00:19:11.098 "abort": true, 00:19:11.098 "nvme_admin": false, 00:19:11.098 "nvme_io": false 00:19:11.098 }, 00:19:11.098 "memory_domains": [ 00:19:11.098 { 00:19:11.098 "dma_device_id": "system", 00:19:11.098 "dma_device_type": 1 00:19:11.098 }, 00:19:11.098 { 00:19:11.098 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:11.098 "dma_device_type": 2 00:19:11.098 } 00:19:11.098 ], 00:19:11.098 "driver_specific": { 00:19:11.098 "passthru": { 00:19:11.098 "name": "pt3", 00:19:11.098 "base_bdev_name": "malloc3" 00:19:11.098 } 00:19:11.098 } 00:19:11.098 }' 00:19:11.098 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:11.098 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:11.357 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:11.357 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:11.357 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:11.357 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:11.357 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:11.357 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:11.357 15:58:16 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:11.357 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:11.616 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:11.616 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:11.616 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:11.616 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:11.616 15:58:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:11.875 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:11.875 "name": "pt4", 00:19:11.875 "aliases": [ 00:19:11.875 "00000000-0000-0000-0000-000000000004" 00:19:11.875 ], 00:19:11.875 "product_name": "passthru", 00:19:11.875 "block_size": 512, 00:19:11.875 "num_blocks": 65536, 00:19:11.875 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:11.875 "assigned_rate_limits": { 00:19:11.875 "rw_ios_per_sec": 0, 00:19:11.875 "rw_mbytes_per_sec": 0, 00:19:11.875 "r_mbytes_per_sec": 0, 00:19:11.875 "w_mbytes_per_sec": 0 00:19:11.875 }, 00:19:11.875 "claimed": true, 00:19:11.875 "claim_type": "exclusive_write", 00:19:11.875 "zoned": false, 00:19:11.875 "supported_io_types": { 00:19:11.875 "read": true, 00:19:11.875 "write": true, 00:19:11.875 "unmap": true, 00:19:11.875 "write_zeroes": true, 00:19:11.875 "flush": true, 00:19:11.875 "reset": true, 00:19:11.875 "compare": false, 00:19:11.875 "compare_and_write": false, 00:19:11.875 "abort": true, 00:19:11.875 "nvme_admin": false, 00:19:11.875 "nvme_io": false 00:19:11.875 }, 00:19:11.875 "memory_domains": [ 00:19:11.875 { 00:19:11.875 "dma_device_id": "system", 00:19:11.875 "dma_device_type": 1 00:19:11.875 }, 00:19:11.875 { 00:19:11.875 
"dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:11.875 "dma_device_type": 2 00:19:11.875 } 00:19:11.875 ], 00:19:11.875 "driver_specific": { 00:19:11.875 "passthru": { 00:19:11.875 "name": "pt4", 00:19:11.875 "base_bdev_name": "malloc4" 00:19:11.875 } 00:19:11.875 } 00:19:11.875 }' 00:19:11.875 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:11.875 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:11.875 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:11.875 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:11.875 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:11.875 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:11.875 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.134 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:12.134 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:12.134 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.134 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:12.134 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:12.134 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:12.134 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:19:12.393 [2024-06-10 15:58:17.774098] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:12.393 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # 
raid_bdev_uuid=1cf715be-221d-40aa-b20c-fd087fc4424c 00:19:12.393 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 1cf715be-221d-40aa-b20c-fd087fc4424c ']' 00:19:12.393 15:58:17 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:12.652 [2024-06-10 15:58:18.030480] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:12.652 [2024-06-10 15:58:18.030498] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:12.652 [2024-06-10 15:58:18.030546] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:12.652 [2024-06-10 15:58:18.030607] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:12.652 [2024-06-10 15:58:18.030616] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x103f660 name raid_bdev1, state offline 00:19:12.652 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:12.652 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:19:12.911 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:19:12.911 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:19:12.911 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:12.911 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:19:13.170 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:13.170 15:58:18 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:13.429 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:13.429 15:58:18 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:19:13.688 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:19:13.688 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:19:13.948 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:19:13.948 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:19:14.207 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:19:14.207 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:14.207 15:58:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:19:14.207 15:58:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:14.207 15:58:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local 
arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:14.207 15:58:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:19:14.207 15:58:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:14.207 15:58:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:19:14.207 15:58:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:14.207 15:58:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:19:14.207 15:58:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:19:14.207 15:58:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:19:14.207 15:58:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:19:14.466 [2024-06-10 15:58:19.811170] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:19:14.466 [2024-06-10 15:58:19.812590] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:19:14.466 [2024-06-10 15:58:19.812633] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:19:14.466 [2024-06-10 15:58:19.812669] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:19:14.466 [2024-06-10 15:58:19.812713] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 
00:19:14.466 [2024-06-10 15:58:19.812748] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:19:14.466 [2024-06-10 15:58:19.812769] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:19:14.466 [2024-06-10 15:58:19.812788] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:19:14.466 [2024-06-10 15:58:19.812803] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:14.466 [2024-06-10 15:58:19.812810] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x103db70 name raid_bdev1, state configuring 00:19:14.466 request: 00:19:14.466 { 00:19:14.466 "name": "raid_bdev1", 00:19:14.466 "raid_level": "concat", 00:19:14.466 "base_bdevs": [ 00:19:14.466 "malloc1", 00:19:14.466 "malloc2", 00:19:14.466 "malloc3", 00:19:14.466 "malloc4" 00:19:14.466 ], 00:19:14.466 "superblock": false, 00:19:14.466 "strip_size_kb": 64, 00:19:14.466 "method": "bdev_raid_create", 00:19:14.466 "req_id": 1 00:19:14.466 } 00:19:14.466 Got JSON-RPC error response 00:19:14.466 response: 00:19:14.466 { 00:19:14.466 "code": -17, 00:19:14.466 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:19:14.466 } 00:19:14.466 15:58:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:19:14.466 15:58:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:19:14.466 15:58:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:19:14.466 15:58:19 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:19:14.466 15:58:19 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:14.466 15:58:19 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:19:14.725 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:19:14.726 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:19:14.726 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:19:14.985 [2024-06-10 15:58:20.324455] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:19:14.985 [2024-06-10 15:58:20.324492] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:14.985 [2024-06-10 15:58:20.324507] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x103cd20 00:19:14.985 [2024-06-10 15:58:20.324516] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:14.985 [2024-06-10 15:58:20.326170] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:14.985 [2024-06-10 15:58:20.326202] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:19:14.985 [2024-06-10 15:58:20.326262] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:19:14.985 [2024-06-10 15:58:20.326287] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:19:14.985 pt1 00:19:14.985 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:19:14.985 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:14.985 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:14.985 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:14.985 15:58:20 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:14.985 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:14.985 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:14.985 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:14.985 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:14.985 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:14.985 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:14.985 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:15.244 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:15.244 "name": "raid_bdev1", 00:19:15.244 "uuid": "1cf715be-221d-40aa-b20c-fd087fc4424c", 00:19:15.244 "strip_size_kb": 64, 00:19:15.244 "state": "configuring", 00:19:15.244 "raid_level": "concat", 00:19:15.244 "superblock": true, 00:19:15.244 "num_base_bdevs": 4, 00:19:15.244 "num_base_bdevs_discovered": 1, 00:19:15.244 "num_base_bdevs_operational": 4, 00:19:15.244 "base_bdevs_list": [ 00:19:15.244 { 00:19:15.244 "name": "pt1", 00:19:15.244 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:15.244 "is_configured": true, 00:19:15.244 "data_offset": 2048, 00:19:15.244 "data_size": 63488 00:19:15.244 }, 00:19:15.244 { 00:19:15.244 "name": null, 00:19:15.244 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:15.244 "is_configured": false, 00:19:15.244 "data_offset": 2048, 00:19:15.244 "data_size": 63488 00:19:15.244 }, 00:19:15.244 { 00:19:15.244 "name": null, 00:19:15.244 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:15.244 
"is_configured": false, 00:19:15.244 "data_offset": 2048, 00:19:15.244 "data_size": 63488 00:19:15.244 }, 00:19:15.244 { 00:19:15.244 "name": null, 00:19:15.244 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:15.244 "is_configured": false, 00:19:15.244 "data_offset": 2048, 00:19:15.244 "data_size": 63488 00:19:15.244 } 00:19:15.244 ] 00:19:15.244 }' 00:19:15.244 15:58:20 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:15.244 15:58:20 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:15.812 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:19:15.812 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:16.071 [2024-06-10 15:58:21.447485] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:16.071 [2024-06-10 15:58:21.447531] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:16.071 [2024-06-10 15:58:21.447548] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x103c3c0 00:19:16.071 [2024-06-10 15:58:21.447558] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:16.071 [2024-06-10 15:58:21.447898] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:16.071 [2024-06-10 15:58:21.447912] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:16.071 [2024-06-10 15:58:21.447983] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:16.071 [2024-06-10 15:58:21.448001] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:16.071 pt2 00:19:16.071 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:19:16.330 [2024-06-10 15:58:21.704184] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:19:16.330 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring concat 64 4 00:19:16.330 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:16.330 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:16.330 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:16.330 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:16.330 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:16.330 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:16.330 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:16.330 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:16.330 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:16.330 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:16.330 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:16.589 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:16.589 "name": "raid_bdev1", 00:19:16.589 "uuid": "1cf715be-221d-40aa-b20c-fd087fc4424c", 00:19:16.589 "strip_size_kb": 64, 00:19:16.589 "state": "configuring", 00:19:16.589 "raid_level": "concat", 00:19:16.589 "superblock": true, 
00:19:16.589 "num_base_bdevs": 4, 00:19:16.589 "num_base_bdevs_discovered": 1, 00:19:16.589 "num_base_bdevs_operational": 4, 00:19:16.589 "base_bdevs_list": [ 00:19:16.589 { 00:19:16.589 "name": "pt1", 00:19:16.589 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:16.589 "is_configured": true, 00:19:16.589 "data_offset": 2048, 00:19:16.589 "data_size": 63488 00:19:16.589 }, 00:19:16.589 { 00:19:16.589 "name": null, 00:19:16.589 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:16.589 "is_configured": false, 00:19:16.589 "data_offset": 2048, 00:19:16.589 "data_size": 63488 00:19:16.589 }, 00:19:16.589 { 00:19:16.589 "name": null, 00:19:16.589 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:16.589 "is_configured": false, 00:19:16.589 "data_offset": 2048, 00:19:16.589 "data_size": 63488 00:19:16.589 }, 00:19:16.589 { 00:19:16.589 "name": null, 00:19:16.589 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:16.589 "is_configured": false, 00:19:16.589 "data_offset": 2048, 00:19:16.589 "data_size": 63488 00:19:16.589 } 00:19:16.589 ] 00:19:16.589 }' 00:19:16.589 15:58:21 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:16.589 15:58:21 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:17.218 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:19:17.218 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:17.218 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:19:17.478 [2024-06-10 15:58:22.831196] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:19:17.478 [2024-06-10 15:58:22.831241] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:17.478 [2024-06-10 
15:58:22.831256] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10383f0 00:19:17.478 [2024-06-10 15:58:22.831265] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:17.478 [2024-06-10 15:58:22.831601] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:17.478 [2024-06-10 15:58:22.831616] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:19:17.478 [2024-06-10 15:58:22.831677] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:19:17.478 [2024-06-10 15:58:22.831696] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:19:17.478 pt2 00:19:17.478 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:17.478 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:17.478 15:58:22 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:19:17.737 [2024-06-10 15:58:23.087888] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:19:17.737 [2024-06-10 15:58:23.087918] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:17.737 [2024-06-10 15:58:23.087931] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1038660 00:19:17.737 [2024-06-10 15:58:23.087940] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:17.737 [2024-06-10 15:58:23.088244] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:17.737 [2024-06-10 15:58:23.088258] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:19:17.737 [2024-06-10 15:58:23.088308] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid 
superblock found on bdev pt3 00:19:17.737 [2024-06-10 15:58:23.088325] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:19:17.737 pt3 00:19:17.737 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:17.737 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:17.737 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:19:17.997 [2024-06-10 15:58:23.332550] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:19:17.997 [2024-06-10 15:58:23.332579] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:17.997 [2024-06-10 15:58:23.332592] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x103e660 00:19:17.997 [2024-06-10 15:58:23.332601] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:17.997 [2024-06-10 15:58:23.332893] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:17.997 [2024-06-10 15:58:23.332907] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:19:17.997 [2024-06-10 15:58:23.332964] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:19:17.997 [2024-06-10 15:58:23.332981] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:19:17.997 [2024-06-10 15:58:23.333103] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1038b50 00:19:17.997 [2024-06-10 15:58:23.333111] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:17.997 [2024-06-10 15:58:23.333284] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xea4fe0 00:19:17.997 [2024-06-10 15:58:23.333415] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1038b50 00:19:17.997 [2024-06-10 15:58:23.333423] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1038b50 00:19:17.997 [2024-06-10 15:58:23.333518] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:17.997 pt4 00:19:17.997 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:19:17.997 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:19:17.997 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:19:17.997 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:17.997 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:17.997 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:17.997 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:17.997 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:17.997 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:17.997 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:17.997 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:17.997 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:17.997 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:17.997 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 
00:19:18.256 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:18.256 "name": "raid_bdev1", 00:19:18.256 "uuid": "1cf715be-221d-40aa-b20c-fd087fc4424c", 00:19:18.256 "strip_size_kb": 64, 00:19:18.256 "state": "online", 00:19:18.256 "raid_level": "concat", 00:19:18.256 "superblock": true, 00:19:18.256 "num_base_bdevs": 4, 00:19:18.256 "num_base_bdevs_discovered": 4, 00:19:18.256 "num_base_bdevs_operational": 4, 00:19:18.256 "base_bdevs_list": [ 00:19:18.256 { 00:19:18.256 "name": "pt1", 00:19:18.256 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:18.256 "is_configured": true, 00:19:18.256 "data_offset": 2048, 00:19:18.256 "data_size": 63488 00:19:18.256 }, 00:19:18.256 { 00:19:18.256 "name": "pt2", 00:19:18.256 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:18.256 "is_configured": true, 00:19:18.256 "data_offset": 2048, 00:19:18.256 "data_size": 63488 00:19:18.256 }, 00:19:18.256 { 00:19:18.256 "name": "pt3", 00:19:18.256 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:18.256 "is_configured": true, 00:19:18.256 "data_offset": 2048, 00:19:18.256 "data_size": 63488 00:19:18.256 }, 00:19:18.256 { 00:19:18.256 "name": "pt4", 00:19:18.256 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:18.256 "is_configured": true, 00:19:18.256 "data_offset": 2048, 00:19:18.256 "data_size": 63488 00:19:18.256 } 00:19:18.256 ] 00:19:18.256 }' 00:19:18.256 15:58:23 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:18.256 15:58:23 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:18.825 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:19:18.825 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:19:18.825 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:18.825 15:58:24 bdev_raid.raid_superblock_test 
-- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:18.825 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:18.825 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:18.825 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:18.825 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:19.084 [2024-06-10 15:58:24.395686] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:19.084 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:19.084 "name": "raid_bdev1", 00:19:19.084 "aliases": [ 00:19:19.084 "1cf715be-221d-40aa-b20c-fd087fc4424c" 00:19:19.084 ], 00:19:19.084 "product_name": "Raid Volume", 00:19:19.084 "block_size": 512, 00:19:19.084 "num_blocks": 253952, 00:19:19.084 "uuid": "1cf715be-221d-40aa-b20c-fd087fc4424c", 00:19:19.084 "assigned_rate_limits": { 00:19:19.084 "rw_ios_per_sec": 0, 00:19:19.084 "rw_mbytes_per_sec": 0, 00:19:19.084 "r_mbytes_per_sec": 0, 00:19:19.084 "w_mbytes_per_sec": 0 00:19:19.084 }, 00:19:19.084 "claimed": false, 00:19:19.084 "zoned": false, 00:19:19.084 "supported_io_types": { 00:19:19.084 "read": true, 00:19:19.084 "write": true, 00:19:19.084 "unmap": true, 00:19:19.084 "write_zeroes": true, 00:19:19.084 "flush": true, 00:19:19.084 "reset": true, 00:19:19.084 "compare": false, 00:19:19.084 "compare_and_write": false, 00:19:19.084 "abort": false, 00:19:19.084 "nvme_admin": false, 00:19:19.084 "nvme_io": false 00:19:19.084 }, 00:19:19.084 "memory_domains": [ 00:19:19.084 { 00:19:19.084 "dma_device_id": "system", 00:19:19.084 "dma_device_type": 1 00:19:19.084 }, 00:19:19.084 { 00:19:19.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:19.084 "dma_device_type": 2 00:19:19.084 }, 00:19:19.084 { 
00:19:19.084 "dma_device_id": "system", 00:19:19.084 "dma_device_type": 1 00:19:19.084 }, 00:19:19.084 { 00:19:19.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:19.084 "dma_device_type": 2 00:19:19.084 }, 00:19:19.084 { 00:19:19.084 "dma_device_id": "system", 00:19:19.084 "dma_device_type": 1 00:19:19.084 }, 00:19:19.084 { 00:19:19.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:19.084 "dma_device_type": 2 00:19:19.084 }, 00:19:19.084 { 00:19:19.084 "dma_device_id": "system", 00:19:19.084 "dma_device_type": 1 00:19:19.084 }, 00:19:19.084 { 00:19:19.084 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:19.084 "dma_device_type": 2 00:19:19.084 } 00:19:19.084 ], 00:19:19.084 "driver_specific": { 00:19:19.084 "raid": { 00:19:19.084 "uuid": "1cf715be-221d-40aa-b20c-fd087fc4424c", 00:19:19.084 "strip_size_kb": 64, 00:19:19.084 "state": "online", 00:19:19.084 "raid_level": "concat", 00:19:19.084 "superblock": true, 00:19:19.084 "num_base_bdevs": 4, 00:19:19.084 "num_base_bdevs_discovered": 4, 00:19:19.084 "num_base_bdevs_operational": 4, 00:19:19.084 "base_bdevs_list": [ 00:19:19.084 { 00:19:19.084 "name": "pt1", 00:19:19.084 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:19.084 "is_configured": true, 00:19:19.084 "data_offset": 2048, 00:19:19.084 "data_size": 63488 00:19:19.084 }, 00:19:19.084 { 00:19:19.084 "name": "pt2", 00:19:19.084 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:19.084 "is_configured": true, 00:19:19.084 "data_offset": 2048, 00:19:19.084 "data_size": 63488 00:19:19.084 }, 00:19:19.084 { 00:19:19.084 "name": "pt3", 00:19:19.084 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:19.084 "is_configured": true, 00:19:19.084 "data_offset": 2048, 00:19:19.084 "data_size": 63488 00:19:19.084 }, 00:19:19.084 { 00:19:19.084 "name": "pt4", 00:19:19.084 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:19.084 "is_configured": true, 00:19:19.084 "data_offset": 2048, 00:19:19.084 "data_size": 63488 00:19:19.084 } 00:19:19.084 ] 
00:19:19.084 } 00:19:19.084 } 00:19:19.084 }' 00:19:19.084 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:19.084 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:19:19.084 pt2 00:19:19.084 pt3 00:19:19.084 pt4' 00:19:19.084 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:19.084 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:19:19.084 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:19.343 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:19.343 "name": "pt1", 00:19:19.343 "aliases": [ 00:19:19.343 "00000000-0000-0000-0000-000000000001" 00:19:19.343 ], 00:19:19.343 "product_name": "passthru", 00:19:19.343 "block_size": 512, 00:19:19.343 "num_blocks": 65536, 00:19:19.343 "uuid": "00000000-0000-0000-0000-000000000001", 00:19:19.343 "assigned_rate_limits": { 00:19:19.343 "rw_ios_per_sec": 0, 00:19:19.343 "rw_mbytes_per_sec": 0, 00:19:19.343 "r_mbytes_per_sec": 0, 00:19:19.343 "w_mbytes_per_sec": 0 00:19:19.343 }, 00:19:19.343 "claimed": true, 00:19:19.343 "claim_type": "exclusive_write", 00:19:19.343 "zoned": false, 00:19:19.343 "supported_io_types": { 00:19:19.343 "read": true, 00:19:19.343 "write": true, 00:19:19.343 "unmap": true, 00:19:19.343 "write_zeroes": true, 00:19:19.343 "flush": true, 00:19:19.343 "reset": true, 00:19:19.343 "compare": false, 00:19:19.343 "compare_and_write": false, 00:19:19.343 "abort": true, 00:19:19.343 "nvme_admin": false, 00:19:19.343 "nvme_io": false 00:19:19.343 }, 00:19:19.343 "memory_domains": [ 00:19:19.343 { 00:19:19.343 "dma_device_id": "system", 00:19:19.343 "dma_device_type": 1 00:19:19.343 }, 
00:19:19.343 { 00:19:19.343 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:19.343 "dma_device_type": 2 00:19:19.343 } 00:19:19.343 ], 00:19:19.343 "driver_specific": { 00:19:19.343 "passthru": { 00:19:19.343 "name": "pt1", 00:19:19.343 "base_bdev_name": "malloc1" 00:19:19.343 } 00:19:19.343 } 00:19:19.343 }' 00:19:19.343 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:19.343 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:19.343 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:19.343 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:19.602 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:19.602 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:19.602 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:19.602 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:19.602 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:19.602 15:58:24 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:19.602 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:19.602 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:19.602 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:19.602 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:19:19.602 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:19.861 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 
00:19:19.861 "name": "pt2", 00:19:19.861 "aliases": [ 00:19:19.861 "00000000-0000-0000-0000-000000000002" 00:19:19.861 ], 00:19:19.861 "product_name": "passthru", 00:19:19.861 "block_size": 512, 00:19:19.861 "num_blocks": 65536, 00:19:19.861 "uuid": "00000000-0000-0000-0000-000000000002", 00:19:19.861 "assigned_rate_limits": { 00:19:19.861 "rw_ios_per_sec": 0, 00:19:19.861 "rw_mbytes_per_sec": 0, 00:19:19.861 "r_mbytes_per_sec": 0, 00:19:19.861 "w_mbytes_per_sec": 0 00:19:19.861 }, 00:19:19.861 "claimed": true, 00:19:19.861 "claim_type": "exclusive_write", 00:19:19.861 "zoned": false, 00:19:19.861 "supported_io_types": { 00:19:19.861 "read": true, 00:19:19.861 "write": true, 00:19:19.861 "unmap": true, 00:19:19.861 "write_zeroes": true, 00:19:19.861 "flush": true, 00:19:19.861 "reset": true, 00:19:19.861 "compare": false, 00:19:19.861 "compare_and_write": false, 00:19:19.861 "abort": true, 00:19:19.861 "nvme_admin": false, 00:19:19.861 "nvme_io": false 00:19:19.861 }, 00:19:19.861 "memory_domains": [ 00:19:19.861 { 00:19:19.861 "dma_device_id": "system", 00:19:19.861 "dma_device_type": 1 00:19:19.861 }, 00:19:19.861 { 00:19:19.861 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:19.861 "dma_device_type": 2 00:19:19.861 } 00:19:19.861 ], 00:19:19.861 "driver_specific": { 00:19:19.861 "passthru": { 00:19:19.861 "name": "pt2", 00:19:19.861 "base_bdev_name": "malloc2" 00:19:19.861 } 00:19:19.861 } 00:19:19.861 }' 00:19:19.861 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:19.861 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:19.861 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:19.861 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:20.120 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:20.120 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- 
# [[ null == null ]] 00:19:20.120 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:20.120 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:20.120 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:20.120 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:20.120 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:20.379 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:20.379 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:20.379 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:19:20.379 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:20.638 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:20.638 "name": "pt3", 00:19:20.638 "aliases": [ 00:19:20.638 "00000000-0000-0000-0000-000000000003" 00:19:20.638 ], 00:19:20.638 "product_name": "passthru", 00:19:20.638 "block_size": 512, 00:19:20.638 "num_blocks": 65536, 00:19:20.638 "uuid": "00000000-0000-0000-0000-000000000003", 00:19:20.638 "assigned_rate_limits": { 00:19:20.638 "rw_ios_per_sec": 0, 00:19:20.638 "rw_mbytes_per_sec": 0, 00:19:20.638 "r_mbytes_per_sec": 0, 00:19:20.638 "w_mbytes_per_sec": 0 00:19:20.638 }, 00:19:20.638 "claimed": true, 00:19:20.638 "claim_type": "exclusive_write", 00:19:20.638 "zoned": false, 00:19:20.638 "supported_io_types": { 00:19:20.638 "read": true, 00:19:20.638 "write": true, 00:19:20.638 "unmap": true, 00:19:20.638 "write_zeroes": true, 00:19:20.638 "flush": true, 00:19:20.638 "reset": true, 00:19:20.638 "compare": false, 00:19:20.638 "compare_and_write": false, 
00:19:20.638 "abort": true, 00:19:20.638 "nvme_admin": false, 00:19:20.638 "nvme_io": false 00:19:20.638 }, 00:19:20.638 "memory_domains": [ 00:19:20.638 { 00:19:20.638 "dma_device_id": "system", 00:19:20.638 "dma_device_type": 1 00:19:20.638 }, 00:19:20.638 { 00:19:20.638 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:20.638 "dma_device_type": 2 00:19:20.638 } 00:19:20.638 ], 00:19:20.638 "driver_specific": { 00:19:20.638 "passthru": { 00:19:20.638 "name": "pt3", 00:19:20.638 "base_bdev_name": "malloc3" 00:19:20.638 } 00:19:20.638 } 00:19:20.638 }' 00:19:20.638 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:20.638 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:20.638 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:20.638 15:58:25 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:20.638 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:20.638 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:20.638 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:20.638 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:20.897 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:20.897 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:20.897 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:20.897 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:20.897 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:20.897 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:19:20.897 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:21.156 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:21.156 "name": "pt4", 00:19:21.156 "aliases": [ 00:19:21.156 "00000000-0000-0000-0000-000000000004" 00:19:21.156 ], 00:19:21.156 "product_name": "passthru", 00:19:21.156 "block_size": 512, 00:19:21.156 "num_blocks": 65536, 00:19:21.156 "uuid": "00000000-0000-0000-0000-000000000004", 00:19:21.156 "assigned_rate_limits": { 00:19:21.156 "rw_ios_per_sec": 0, 00:19:21.156 "rw_mbytes_per_sec": 0, 00:19:21.156 "r_mbytes_per_sec": 0, 00:19:21.156 "w_mbytes_per_sec": 0 00:19:21.156 }, 00:19:21.156 "claimed": true, 00:19:21.156 "claim_type": "exclusive_write", 00:19:21.156 "zoned": false, 00:19:21.156 "supported_io_types": { 00:19:21.156 "read": true, 00:19:21.156 "write": true, 00:19:21.156 "unmap": true, 00:19:21.156 "write_zeroes": true, 00:19:21.156 "flush": true, 00:19:21.156 "reset": true, 00:19:21.156 "compare": false, 00:19:21.156 "compare_and_write": false, 00:19:21.156 "abort": true, 00:19:21.156 "nvme_admin": false, 00:19:21.156 "nvme_io": false 00:19:21.156 }, 00:19:21.156 "memory_domains": [ 00:19:21.156 { 00:19:21.156 "dma_device_id": "system", 00:19:21.156 "dma_device_type": 1 00:19:21.156 }, 00:19:21.156 { 00:19:21.156 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:21.156 "dma_device_type": 2 00:19:21.156 } 00:19:21.156 ], 00:19:21.156 "driver_specific": { 00:19:21.156 "passthru": { 00:19:21.156 "name": "pt4", 00:19:21.156 "base_bdev_name": "malloc4" 00:19:21.156 } 00:19:21.156 } 00:19:21.156 }' 00:19:21.156 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:21.156 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:21.156 15:58:26 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:21.156 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:21.156 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:21.416 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:21.416 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:21.416 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:21.416 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:21.416 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:21.416 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:21.416 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:21.416 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:19:21.416 15:58:26 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:19:21.675 [2024-06-10 15:58:27.127027] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:21.675 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 1cf715be-221d-40aa-b20c-fd087fc4424c '!=' 1cf715be-221d-40aa-b20c-fd087fc4424c ']' 00:19:21.675 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy concat 00:19:21.675 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:21.675 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:21.675 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2736946 00:19:21.675 15:58:27 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@949 -- # '[' -z 2736946 ']' 00:19:21.675 15:58:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 2736946 00:19:21.675 15:58:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # uname 00:19:21.675 15:58:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:19:21.675 15:58:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2736946 00:19:21.934 15:58:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:19:21.934 15:58:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:19:21.934 15:58:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2736946' 00:19:21.934 killing process with pid 2736946 00:19:21.934 15:58:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 2736946 00:19:21.934 [2024-06-10 15:58:27.192259] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:21.934 [2024-06-10 15:58:27.192319] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:21.934 [2024-06-10 15:58:27.192387] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:21.934 [2024-06-10 15:58:27.192397] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1038b50 name raid_bdev1, state offline 00:19:21.934 15:58:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 2736946 00:19:21.934 [2024-06-10 15:58:27.226470] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:21.934 15:58:27 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:19:21.934 00:19:21.934 real 0m16.872s 00:19:21.934 user 0m31.192s 00:19:21.934 sys 0m2.340s 00:19:21.934 15:58:27 bdev_raid.raid_superblock_test -- 
common/autotest_common.sh@1125 -- # xtrace_disable 00:19:21.934 15:58:27 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:19:21.934 ************************************ 00:19:21.934 END TEST raid_superblock_test 00:19:21.934 ************************************ 00:19:22.193 15:58:27 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test concat 4 read 00:19:22.193 15:58:27 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:19:22.193 15:58:27 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:22.193 15:58:27 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:22.193 ************************************ 00:19:22.193 START TEST raid_read_error_test 00:19:22.193 ************************************ 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 4 read 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:22.193 15:58:27 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.aTEVj3pkRI 00:19:22.193 
15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2740002 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2740002 /var/tmp/spdk-raid.sock 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 2740002 ']' 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:22.193 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:19:22.193 15:58:27 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:22.193 [2024-06-10 15:58:27.562347] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:19:22.193 [2024-06-10 15:58:27.562400] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2740002 ] 00:19:22.193 [2024-06-10 15:58:27.650486] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:22.453 [2024-06-10 15:58:27.743996] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:22.453 [2024-06-10 15:58:27.801362] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:22.453 [2024-06-10 15:58:27.801394] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:23.020 15:58:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:19:23.020 15:58:28 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:19:23.020 15:58:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:23.020 15:58:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:23.278 BaseBdev1_malloc 00:19:23.278 15:58:28 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:23.536 true 00:19:23.536 15:58:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:23.794 [2024-06-10 15:58:29.259581] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:23.795 [2024-06-10 15:58:29.259626] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:23.795 
[2024-06-10 15:58:29.259644] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28f9150 00:19:23.795 [2024-06-10 15:58:29.259654] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:23.795 [2024-06-10 15:58:29.261456] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:23.795 [2024-06-10 15:58:29.261484] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:23.795 BaseBdev1 00:19:23.795 15:58:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:23.795 15:58:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:24.053 BaseBdev2_malloc 00:19:24.053 15:58:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:24.312 true 00:19:24.312 15:58:29 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:24.572 [2024-06-10 15:58:30.030083] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:24.572 [2024-06-10 15:58:30.030128] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:24.572 [2024-06-10 15:58:30.030150] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28fdb50 00:19:24.572 [2024-06-10 15:58:30.030161] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:24.572 [2024-06-10 15:58:30.031815] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:24.572 [2024-06-10 15:58:30.031844] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:24.572 BaseBdev2 00:19:24.572 15:58:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:24.572 15:58:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:24.831 BaseBdev3_malloc 00:19:24.831 15:58:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:25.090 true 00:19:25.090 15:58:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:25.348 [2024-06-10 15:58:30.800610] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:25.348 [2024-06-10 15:58:30.800648] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:25.348 [2024-06-10 15:58:30.800666] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28fe780 00:19:25.348 [2024-06-10 15:58:30.800675] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:25.348 [2024-06-10 15:58:30.802202] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:25.348 [2024-06-10 15:58:30.802228] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:25.348 BaseBdev3 00:19:25.348 15:58:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:25.348 15:58:30 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b 
BaseBdev4_malloc 00:19:25.606 BaseBdev4_malloc 00:19:25.606 15:58:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:25.864 true 00:19:25.864 15:58:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:26.122 [2024-06-10 15:58:31.575137] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:26.122 [2024-06-10 15:58:31.575180] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:26.122 [2024-06-10 15:58:31.575200] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x28f7ee0 00:19:26.122 [2024-06-10 15:58:31.575210] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:26.122 [2024-06-10 15:58:31.576809] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:26.122 [2024-06-10 15:58:31.576846] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:26.122 BaseBdev4 00:19:26.122 15:58:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:26.380 [2024-06-10 15:58:31.827838] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:26.381 [2024-06-10 15:58:31.829192] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:26.381 [2024-06-10 15:58:31.829262] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:26.381 [2024-06-10 15:58:31.829327] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev4 is claimed 00:19:26.381 [2024-06-10 15:58:31.829563] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x29015f0 00:19:26.381 [2024-06-10 15:58:31.829574] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:26.381 [2024-06-10 15:58:31.829765] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2748510 00:19:26.381 [2024-06-10 15:58:31.829925] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x29015f0 00:19:26.381 [2024-06-10 15:58:31.829934] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x29015f0 00:19:26.381 [2024-06-10 15:58:31.830046] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:26.381 15:58:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:19:26.381 15:58:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:26.381 15:58:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:26.381 15:58:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:26.381 15:58:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:26.381 15:58:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:26.381 15:58:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:26.381 15:58:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:26.381 15:58:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:26.381 15:58:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:26.381 15:58:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:26.381 15:58:31 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:26.639 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:26.639 "name": "raid_bdev1", 00:19:26.639 "uuid": "44c46837-759e-4245-af1b-3b706dd94e6f", 00:19:26.639 "strip_size_kb": 64, 00:19:26.639 "state": "online", 00:19:26.639 "raid_level": "concat", 00:19:26.639 "superblock": true, 00:19:26.639 "num_base_bdevs": 4, 00:19:26.639 "num_base_bdevs_discovered": 4, 00:19:26.639 "num_base_bdevs_operational": 4, 00:19:26.639 "base_bdevs_list": [ 00:19:26.639 { 00:19:26.639 "name": "BaseBdev1", 00:19:26.639 "uuid": "5939077b-850e-5b82-862b-181c8c5ba8e9", 00:19:26.639 "is_configured": true, 00:19:26.639 "data_offset": 2048, 00:19:26.639 "data_size": 63488 00:19:26.639 }, 00:19:26.639 { 00:19:26.639 "name": "BaseBdev2", 00:19:26.639 "uuid": "77d819c8-e35e-58f7-9e7f-be4bcb8b53f8", 00:19:26.639 "is_configured": true, 00:19:26.639 "data_offset": 2048, 00:19:26.639 "data_size": 63488 00:19:26.639 }, 00:19:26.639 { 00:19:26.639 "name": "BaseBdev3", 00:19:26.639 "uuid": "82e1c451-806c-5289-911a-f16896d14efb", 00:19:26.639 "is_configured": true, 00:19:26.639 "data_offset": 2048, 00:19:26.639 "data_size": 63488 00:19:26.639 }, 00:19:26.639 { 00:19:26.639 "name": "BaseBdev4", 00:19:26.639 "uuid": "eb92986b-e7f9-544f-a989-7761d6545060", 00:19:26.639 "is_configured": true, 00:19:26.639 "data_offset": 2048, 00:19:26.639 "data_size": 63488 00:19:26.639 } 00:19:26.639 ] 00:19:26.639 }' 00:19:26.639 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:26.639 15:58:32 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:27.575 15:58:32 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:27.575 15:58:32 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:27.575 [2024-06-10 15:58:32.838775] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28fa4a0 00:19:28.510 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:19:28.510 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:28.510 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:19:28.510 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:28.510 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:19:28.510 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:28.510 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:28.510 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:28.510 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:28.510 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:28.510 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:28.510 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:28.510 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:28.510 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:28.510 15:58:33 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:28.510 15:58:33 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:28.769 15:58:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:28.769 "name": "raid_bdev1", 00:19:28.769 "uuid": "44c46837-759e-4245-af1b-3b706dd94e6f", 00:19:28.769 "strip_size_kb": 64, 00:19:28.769 "state": "online", 00:19:28.769 "raid_level": "concat", 00:19:28.769 "superblock": true, 00:19:28.769 "num_base_bdevs": 4, 00:19:28.769 "num_base_bdevs_discovered": 4, 00:19:28.769 "num_base_bdevs_operational": 4, 00:19:28.769 "base_bdevs_list": [ 00:19:28.769 { 00:19:28.769 "name": "BaseBdev1", 00:19:28.769 "uuid": "5939077b-850e-5b82-862b-181c8c5ba8e9", 00:19:28.769 "is_configured": true, 00:19:28.769 "data_offset": 2048, 00:19:28.769 "data_size": 63488 00:19:28.769 }, 00:19:28.769 { 00:19:28.769 "name": "BaseBdev2", 00:19:28.769 "uuid": "77d819c8-e35e-58f7-9e7f-be4bcb8b53f8", 00:19:28.769 "is_configured": true, 00:19:28.769 "data_offset": 2048, 00:19:28.769 "data_size": 63488 00:19:28.769 }, 00:19:28.769 { 00:19:28.769 "name": "BaseBdev3", 00:19:28.769 "uuid": "82e1c451-806c-5289-911a-f16896d14efb", 00:19:28.769 "is_configured": true, 00:19:28.769 "data_offset": 2048, 00:19:28.769 "data_size": 63488 00:19:28.769 }, 00:19:28.769 { 00:19:28.769 "name": "BaseBdev4", 00:19:28.769 "uuid": "eb92986b-e7f9-544f-a989-7761d6545060", 00:19:28.769 "is_configured": true, 00:19:28.769 "data_offset": 2048, 00:19:28.769 "data_size": 63488 00:19:28.769 } 00:19:28.769 ] 00:19:28.769 }' 00:19:28.769 15:58:34 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:28.769 15:58:34 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:29.336 15:58:34 bdev_raid.raid_read_error_test -- 
bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:29.594 [2024-06-10 15:58:35.053915] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:29.594 [2024-06-10 15:58:35.053949] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:29.594 [2024-06-10 15:58:35.057392] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:29.594 [2024-06-10 15:58:35.057431] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:29.594 [2024-06-10 15:58:35.057470] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:29.594 [2024-06-10 15:58:35.057478] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x29015f0 name raid_bdev1, state offline 00:19:29.594 0 00:19:29.594 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2740002 00:19:29.594 15:58:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 2740002 ']' 00:19:29.594 15:58:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 2740002 00:19:29.594 15:58:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:19:29.594 15:58:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:19:29.594 15:58:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2740002 00:19:29.853 15:58:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:19:29.853 15:58:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:19:29.853 15:58:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2740002' 00:19:29.853 killing process with pid 
2740002 00:19:29.853 15:58:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 2740002 00:19:29.853 [2024-06-10 15:58:35.120219] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:29.853 15:58:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 2740002 00:19:29.853 [2024-06-10 15:58:35.148767] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:29.853 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.aTEVj3pkRI 00:19:29.853 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:29.853 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:29.853 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.45 00:19:29.853 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:19:29.853 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:29.853 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:30.112 15:58:35 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.45 != \0\.\0\0 ]] 00:19:30.112 00:19:30.112 real 0m7.873s 00:19:30.112 user 0m12.958s 00:19:30.112 sys 0m1.085s 00:19:30.112 15:58:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:30.112 15:58:35 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:30.112 ************************************ 00:19:30.112 END TEST raid_read_error_test 00:19:30.112 ************************************ 00:19:30.112 15:58:35 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test concat 4 write 00:19:30.112 15:58:35 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:19:30.112 15:58:35 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:30.112 
15:58:35 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:30.112 ************************************ 00:19:30.112 START TEST raid_write_error_test 00:19:30.112 ************************************ 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test concat 4 write 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=concat 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:19:30.112 15:58:35 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' concat '!=' raid1 ']' 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@799 -- # strip_size=64 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@800 -- # create_arg+=' -z 64' 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.zaYzqpl8Oa 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2741414 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2741414 /var/tmp/spdk-raid.sock 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 2741414 ']' 
00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:30.112 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:19:30.112 15:58:35 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:30.112 [2024-06-10 15:58:35.506900] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:19:30.112 [2024-06-10 15:58:35.506962] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2741414 ] 00:19:30.112 [2024-06-10 15:58:35.605654] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:30.371 [2024-06-10 15:58:35.695935] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:30.371 [2024-06-10 15:58:35.760901] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:30.371 [2024-06-10 15:58:35.760943] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:31.307 15:58:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:19:31.307 15:58:36 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:19:31.307 15:58:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:31.307 15:58:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:19:31.307 BaseBdev1_malloc 00:19:31.307 15:58:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:19:31.565 true 00:19:31.565 15:58:36 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:19:31.824 [2024-06-10 15:58:37.212545] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:19:31.824 [2024-06-10 15:58:37.212586] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:31.824 [2024-06-10 15:58:37.212601] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2859150 00:19:31.824 [2024-06-10 15:58:37.212610] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:31.824 [2024-06-10 15:58:37.214341] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:31.824 [2024-06-10 15:58:37.214365] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:19:31.824 BaseBdev1 00:19:31.824 15:58:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:31.824 15:58:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:19:32.082 BaseBdev2_malloc 00:19:32.082 15:58:37 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:19:32.341 true 00:19:32.341 15:58:37 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:19:32.600 [2024-06-10 15:58:37.991158] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:19:32.600 [2024-06-10 15:58:37.991192] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:32.600 [2024-06-10 15:58:37.991211] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x285db50 00:19:32.600 [2024-06-10 15:58:37.991222] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:32.600 [2024-06-10 15:58:37.992674] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:32.600 [2024-06-10 15:58:37.992699] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:19:32.600 BaseBdev2 00:19:32.600 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:32.600 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:19:32.858 BaseBdev3_malloc 00:19:32.858 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:19:33.117 true 00:19:33.117 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:19:33.377 [2024-06-10 15:58:38.749672] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:19:33.377 [2024-06-10 15:58:38.749721] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:33.377 [2024-06-10 15:58:38.749737] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x285e780 00:19:33.377 [2024-06-10 15:58:38.749746] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:33.377 [2024-06-10 15:58:38.751217] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:19:33.377 [2024-06-10 15:58:38.751242] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:19:33.377 BaseBdev3 00:19:33.377 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:19:33.377 15:58:38 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:19:33.636 BaseBdev4_malloc 00:19:33.636 15:58:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:19:33.962 true 00:19:33.962 15:58:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:19:34.221 [2024-06-10 15:58:39.524026] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:19:34.221 [2024-06-10 15:58:39.524065] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:19:34.221 [2024-06-10 15:58:39.524084] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2857ee0 00:19:34.221 [2024-06-10 15:58:39.524093] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:19:34.221 [2024-06-10 15:58:39.525613] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev 
registered 00:19:34.221 [2024-06-10 15:58:39.525639] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:19:34.221 BaseBdev4 00:19:34.221 15:58:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -z 64 -r concat -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:19:34.480 [2024-06-10 15:58:39.776834] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:34.480 [2024-06-10 15:58:39.778123] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:34.480 [2024-06-10 15:58:39.778193] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:34.480 [2024-06-10 15:58:39.778256] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:34.480 [2024-06-10 15:58:39.778495] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x28615f0 00:19:34.480 [2024-06-10 15:58:39.778505] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 253952, blocklen 512 00:19:34.480 [2024-06-10 15:58:39.778688] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x26a8510 00:19:34.480 [2024-06-10 15:58:39.778843] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28615f0 00:19:34.480 [2024-06-10 15:58:39.778851] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x28615f0 00:19:34.480 [2024-06-10 15:58:39.778952] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:34.480 15:58:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:19:34.480 15:58:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:19:34.480 15:58:39 bdev_raid.raid_write_error_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:34.480 15:58:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:34.480 15:58:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:34.480 15:58:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:34.480 15:58:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:34.480 15:58:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:34.480 15:58:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:34.480 15:58:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:34.480 15:58:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:34.480 15:58:39 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:34.740 15:58:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:34.740 "name": "raid_bdev1", 00:19:34.740 "uuid": "91b8dbf8-e9da-4d1a-95fe-bc0f645d4c97", 00:19:34.740 "strip_size_kb": 64, 00:19:34.740 "state": "online", 00:19:34.740 "raid_level": "concat", 00:19:34.740 "superblock": true, 00:19:34.740 "num_base_bdevs": 4, 00:19:34.740 "num_base_bdevs_discovered": 4, 00:19:34.740 "num_base_bdevs_operational": 4, 00:19:34.740 "base_bdevs_list": [ 00:19:34.740 { 00:19:34.740 "name": "BaseBdev1", 00:19:34.740 "uuid": "3f329afd-8793-5569-a848-ce6ea6a67628", 00:19:34.740 "is_configured": true, 00:19:34.740 "data_offset": 2048, 00:19:34.740 "data_size": 63488 00:19:34.740 }, 00:19:34.740 { 00:19:34.740 "name": "BaseBdev2", 00:19:34.740 "uuid": "52960edc-7f2a-53b9-b109-7daa57dbc359", 00:19:34.740 "is_configured": true, 
00:19:34.740 "data_offset": 2048, 00:19:34.740 "data_size": 63488 00:19:34.740 }, 00:19:34.740 { 00:19:34.740 "name": "BaseBdev3", 00:19:34.740 "uuid": "db674a5d-2681-54fd-ad5f-050f6adf6193", 00:19:34.740 "is_configured": true, 00:19:34.740 "data_offset": 2048, 00:19:34.740 "data_size": 63488 00:19:34.740 }, 00:19:34.740 { 00:19:34.740 "name": "BaseBdev4", 00:19:34.740 "uuid": "4d9cc21a-43e3-5e4a-94d4-25679014ba99", 00:19:34.740 "is_configured": true, 00:19:34.740 "data_offset": 2048, 00:19:34.740 "data_size": 63488 00:19:34.740 } 00:19:34.740 ] 00:19:34.740 }' 00:19:34.740 15:58:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:34.740 15:58:40 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:35.308 15:58:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:19:35.308 15:58:40 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:19:35.308 [2024-06-10 15:58:40.815845] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x285a4a0 00:19:36.244 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:19:36.503 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:19:36.503 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ concat = \r\a\i\d\1 ]] 00:19:36.503 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:19:36.503 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online concat 64 4 00:19:36.503 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:19:36.503 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:36.503 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=concat 00:19:36.503 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=64 00:19:36.503 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:36.503 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:36.503 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:36.504 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:36.504 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:36.504 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:36.504 15:58:41 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:19:36.763 15:58:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:36.763 "name": "raid_bdev1", 00:19:36.763 "uuid": "91b8dbf8-e9da-4d1a-95fe-bc0f645d4c97", 00:19:36.763 "strip_size_kb": 64, 00:19:36.763 "state": "online", 00:19:36.763 "raid_level": "concat", 00:19:36.763 "superblock": true, 00:19:36.763 "num_base_bdevs": 4, 00:19:36.763 "num_base_bdevs_discovered": 4, 00:19:36.763 "num_base_bdevs_operational": 4, 00:19:36.763 "base_bdevs_list": [ 00:19:36.763 { 00:19:36.763 "name": "BaseBdev1", 00:19:36.763 "uuid": "3f329afd-8793-5569-a848-ce6ea6a67628", 00:19:36.763 "is_configured": true, 00:19:36.763 "data_offset": 2048, 00:19:36.763 "data_size": 63488 00:19:36.763 }, 00:19:36.763 { 00:19:36.763 "name": "BaseBdev2", 00:19:36.763 
"uuid": "52960edc-7f2a-53b9-b109-7daa57dbc359", 00:19:36.763 "is_configured": true, 00:19:36.763 "data_offset": 2048, 00:19:36.763 "data_size": 63488 00:19:36.763 }, 00:19:36.763 { 00:19:36.763 "name": "BaseBdev3", 00:19:36.763 "uuid": "db674a5d-2681-54fd-ad5f-050f6adf6193", 00:19:36.763 "is_configured": true, 00:19:36.763 "data_offset": 2048, 00:19:36.763 "data_size": 63488 00:19:36.763 }, 00:19:36.763 { 00:19:36.763 "name": "BaseBdev4", 00:19:36.763 "uuid": "4d9cc21a-43e3-5e4a-94d4-25679014ba99", 00:19:36.763 "is_configured": true, 00:19:36.763 "data_offset": 2048, 00:19:36.763 "data_size": 63488 00:19:36.763 } 00:19:36.763 ] 00:19:36.763 }' 00:19:36.763 15:58:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:36.763 15:58:42 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:37.700 15:58:42 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:19:37.700 [2024-06-10 15:58:43.098400] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:19:37.700 [2024-06-10 15:58:43.098441] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:37.700 [2024-06-10 15:58:43.101855] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:37.700 [2024-06-10 15:58:43.101895] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:37.700 [2024-06-10 15:58:43.101934] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:37.700 [2024-06-10 15:58:43.101943] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28615f0 name raid_bdev1, state offline 00:19:37.700 0 00:19:37.700 15:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2741414 00:19:37.700 15:58:43 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 2741414 ']' 00:19:37.700 15:58:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 2741414 00:19:37.700 15:58:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:19:37.700 15:58:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:19:37.700 15:58:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2741414 00:19:37.700 15:58:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:19:37.700 15:58:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:19:37.700 15:58:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2741414' 00:19:37.700 killing process with pid 2741414 00:19:37.700 15:58:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 2741414 00:19:37.700 [2024-06-10 15:58:43.161104] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:19:37.700 15:58:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 2741414 00:19:37.700 [2024-06-10 15:58:43.189707] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:19:37.959 15:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.zaYzqpl8Oa 00:19:37.959 15:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:19:37.959 15:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:19:37.959 15:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.44 00:19:37.959 15:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy concat 00:19:37.959 15:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:19:37.959 
15:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@215 -- # return 1 00:19:37.959 15:58:43 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@847 -- # [[ 0.44 != \0\.\0\0 ]] 00:19:37.959 00:19:37.959 real 0m7.969s 00:19:37.959 user 0m13.138s 00:19:37.959 sys 0m1.105s 00:19:37.959 15:58:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:19:37.959 15:58:43 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:19:37.959 ************************************ 00:19:37.959 END TEST raid_write_error_test 00:19:37.959 ************************************ 00:19:37.959 15:58:43 bdev_raid -- bdev/bdev_raid.sh@866 -- # for level in raid0 concat raid1 00:19:37.959 15:58:43 bdev_raid -- bdev/bdev_raid.sh@867 -- # run_test raid_state_function_test raid_state_function_test raid1 4 false 00:19:37.959 15:58:43 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:19:37.959 15:58:43 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:19:37.959 15:58:43 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:19:38.218 ************************************ 00:19:38.218 START TEST raid_state_function_test 00:19:38.218 ************************************ 00:19:38.218 15:58:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 4 false 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@222 -- # local superblock=false 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@226 -- # local strip_size 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@228 -- # local 
superblock_create_arg 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@237 -- # '[' false = true ']' 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@240 -- # superblock_create_arg= 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@244 -- # raid_pid=2742713 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2742713' 00:19:38.219 Process raid pid: 2742713 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@246 -- # waitforlisten 2742713 /var/tmp/spdk-raid.sock 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@830 -- # '[' -z 2742713 ']' 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:19:38.219 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:19:38.219 15:58:43 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:38.219 [2024-06-10 15:58:43.532941] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:19:38.219 [2024-06-10 15:58:43.533000] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:19:38.219 [2024-06-10 15:58:43.634449] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:19:38.478 [2024-06-10 15:58:43.729353] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:19:38.478 [2024-06-10 15:58:43.800974] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:38.478 [2024-06-10 15:58:43.801001] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:19:39.046 15:58:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:19:39.046 15:58:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@863 -- # return 0 00:19:39.046 15:58:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:39.305 [2024-06-10 15:58:44.703771] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:39.305 [2024-06-10 15:58:44.703810] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:39.305 [2024-06-10 15:58:44.703819] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:39.305 [2024-06-10 15:58:44.703827] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev 
BaseBdev2 doesn't exist now 00:19:39.305 [2024-06-10 15:58:44.703834] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:39.305 [2024-06-10 15:58:44.703843] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:39.305 [2024-06-10 15:58:44.703849] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:39.305 [2024-06-10 15:58:44.703857] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:39.305 15:58:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:39.305 15:58:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:39.305 15:58:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:39.305 15:58:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:39.305 15:58:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:39.305 15:58:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:39.305 15:58:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:39.305 15:58:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:39.305 15:58:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:39.305 15:58:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:39.305 15:58:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:39.305 15:58:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq 
-r '.[] | select(.name == "Existed_Raid")' 00:19:39.564 15:58:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:39.564 "name": "Existed_Raid", 00:19:39.564 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:39.564 "strip_size_kb": 0, 00:19:39.564 "state": "configuring", 00:19:39.564 "raid_level": "raid1", 00:19:39.564 "superblock": false, 00:19:39.564 "num_base_bdevs": 4, 00:19:39.564 "num_base_bdevs_discovered": 0, 00:19:39.564 "num_base_bdevs_operational": 4, 00:19:39.564 "base_bdevs_list": [ 00:19:39.564 { 00:19:39.564 "name": "BaseBdev1", 00:19:39.564 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:39.564 "is_configured": false, 00:19:39.564 "data_offset": 0, 00:19:39.564 "data_size": 0 00:19:39.564 }, 00:19:39.564 { 00:19:39.564 "name": "BaseBdev2", 00:19:39.564 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:39.564 "is_configured": false, 00:19:39.564 "data_offset": 0, 00:19:39.564 "data_size": 0 00:19:39.564 }, 00:19:39.564 { 00:19:39.564 "name": "BaseBdev3", 00:19:39.564 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:39.564 "is_configured": false, 00:19:39.564 "data_offset": 0, 00:19:39.564 "data_size": 0 00:19:39.564 }, 00:19:39.564 { 00:19:39.564 "name": "BaseBdev4", 00:19:39.564 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:39.564 "is_configured": false, 00:19:39.564 "data_offset": 0, 00:19:39.564 "data_size": 0 00:19:39.564 } 00:19:39.564 ] 00:19:39.564 }' 00:19:39.564 15:58:44 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:39.564 15:58:44 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:40.133 15:58:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:40.391 [2024-06-10 15:58:45.842667] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 
00:19:40.391 [2024-06-10 15:58:45.842698] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21ba140 name Existed_Raid, state configuring 00:19:40.391 15:58:45 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:40.650 [2024-06-10 15:58:46.099357] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:40.650 [2024-06-10 15:58:46.099379] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:40.650 [2024-06-10 15:58:46.099387] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:40.650 [2024-06-10 15:58:46.099395] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:40.650 [2024-06-10 15:58:46.099402] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:40.650 [2024-06-10 15:58:46.099410] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:40.650 [2024-06-10 15:58:46.099418] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:40.650 [2024-06-10 15:58:46.099425] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:40.650 15:58:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:40.908 [2024-06-10 15:58:46.365515] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:40.908 BaseBdev1 00:19:40.908 15:58:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:19:40.908 15:58:46 bdev_raid.raid_state_function_test -- 
common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:19:40.908 15:58:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:40.908 15:58:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:19:40.908 15:58:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:40.908 15:58:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:40.908 15:58:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:41.166 15:58:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:41.424 [ 00:19:41.424 { 00:19:41.424 "name": "BaseBdev1", 00:19:41.424 "aliases": [ 00:19:41.424 "db8c2a95-2ad7-41c0-8480-405745f5583f" 00:19:41.424 ], 00:19:41.424 "product_name": "Malloc disk", 00:19:41.424 "block_size": 512, 00:19:41.424 "num_blocks": 65536, 00:19:41.424 "uuid": "db8c2a95-2ad7-41c0-8480-405745f5583f", 00:19:41.424 "assigned_rate_limits": { 00:19:41.424 "rw_ios_per_sec": 0, 00:19:41.424 "rw_mbytes_per_sec": 0, 00:19:41.424 "r_mbytes_per_sec": 0, 00:19:41.424 "w_mbytes_per_sec": 0 00:19:41.424 }, 00:19:41.424 "claimed": true, 00:19:41.424 "claim_type": "exclusive_write", 00:19:41.424 "zoned": false, 00:19:41.424 "supported_io_types": { 00:19:41.424 "read": true, 00:19:41.424 "write": true, 00:19:41.424 "unmap": true, 00:19:41.424 "write_zeroes": true, 00:19:41.425 "flush": true, 00:19:41.425 "reset": true, 00:19:41.425 "compare": false, 00:19:41.425 "compare_and_write": false, 00:19:41.425 "abort": true, 00:19:41.425 "nvme_admin": false, 00:19:41.425 "nvme_io": false 00:19:41.425 }, 00:19:41.425 "memory_domains": [ 00:19:41.425 { 
00:19:41.425 "dma_device_id": "system", 00:19:41.425 "dma_device_type": 1 00:19:41.425 }, 00:19:41.425 { 00:19:41.425 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:41.425 "dma_device_type": 2 00:19:41.425 } 00:19:41.425 ], 00:19:41.425 "driver_specific": {} 00:19:41.425 } 00:19:41.425 ] 00:19:41.425 15:58:46 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:19:41.425 15:58:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:41.425 15:58:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:41.425 15:58:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:41.425 15:58:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:41.425 15:58:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:41.425 15:58:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:41.425 15:58:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:41.425 15:58:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:41.425 15:58:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:41.425 15:58:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:41.425 15:58:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:41.425 15:58:46 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:41.683 15:58:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:19:41.683 "name": "Existed_Raid", 00:19:41.683 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:41.683 "strip_size_kb": 0, 00:19:41.683 "state": "configuring", 00:19:41.683 "raid_level": "raid1", 00:19:41.683 "superblock": false, 00:19:41.683 "num_base_bdevs": 4, 00:19:41.683 "num_base_bdevs_discovered": 1, 00:19:41.683 "num_base_bdevs_operational": 4, 00:19:41.683 "base_bdevs_list": [ 00:19:41.683 { 00:19:41.683 "name": "BaseBdev1", 00:19:41.683 "uuid": "db8c2a95-2ad7-41c0-8480-405745f5583f", 00:19:41.683 "is_configured": true, 00:19:41.683 "data_offset": 0, 00:19:41.683 "data_size": 65536 00:19:41.683 }, 00:19:41.683 { 00:19:41.683 "name": "BaseBdev2", 00:19:41.683 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:41.683 "is_configured": false, 00:19:41.683 "data_offset": 0, 00:19:41.683 "data_size": 0 00:19:41.683 }, 00:19:41.683 { 00:19:41.683 "name": "BaseBdev3", 00:19:41.683 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:41.683 "is_configured": false, 00:19:41.683 "data_offset": 0, 00:19:41.683 "data_size": 0 00:19:41.683 }, 00:19:41.683 { 00:19:41.683 "name": "BaseBdev4", 00:19:41.683 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:41.683 "is_configured": false, 00:19:41.683 "data_offset": 0, 00:19:41.683 "data_size": 0 00:19:41.683 } 00:19:41.683 ] 00:19:41.683 }' 00:19:41.683 15:58:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:41.683 15:58:47 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:42.619 15:58:47 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:19:42.619 [2024-06-10 15:58:48.013910] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:19:42.619 [2024-06-10 15:58:48.013946] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21b99b0 name Existed_Raid, state configuring 
00:19:42.619 15:58:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:42.878 [2024-06-10 15:58:48.230517] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:42.878 [2024-06-10 15:58:48.232091] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:19:42.878 [2024-06-10 15:58:48.232120] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:19:42.878 [2024-06-10 15:58:48.232128] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:19:42.878 [2024-06-10 15:58:48.232137] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:19:42.878 [2024-06-10 15:58:48.232144] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:19:42.878 [2024-06-10 15:58:48.232153] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:19:42.878 15:58:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:19:42.878 15:58:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:42.878 15:58:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:42.878 15:58:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:42.878 15:58:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:42.878 15:58:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:42.878 15:58:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:19:42.878 15:58:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:42.878 15:58:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:42.878 15:58:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:42.878 15:58:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:42.878 15:58:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:42.878 15:58:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:42.878 15:58:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:43.137 15:58:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:43.137 "name": "Existed_Raid", 00:19:43.137 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:43.137 "strip_size_kb": 0, 00:19:43.137 "state": "configuring", 00:19:43.137 "raid_level": "raid1", 00:19:43.137 "superblock": false, 00:19:43.137 "num_base_bdevs": 4, 00:19:43.137 "num_base_bdevs_discovered": 1, 00:19:43.137 "num_base_bdevs_operational": 4, 00:19:43.137 "base_bdevs_list": [ 00:19:43.137 { 00:19:43.137 "name": "BaseBdev1", 00:19:43.137 "uuid": "db8c2a95-2ad7-41c0-8480-405745f5583f", 00:19:43.137 "is_configured": true, 00:19:43.137 "data_offset": 0, 00:19:43.137 "data_size": 65536 00:19:43.137 }, 00:19:43.137 { 00:19:43.137 "name": "BaseBdev2", 00:19:43.137 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:43.137 "is_configured": false, 00:19:43.137 "data_offset": 0, 00:19:43.137 "data_size": 0 00:19:43.137 }, 00:19:43.137 { 00:19:43.137 "name": "BaseBdev3", 00:19:43.137 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:43.137 "is_configured": false, 00:19:43.137 
"data_offset": 0, 00:19:43.137 "data_size": 0 00:19:43.137 }, 00:19:43.137 { 00:19:43.137 "name": "BaseBdev4", 00:19:43.137 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:43.137 "is_configured": false, 00:19:43.137 "data_offset": 0, 00:19:43.137 "data_size": 0 00:19:43.137 } 00:19:43.137 ] 00:19:43.137 }' 00:19:43.137 15:58:48 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:43.137 15:58:48 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:43.705 15:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:43.963 [2024-06-10 15:58:49.380739] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:43.963 BaseBdev2 00:19:43.963 15:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:19:43.963 15:58:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:19:43.963 15:58:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:43.963 15:58:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:19:43.963 15:58:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:43.964 15:58:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:43.964 15:58:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:44.222 15:58:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:44.481 [ 
00:19:44.481 { 00:19:44.481 "name": "BaseBdev2", 00:19:44.481 "aliases": [ 00:19:44.481 "4ce28877-b3e9-49ab-b3ea-04957e9473ab" 00:19:44.481 ], 00:19:44.481 "product_name": "Malloc disk", 00:19:44.481 "block_size": 512, 00:19:44.481 "num_blocks": 65536, 00:19:44.481 "uuid": "4ce28877-b3e9-49ab-b3ea-04957e9473ab", 00:19:44.481 "assigned_rate_limits": { 00:19:44.481 "rw_ios_per_sec": 0, 00:19:44.481 "rw_mbytes_per_sec": 0, 00:19:44.481 "r_mbytes_per_sec": 0, 00:19:44.481 "w_mbytes_per_sec": 0 00:19:44.481 }, 00:19:44.482 "claimed": true, 00:19:44.482 "claim_type": "exclusive_write", 00:19:44.482 "zoned": false, 00:19:44.482 "supported_io_types": { 00:19:44.482 "read": true, 00:19:44.482 "write": true, 00:19:44.482 "unmap": true, 00:19:44.482 "write_zeroes": true, 00:19:44.482 "flush": true, 00:19:44.482 "reset": true, 00:19:44.482 "compare": false, 00:19:44.482 "compare_and_write": false, 00:19:44.482 "abort": true, 00:19:44.482 "nvme_admin": false, 00:19:44.482 "nvme_io": false 00:19:44.482 }, 00:19:44.482 "memory_domains": [ 00:19:44.482 { 00:19:44.482 "dma_device_id": "system", 00:19:44.482 "dma_device_type": 1 00:19:44.482 }, 00:19:44.482 { 00:19:44.482 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:44.482 "dma_device_type": 2 00:19:44.482 } 00:19:44.482 ], 00:19:44.482 "driver_specific": {} 00:19:44.482 } 00:19:44.482 ] 00:19:44.482 15:58:49 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:19:44.482 15:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:44.482 15:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:44.482 15:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:44.482 15:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:44.482 15:58:49 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:44.482 15:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:44.482 15:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:44.482 15:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:44.482 15:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:44.482 15:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:44.482 15:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:44.482 15:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:44.482 15:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:44.482 15:58:49 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:44.741 15:58:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:44.741 "name": "Existed_Raid", 00:19:44.741 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:44.741 "strip_size_kb": 0, 00:19:44.741 "state": "configuring", 00:19:44.741 "raid_level": "raid1", 00:19:44.741 "superblock": false, 00:19:44.741 "num_base_bdevs": 4, 00:19:44.741 "num_base_bdevs_discovered": 2, 00:19:44.741 "num_base_bdevs_operational": 4, 00:19:44.741 "base_bdevs_list": [ 00:19:44.741 { 00:19:44.741 "name": "BaseBdev1", 00:19:44.741 "uuid": "db8c2a95-2ad7-41c0-8480-405745f5583f", 00:19:44.741 "is_configured": true, 00:19:44.741 "data_offset": 0, 00:19:44.741 "data_size": 65536 00:19:44.741 }, 00:19:44.741 { 00:19:44.741 "name": "BaseBdev2", 00:19:44.741 "uuid": 
"4ce28877-b3e9-49ab-b3ea-04957e9473ab", 00:19:44.741 "is_configured": true, 00:19:44.741 "data_offset": 0, 00:19:44.741 "data_size": 65536 00:19:44.741 }, 00:19:44.741 { 00:19:44.741 "name": "BaseBdev3", 00:19:44.741 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:44.741 "is_configured": false, 00:19:44.741 "data_offset": 0, 00:19:44.741 "data_size": 0 00:19:44.741 }, 00:19:44.741 { 00:19:44.741 "name": "BaseBdev4", 00:19:44.741 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:44.741 "is_configured": false, 00:19:44.741 "data_offset": 0, 00:19:44.741 "data_size": 0 00:19:44.741 } 00:19:44.741 ] 00:19:44.741 }' 00:19:44.741 15:58:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:44.741 15:58:50 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:45.309 15:58:50 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:45.568 [2024-06-10 15:58:51.060427] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:45.568 BaseBdev3 00:19:45.568 15:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:19:45.568 15:58:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:19:45.828 15:58:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:45.828 15:58:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:19:45.828 15:58:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:45.828 15:58:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:45.828 15:58:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:45.828 15:58:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:46.087 [ 00:19:46.087 { 00:19:46.087 "name": "BaseBdev3", 00:19:46.087 "aliases": [ 00:19:46.087 "ebc8d2a5-8144-4dd6-bf19-e9d51c65ee44" 00:19:46.087 ], 00:19:46.087 "product_name": "Malloc disk", 00:19:46.087 "block_size": 512, 00:19:46.087 "num_blocks": 65536, 00:19:46.087 "uuid": "ebc8d2a5-8144-4dd6-bf19-e9d51c65ee44", 00:19:46.087 "assigned_rate_limits": { 00:19:46.087 "rw_ios_per_sec": 0, 00:19:46.087 "rw_mbytes_per_sec": 0, 00:19:46.087 "r_mbytes_per_sec": 0, 00:19:46.087 "w_mbytes_per_sec": 0 00:19:46.087 }, 00:19:46.087 "claimed": true, 00:19:46.087 "claim_type": "exclusive_write", 00:19:46.087 "zoned": false, 00:19:46.087 "supported_io_types": { 00:19:46.087 "read": true, 00:19:46.087 "write": true, 00:19:46.087 "unmap": true, 00:19:46.087 "write_zeroes": true, 00:19:46.087 "flush": true, 00:19:46.087 "reset": true, 00:19:46.087 "compare": false, 00:19:46.087 "compare_and_write": false, 00:19:46.087 "abort": true, 00:19:46.087 "nvme_admin": false, 00:19:46.087 "nvme_io": false 00:19:46.087 }, 00:19:46.087 "memory_domains": [ 00:19:46.087 { 00:19:46.087 "dma_device_id": "system", 00:19:46.087 "dma_device_type": 1 00:19:46.087 }, 00:19:46.087 { 00:19:46.087 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:46.087 "dma_device_type": 2 00:19:46.087 } 00:19:46.087 ], 00:19:46.087 "driver_specific": {} 00:19:46.087 } 00:19:46.087 ] 00:19:46.087 15:58:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:19:46.087 15:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:46.087 15:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < 
num_base_bdevs )) 00:19:46.087 15:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:46.087 15:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:46.087 15:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:46.087 15:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:46.087 15:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:46.087 15:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:46.087 15:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:46.087 15:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:46.087 15:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:46.087 15:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:46.087 15:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:46.087 15:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:46.346 15:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:46.346 "name": "Existed_Raid", 00:19:46.346 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.346 "strip_size_kb": 0, 00:19:46.346 "state": "configuring", 00:19:46.346 "raid_level": "raid1", 00:19:46.346 "superblock": false, 00:19:46.346 "num_base_bdevs": 4, 00:19:46.346 "num_base_bdevs_discovered": 3, 00:19:46.346 "num_base_bdevs_operational": 4, 00:19:46.346 
"base_bdevs_list": [ 00:19:46.346 { 00:19:46.346 "name": "BaseBdev1", 00:19:46.346 "uuid": "db8c2a95-2ad7-41c0-8480-405745f5583f", 00:19:46.346 "is_configured": true, 00:19:46.346 "data_offset": 0, 00:19:46.346 "data_size": 65536 00:19:46.346 }, 00:19:46.346 { 00:19:46.346 "name": "BaseBdev2", 00:19:46.346 "uuid": "4ce28877-b3e9-49ab-b3ea-04957e9473ab", 00:19:46.346 "is_configured": true, 00:19:46.346 "data_offset": 0, 00:19:46.346 "data_size": 65536 00:19:46.346 }, 00:19:46.346 { 00:19:46.346 "name": "BaseBdev3", 00:19:46.346 "uuid": "ebc8d2a5-8144-4dd6-bf19-e9d51c65ee44", 00:19:46.346 "is_configured": true, 00:19:46.346 "data_offset": 0, 00:19:46.346 "data_size": 65536 00:19:46.346 }, 00:19:46.346 { 00:19:46.346 "name": "BaseBdev4", 00:19:46.346 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:46.346 "is_configured": false, 00:19:46.346 "data_offset": 0, 00:19:46.346 "data_size": 0 00:19:46.346 } 00:19:46.346 ] 00:19:46.346 }' 00:19:46.346 15:58:51 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:46.346 15:58:51 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:47.282 15:58:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:47.282 [2024-06-10 15:58:52.708039] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:19:47.282 [2024-06-10 15:58:52.708073] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x21baac0 00:19:47.282 [2024-06-10 15:58:52.708080] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:19:47.282 [2024-06-10 15:58:52.708281] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x235fba0 00:19:47.282 [2024-06-10 15:58:52.708416] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x21baac0 00:19:47.282 
[2024-06-10 15:58:52.708425] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x21baac0 00:19:47.282 [2024-06-10 15:58:52.708590] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:19:47.282 BaseBdev4 00:19:47.282 15:58:52 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:19:47.282 15:58:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:19:47.282 15:58:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:47.282 15:58:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:19:47.282 15:58:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:47.282 15:58:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:47.282 15:58:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:47.541 15:58:52 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:47.800 [ 00:19:47.800 { 00:19:47.800 "name": "BaseBdev4", 00:19:47.800 "aliases": [ 00:19:47.800 "29baf6b0-b214-4da8-93ac-bd19ea1bf41c" 00:19:47.800 ], 00:19:47.800 "product_name": "Malloc disk", 00:19:47.800 "block_size": 512, 00:19:47.800 "num_blocks": 65536, 00:19:47.800 "uuid": "29baf6b0-b214-4da8-93ac-bd19ea1bf41c", 00:19:47.800 "assigned_rate_limits": { 00:19:47.800 "rw_ios_per_sec": 0, 00:19:47.800 "rw_mbytes_per_sec": 0, 00:19:47.800 "r_mbytes_per_sec": 0, 00:19:47.800 "w_mbytes_per_sec": 0 00:19:47.800 }, 00:19:47.800 "claimed": true, 00:19:47.800 "claim_type": "exclusive_write", 00:19:47.800 "zoned": 
false, 00:19:47.800 "supported_io_types": { 00:19:47.800 "read": true, 00:19:47.800 "write": true, 00:19:47.800 "unmap": true, 00:19:47.800 "write_zeroes": true, 00:19:47.800 "flush": true, 00:19:47.800 "reset": true, 00:19:47.800 "compare": false, 00:19:47.800 "compare_and_write": false, 00:19:47.800 "abort": true, 00:19:47.800 "nvme_admin": false, 00:19:47.800 "nvme_io": false 00:19:47.800 }, 00:19:47.800 "memory_domains": [ 00:19:47.800 { 00:19:47.800 "dma_device_id": "system", 00:19:47.800 "dma_device_type": 1 00:19:47.800 }, 00:19:47.800 { 00:19:47.800 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:47.800 "dma_device_type": 2 00:19:47.800 } 00:19:47.800 ], 00:19:47.800 "driver_specific": {} 00:19:47.800 } 00:19:47.800 ] 00:19:47.800 15:58:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:19:47.800 15:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:19:47.800 15:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:19:47.800 15:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:19:47.800 15:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:47.800 15:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:47.800 15:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:47.800 15:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:47.800 15:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:47.800 15:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:47.800 15:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:47.800 
15:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:47.800 15:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:47.800 15:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:47.800 15:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:48.059 15:58:53 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:48.059 "name": "Existed_Raid", 00:19:48.059 "uuid": "e8aa6695-ece2-4977-bb4a-98ededdd8a3e", 00:19:48.059 "strip_size_kb": 0, 00:19:48.059 "state": "online", 00:19:48.059 "raid_level": "raid1", 00:19:48.059 "superblock": false, 00:19:48.059 "num_base_bdevs": 4, 00:19:48.059 "num_base_bdevs_discovered": 4, 00:19:48.059 "num_base_bdevs_operational": 4, 00:19:48.059 "base_bdevs_list": [ 00:19:48.059 { 00:19:48.059 "name": "BaseBdev1", 00:19:48.059 "uuid": "db8c2a95-2ad7-41c0-8480-405745f5583f", 00:19:48.059 "is_configured": true, 00:19:48.059 "data_offset": 0, 00:19:48.059 "data_size": 65536 00:19:48.059 }, 00:19:48.059 { 00:19:48.059 "name": "BaseBdev2", 00:19:48.059 "uuid": "4ce28877-b3e9-49ab-b3ea-04957e9473ab", 00:19:48.059 "is_configured": true, 00:19:48.059 "data_offset": 0, 00:19:48.059 "data_size": 65536 00:19:48.059 }, 00:19:48.059 { 00:19:48.059 "name": "BaseBdev3", 00:19:48.059 "uuid": "ebc8d2a5-8144-4dd6-bf19-e9d51c65ee44", 00:19:48.059 "is_configured": true, 00:19:48.059 "data_offset": 0, 00:19:48.059 "data_size": 65536 00:19:48.059 }, 00:19:48.059 { 00:19:48.059 "name": "BaseBdev4", 00:19:48.059 "uuid": "29baf6b0-b214-4da8-93ac-bd19ea1bf41c", 00:19:48.059 "is_configured": true, 00:19:48.059 "data_offset": 0, 00:19:48.059 "data_size": 65536 00:19:48.059 } 00:19:48.059 ] 00:19:48.059 }' 00:19:48.059 15:58:53 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:48.059 15:58:53 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:48.627 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:19:48.627 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:19:48.627 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:19:48.627 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:19:48.627 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:19:48.627 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:19:48.627 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:19:48.627 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:19:48.886 [2024-06-10 15:58:54.348760] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:19:48.886 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:19:48.886 "name": "Existed_Raid", 00:19:48.886 "aliases": [ 00:19:48.886 "e8aa6695-ece2-4977-bb4a-98ededdd8a3e" 00:19:48.886 ], 00:19:48.886 "product_name": "Raid Volume", 00:19:48.886 "block_size": 512, 00:19:48.886 "num_blocks": 65536, 00:19:48.886 "uuid": "e8aa6695-ece2-4977-bb4a-98ededdd8a3e", 00:19:48.886 "assigned_rate_limits": { 00:19:48.886 "rw_ios_per_sec": 0, 00:19:48.886 "rw_mbytes_per_sec": 0, 00:19:48.886 "r_mbytes_per_sec": 0, 00:19:48.886 "w_mbytes_per_sec": 0 00:19:48.886 }, 00:19:48.886 "claimed": false, 00:19:48.886 "zoned": false, 00:19:48.886 "supported_io_types": { 00:19:48.886 "read": 
true, 00:19:48.886 "write": true, 00:19:48.886 "unmap": false, 00:19:48.886 "write_zeroes": true, 00:19:48.886 "flush": false, 00:19:48.886 "reset": true, 00:19:48.886 "compare": false, 00:19:48.886 "compare_and_write": false, 00:19:48.886 "abort": false, 00:19:48.886 "nvme_admin": false, 00:19:48.886 "nvme_io": false 00:19:48.886 }, 00:19:48.886 "memory_domains": [ 00:19:48.886 { 00:19:48.886 "dma_device_id": "system", 00:19:48.886 "dma_device_type": 1 00:19:48.886 }, 00:19:48.886 { 00:19:48.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:48.886 "dma_device_type": 2 00:19:48.886 }, 00:19:48.886 { 00:19:48.886 "dma_device_id": "system", 00:19:48.886 "dma_device_type": 1 00:19:48.886 }, 00:19:48.886 { 00:19:48.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:48.886 "dma_device_type": 2 00:19:48.886 }, 00:19:48.886 { 00:19:48.886 "dma_device_id": "system", 00:19:48.886 "dma_device_type": 1 00:19:48.886 }, 00:19:48.886 { 00:19:48.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:48.886 "dma_device_type": 2 00:19:48.886 }, 00:19:48.886 { 00:19:48.886 "dma_device_id": "system", 00:19:48.886 "dma_device_type": 1 00:19:48.886 }, 00:19:48.886 { 00:19:48.886 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:48.886 "dma_device_type": 2 00:19:48.886 } 00:19:48.886 ], 00:19:48.886 "driver_specific": { 00:19:48.886 "raid": { 00:19:48.886 "uuid": "e8aa6695-ece2-4977-bb4a-98ededdd8a3e", 00:19:48.886 "strip_size_kb": 0, 00:19:48.886 "state": "online", 00:19:48.886 "raid_level": "raid1", 00:19:48.886 "superblock": false, 00:19:48.886 "num_base_bdevs": 4, 00:19:48.886 "num_base_bdevs_discovered": 4, 00:19:48.886 "num_base_bdevs_operational": 4, 00:19:48.886 "base_bdevs_list": [ 00:19:48.886 { 00:19:48.886 "name": "BaseBdev1", 00:19:48.886 "uuid": "db8c2a95-2ad7-41c0-8480-405745f5583f", 00:19:48.886 "is_configured": true, 00:19:48.886 "data_offset": 0, 00:19:48.886 "data_size": 65536 00:19:48.886 }, 00:19:48.886 { 00:19:48.886 "name": "BaseBdev2", 00:19:48.886 "uuid": 
"4ce28877-b3e9-49ab-b3ea-04957e9473ab", 00:19:48.886 "is_configured": true, 00:19:48.886 "data_offset": 0, 00:19:48.886 "data_size": 65536 00:19:48.886 }, 00:19:48.886 { 00:19:48.886 "name": "BaseBdev3", 00:19:48.886 "uuid": "ebc8d2a5-8144-4dd6-bf19-e9d51c65ee44", 00:19:48.886 "is_configured": true, 00:19:48.886 "data_offset": 0, 00:19:48.886 "data_size": 65536 00:19:48.886 }, 00:19:48.886 { 00:19:48.886 "name": "BaseBdev4", 00:19:48.886 "uuid": "29baf6b0-b214-4da8-93ac-bd19ea1bf41c", 00:19:48.886 "is_configured": true, 00:19:48.886 "data_offset": 0, 00:19:48.886 "data_size": 65536 00:19:48.886 } 00:19:48.886 ] 00:19:48.886 } 00:19:48.886 } 00:19:48.886 }' 00:19:48.886 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:19:49.145 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:19:49.145 BaseBdev2 00:19:49.145 BaseBdev3 00:19:49.145 BaseBdev4' 00:19:49.145 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:49.145 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:19:49.145 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:49.404 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:49.404 "name": "BaseBdev1", 00:19:49.404 "aliases": [ 00:19:49.404 "db8c2a95-2ad7-41c0-8480-405745f5583f" 00:19:49.404 ], 00:19:49.404 "product_name": "Malloc disk", 00:19:49.404 "block_size": 512, 00:19:49.404 "num_blocks": 65536, 00:19:49.404 "uuid": "db8c2a95-2ad7-41c0-8480-405745f5583f", 00:19:49.404 "assigned_rate_limits": { 00:19:49.404 "rw_ios_per_sec": 0, 00:19:49.404 "rw_mbytes_per_sec": 0, 00:19:49.404 "r_mbytes_per_sec": 0, 
00:19:49.404 "w_mbytes_per_sec": 0 00:19:49.404 }, 00:19:49.404 "claimed": true, 00:19:49.404 "claim_type": "exclusive_write", 00:19:49.404 "zoned": false, 00:19:49.404 "supported_io_types": { 00:19:49.404 "read": true, 00:19:49.404 "write": true, 00:19:49.404 "unmap": true, 00:19:49.404 "write_zeroes": true, 00:19:49.404 "flush": true, 00:19:49.404 "reset": true, 00:19:49.404 "compare": false, 00:19:49.404 "compare_and_write": false, 00:19:49.404 "abort": true, 00:19:49.404 "nvme_admin": false, 00:19:49.404 "nvme_io": false 00:19:49.404 }, 00:19:49.404 "memory_domains": [ 00:19:49.404 { 00:19:49.404 "dma_device_id": "system", 00:19:49.404 "dma_device_type": 1 00:19:49.404 }, 00:19:49.404 { 00:19:49.404 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:49.404 "dma_device_type": 2 00:19:49.404 } 00:19:49.404 ], 00:19:49.404 "driver_specific": {} 00:19:49.404 }' 00:19:49.404 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:49.404 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:49.404 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:49.404 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:49.404 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:49.404 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:49.404 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:49.404 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:49.663 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:49.663 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:49.663 15:58:54 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 
00:19:49.663 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:49.663 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:49.663 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:19:49.663 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:49.922 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:49.922 "name": "BaseBdev2", 00:19:49.922 "aliases": [ 00:19:49.922 "4ce28877-b3e9-49ab-b3ea-04957e9473ab" 00:19:49.922 ], 00:19:49.922 "product_name": "Malloc disk", 00:19:49.922 "block_size": 512, 00:19:49.922 "num_blocks": 65536, 00:19:49.922 "uuid": "4ce28877-b3e9-49ab-b3ea-04957e9473ab", 00:19:49.922 "assigned_rate_limits": { 00:19:49.922 "rw_ios_per_sec": 0, 00:19:49.922 "rw_mbytes_per_sec": 0, 00:19:49.922 "r_mbytes_per_sec": 0, 00:19:49.922 "w_mbytes_per_sec": 0 00:19:49.922 }, 00:19:49.922 "claimed": true, 00:19:49.922 "claim_type": "exclusive_write", 00:19:49.922 "zoned": false, 00:19:49.922 "supported_io_types": { 00:19:49.922 "read": true, 00:19:49.922 "write": true, 00:19:49.922 "unmap": true, 00:19:49.922 "write_zeroes": true, 00:19:49.922 "flush": true, 00:19:49.922 "reset": true, 00:19:49.922 "compare": false, 00:19:49.922 "compare_and_write": false, 00:19:49.922 "abort": true, 00:19:49.922 "nvme_admin": false, 00:19:49.922 "nvme_io": false 00:19:49.922 }, 00:19:49.922 "memory_domains": [ 00:19:49.922 { 00:19:49.922 "dma_device_id": "system", 00:19:49.922 "dma_device_type": 1 00:19:49.922 }, 00:19:49.922 { 00:19:49.922 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:49.922 "dma_device_type": 2 00:19:49.922 } 00:19:49.922 ], 00:19:49.922 "driver_specific": {} 00:19:49.922 }' 00:19:49.922 15:58:55 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:49.922 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:49.922 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:49.922 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:50.181 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:50.181 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:50.181 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:50.181 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:50.181 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:50.181 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:50.181 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:50.181 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:50.181 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:50.181 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:19:50.181 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:50.440 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:50.440 "name": "BaseBdev3", 00:19:50.440 "aliases": [ 00:19:50.440 "ebc8d2a5-8144-4dd6-bf19-e9d51c65ee44" 00:19:50.440 ], 00:19:50.440 "product_name": "Malloc disk", 00:19:50.440 "block_size": 512, 00:19:50.440 "num_blocks": 65536, 00:19:50.440 "uuid": 
"ebc8d2a5-8144-4dd6-bf19-e9d51c65ee44", 00:19:50.440 "assigned_rate_limits": { 00:19:50.440 "rw_ios_per_sec": 0, 00:19:50.440 "rw_mbytes_per_sec": 0, 00:19:50.440 "r_mbytes_per_sec": 0, 00:19:50.440 "w_mbytes_per_sec": 0 00:19:50.440 }, 00:19:50.440 "claimed": true, 00:19:50.440 "claim_type": "exclusive_write", 00:19:50.440 "zoned": false, 00:19:50.440 "supported_io_types": { 00:19:50.440 "read": true, 00:19:50.440 "write": true, 00:19:50.440 "unmap": true, 00:19:50.440 "write_zeroes": true, 00:19:50.440 "flush": true, 00:19:50.440 "reset": true, 00:19:50.440 "compare": false, 00:19:50.440 "compare_and_write": false, 00:19:50.440 "abort": true, 00:19:50.440 "nvme_admin": false, 00:19:50.440 "nvme_io": false 00:19:50.440 }, 00:19:50.440 "memory_domains": [ 00:19:50.440 { 00:19:50.440 "dma_device_id": "system", 00:19:50.440 "dma_device_type": 1 00:19:50.440 }, 00:19:50.440 { 00:19:50.440 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:50.440 "dma_device_type": 2 00:19:50.440 } 00:19:50.440 ], 00:19:50.440 "driver_specific": {} 00:19:50.440 }' 00:19:50.440 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:50.698 15:58:55 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:50.698 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:50.698 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:50.698 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:50.698 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:50.698 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:50.698 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:50.698 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:50.698 
15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:50.972 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:50.972 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:50.972 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:19:50.972 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:19:50.972 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:19:51.246 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:19:51.246 "name": "BaseBdev4", 00:19:51.246 "aliases": [ 00:19:51.246 "29baf6b0-b214-4da8-93ac-bd19ea1bf41c" 00:19:51.246 ], 00:19:51.246 "product_name": "Malloc disk", 00:19:51.246 "block_size": 512, 00:19:51.246 "num_blocks": 65536, 00:19:51.246 "uuid": "29baf6b0-b214-4da8-93ac-bd19ea1bf41c", 00:19:51.246 "assigned_rate_limits": { 00:19:51.246 "rw_ios_per_sec": 0, 00:19:51.246 "rw_mbytes_per_sec": 0, 00:19:51.246 "r_mbytes_per_sec": 0, 00:19:51.246 "w_mbytes_per_sec": 0 00:19:51.246 }, 00:19:51.246 "claimed": true, 00:19:51.246 "claim_type": "exclusive_write", 00:19:51.246 "zoned": false, 00:19:51.246 "supported_io_types": { 00:19:51.246 "read": true, 00:19:51.246 "write": true, 00:19:51.246 "unmap": true, 00:19:51.246 "write_zeroes": true, 00:19:51.246 "flush": true, 00:19:51.246 "reset": true, 00:19:51.246 "compare": false, 00:19:51.246 "compare_and_write": false, 00:19:51.246 "abort": true, 00:19:51.246 "nvme_admin": false, 00:19:51.246 "nvme_io": false 00:19:51.246 }, 00:19:51.246 "memory_domains": [ 00:19:51.246 { 00:19:51.246 "dma_device_id": "system", 00:19:51.246 "dma_device_type": 1 00:19:51.246 }, 00:19:51.246 { 00:19:51.246 "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE", 00:19:51.246 "dma_device_type": 2 00:19:51.246 } 00:19:51.246 ], 00:19:51.246 "driver_specific": {} 00:19:51.246 }' 00:19:51.246 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:51.246 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:19:51.246 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:19:51.246 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:51.246 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:19:51.246 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:19:51.246 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:51.505 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:19:51.505 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:19:51.505 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:51.505 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:19:51.505 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:19:51.505 15:58:56 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:19:51.765 [2024-06-10 15:58:57.148007] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:19:51.765 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@275 -- # local expected_state 00:19:51.765 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:19:51.765 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@213 -- # case $1 
in 00:19:51.765 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@214 -- # return 0 00:19:51.765 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:19:51.765 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:19:51.765 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:51.765 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:19:51.765 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:51.765 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:51.765 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:19:51.765 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:51.765 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:51.765 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:51.765 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:51.765 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:51.765 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:52.024 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:52.024 "name": "Existed_Raid", 00:19:52.024 "uuid": "e8aa6695-ece2-4977-bb4a-98ededdd8a3e", 00:19:52.024 "strip_size_kb": 0, 00:19:52.024 "state": "online", 00:19:52.024 "raid_level": "raid1", 
00:19:52.024 "superblock": false, 00:19:52.024 "num_base_bdevs": 4, 00:19:52.024 "num_base_bdevs_discovered": 3, 00:19:52.024 "num_base_bdevs_operational": 3, 00:19:52.024 "base_bdevs_list": [ 00:19:52.024 { 00:19:52.024 "name": null, 00:19:52.024 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:52.024 "is_configured": false, 00:19:52.024 "data_offset": 0, 00:19:52.024 "data_size": 65536 00:19:52.024 }, 00:19:52.024 { 00:19:52.024 "name": "BaseBdev2", 00:19:52.024 "uuid": "4ce28877-b3e9-49ab-b3ea-04957e9473ab", 00:19:52.024 "is_configured": true, 00:19:52.024 "data_offset": 0, 00:19:52.024 "data_size": 65536 00:19:52.024 }, 00:19:52.024 { 00:19:52.024 "name": "BaseBdev3", 00:19:52.024 "uuid": "ebc8d2a5-8144-4dd6-bf19-e9d51c65ee44", 00:19:52.024 "is_configured": true, 00:19:52.024 "data_offset": 0, 00:19:52.024 "data_size": 65536 00:19:52.024 }, 00:19:52.024 { 00:19:52.024 "name": "BaseBdev4", 00:19:52.024 "uuid": "29baf6b0-b214-4da8-93ac-bd19ea1bf41c", 00:19:52.024 "is_configured": true, 00:19:52.024 "data_offset": 0, 00:19:52.024 "data_size": 65536 00:19:52.024 } 00:19:52.024 ] 00:19:52.024 }' 00:19:52.024 15:58:57 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:52.024 15:58:57 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:52.591 15:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:19:52.591 15:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:52.591 15:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:52.591 15:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:52.850 15:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:52.850 15:58:58 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:52.850 15:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:19:53.109 [2024-06-10 15:58:58.440670] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:53.109 15:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:53.109 15:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:53.109 15:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.109 15:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:19:53.367 15:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:53.367 15:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:53.367 15:58:58 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:19:53.626 [2024-06-10 15:58:58.968139] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:19:53.627 15:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:53.627 15:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:53.627 15:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:53.627 15:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # jq -r 
'.[0]["name"]' 00:19:53.886 15:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:19:53.886 15:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:19:53.886 15:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:19:54.145 [2024-06-10 15:58:59.491803] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:19:54.145 [2024-06-10 15:58:59.491877] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:19:54.145 [2024-06-10 15:58:59.502415] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:19:54.145 [2024-06-10 15:58:59.502447] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:19:54.145 [2024-06-10 15:58:59.502463] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x21baac0 name Existed_Raid, state offline 00:19:54.145 15:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:19:54.145 15:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:19:54.145 15:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:54.145 15:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:19:54.404 15:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:19:54.404 15:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:19:54.404 15:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:19:54.404 15:58:59 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:19:54.404 15:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:54.404 15:58:59 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:19:54.663 BaseBdev2 00:19:54.663 15:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:19:54.663 15:59:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:19:54.663 15:59:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:54.663 15:59:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:19:54.663 15:59:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:54.663 15:59:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:54.663 15:59:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:54.922 15:59:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:19:55.182 [ 00:19:55.182 { 00:19:55.182 "name": "BaseBdev2", 00:19:55.182 "aliases": [ 00:19:55.182 "0da96fb9-4995-4527-8dba-76716c67daf3" 00:19:55.182 ], 00:19:55.182 "product_name": "Malloc disk", 00:19:55.182 "block_size": 512, 00:19:55.182 "num_blocks": 65536, 00:19:55.182 "uuid": "0da96fb9-4995-4527-8dba-76716c67daf3", 00:19:55.182 "assigned_rate_limits": { 00:19:55.182 "rw_ios_per_sec": 0, 00:19:55.182 "rw_mbytes_per_sec": 0, 00:19:55.182 "r_mbytes_per_sec": 0, 
00:19:55.182 "w_mbytes_per_sec": 0 00:19:55.182 }, 00:19:55.182 "claimed": false, 00:19:55.182 "zoned": false, 00:19:55.182 "supported_io_types": { 00:19:55.182 "read": true, 00:19:55.182 "write": true, 00:19:55.182 "unmap": true, 00:19:55.182 "write_zeroes": true, 00:19:55.182 "flush": true, 00:19:55.182 "reset": true, 00:19:55.182 "compare": false, 00:19:55.182 "compare_and_write": false, 00:19:55.182 "abort": true, 00:19:55.182 "nvme_admin": false, 00:19:55.182 "nvme_io": false 00:19:55.182 }, 00:19:55.182 "memory_domains": [ 00:19:55.182 { 00:19:55.182 "dma_device_id": "system", 00:19:55.182 "dma_device_type": 1 00:19:55.182 }, 00:19:55.182 { 00:19:55.182 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.182 "dma_device_type": 2 00:19:55.182 } 00:19:55.182 ], 00:19:55.182 "driver_specific": {} 00:19:55.182 } 00:19:55.182 ] 00:19:55.182 15:59:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:19:55.182 15:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:55.182 15:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:55.182 15:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:19:55.441 BaseBdev3 00:19:55.441 15:59:00 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:19:55.441 15:59:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:19:55.441 15:59:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:55.441 15:59:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:19:55.441 15:59:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:55.441 15:59:00 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:55.441 15:59:00 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:55.699 15:59:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:19:55.958 [ 00:19:55.958 { 00:19:55.958 "name": "BaseBdev3", 00:19:55.958 "aliases": [ 00:19:55.958 "08e014ca-ca88-4f34-b226-e7c6516ba7a4" 00:19:55.958 ], 00:19:55.958 "product_name": "Malloc disk", 00:19:55.958 "block_size": 512, 00:19:55.958 "num_blocks": 65536, 00:19:55.958 "uuid": "08e014ca-ca88-4f34-b226-e7c6516ba7a4", 00:19:55.958 "assigned_rate_limits": { 00:19:55.958 "rw_ios_per_sec": 0, 00:19:55.958 "rw_mbytes_per_sec": 0, 00:19:55.958 "r_mbytes_per_sec": 0, 00:19:55.958 "w_mbytes_per_sec": 0 00:19:55.958 }, 00:19:55.958 "claimed": false, 00:19:55.958 "zoned": false, 00:19:55.958 "supported_io_types": { 00:19:55.958 "read": true, 00:19:55.958 "write": true, 00:19:55.958 "unmap": true, 00:19:55.958 "write_zeroes": true, 00:19:55.958 "flush": true, 00:19:55.958 "reset": true, 00:19:55.958 "compare": false, 00:19:55.958 "compare_and_write": false, 00:19:55.958 "abort": true, 00:19:55.958 "nvme_admin": false, 00:19:55.958 "nvme_io": false 00:19:55.958 }, 00:19:55.958 "memory_domains": [ 00:19:55.958 { 00:19:55.958 "dma_device_id": "system", 00:19:55.958 "dma_device_type": 1 00:19:55.958 }, 00:19:55.958 { 00:19:55.958 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:55.958 "dma_device_type": 2 00:19:55.958 } 00:19:55.958 ], 00:19:55.958 "driver_specific": {} 00:19:55.958 } 00:19:55.958 ] 00:19:55.958 15:59:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:19:55.958 15:59:01 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:55.958 15:59:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:55.958 15:59:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:19:56.216 BaseBdev4 00:19:56.216 15:59:01 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:19:56.216 15:59:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:19:56.216 15:59:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:56.216 15:59:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:19:56.216 15:59:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:56.216 15:59:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:56.216 15:59:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:56.475 15:59:01 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:19:56.734 [ 00:19:56.734 { 00:19:56.734 "name": "BaseBdev4", 00:19:56.734 "aliases": [ 00:19:56.734 "fc545f3f-33ef-4fc6-a136-b623074c6ca8" 00:19:56.734 ], 00:19:56.734 "product_name": "Malloc disk", 00:19:56.734 "block_size": 512, 00:19:56.734 "num_blocks": 65536, 00:19:56.734 "uuid": "fc545f3f-33ef-4fc6-a136-b623074c6ca8", 00:19:56.734 "assigned_rate_limits": { 00:19:56.734 "rw_ios_per_sec": 0, 00:19:56.734 "rw_mbytes_per_sec": 0, 00:19:56.734 "r_mbytes_per_sec": 0, 00:19:56.734 "w_mbytes_per_sec": 0 
00:19:56.734 }, 00:19:56.734 "claimed": false, 00:19:56.734 "zoned": false, 00:19:56.734 "supported_io_types": { 00:19:56.734 "read": true, 00:19:56.734 "write": true, 00:19:56.734 "unmap": true, 00:19:56.734 "write_zeroes": true, 00:19:56.734 "flush": true, 00:19:56.734 "reset": true, 00:19:56.734 "compare": false, 00:19:56.734 "compare_and_write": false, 00:19:56.734 "abort": true, 00:19:56.734 "nvme_admin": false, 00:19:56.734 "nvme_io": false 00:19:56.734 }, 00:19:56.734 "memory_domains": [ 00:19:56.734 { 00:19:56.734 "dma_device_id": "system", 00:19:56.734 "dma_device_type": 1 00:19:56.734 }, 00:19:56.734 { 00:19:56.734 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:56.734 "dma_device_type": 2 00:19:56.734 } 00:19:56.734 ], 00:19:56.734 "driver_specific": {} 00:19:56.734 } 00:19:56.734 ] 00:19:56.734 15:59:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:19:56.734 15:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:19:56.734 15:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:19:56.734 15:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:19:56.993 [2024-06-10 15:59:02.286715] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:19:56.993 [2024-06-10 15:59:02.286753] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:19:56.993 [2024-06-10 15:59:02.286770] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:19:56.993 [2024-06-10 15:59:02.288188] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:19:56.993 [2024-06-10 15:59:02.288231] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev 
BaseBdev4 is claimed 00:19:56.993 15:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:56.993 15:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:56.993 15:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:56.993 15:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:56.993 15:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:56.993 15:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:56.993 15:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:56.993 15:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:56.993 15:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:56.993 15:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:56.993 15:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:56.993 15:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:57.252 15:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:57.252 "name": "Existed_Raid", 00:19:57.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:57.252 "strip_size_kb": 0, 00:19:57.252 "state": "configuring", 00:19:57.252 "raid_level": "raid1", 00:19:57.252 "superblock": false, 00:19:57.252 "num_base_bdevs": 4, 00:19:57.252 "num_base_bdevs_discovered": 3, 00:19:57.252 "num_base_bdevs_operational": 4, 00:19:57.252 
"base_bdevs_list": [ 00:19:57.252 { 00:19:57.252 "name": "BaseBdev1", 00:19:57.252 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:57.252 "is_configured": false, 00:19:57.252 "data_offset": 0, 00:19:57.252 "data_size": 0 00:19:57.252 }, 00:19:57.252 { 00:19:57.252 "name": "BaseBdev2", 00:19:57.252 "uuid": "0da96fb9-4995-4527-8dba-76716c67daf3", 00:19:57.252 "is_configured": true, 00:19:57.252 "data_offset": 0, 00:19:57.252 "data_size": 65536 00:19:57.252 }, 00:19:57.252 { 00:19:57.252 "name": "BaseBdev3", 00:19:57.252 "uuid": "08e014ca-ca88-4f34-b226-e7c6516ba7a4", 00:19:57.252 "is_configured": true, 00:19:57.252 "data_offset": 0, 00:19:57.252 "data_size": 65536 00:19:57.252 }, 00:19:57.252 { 00:19:57.252 "name": "BaseBdev4", 00:19:57.252 "uuid": "fc545f3f-33ef-4fc6-a136-b623074c6ca8", 00:19:57.252 "is_configured": true, 00:19:57.252 "data_offset": 0, 00:19:57.252 "data_size": 65536 00:19:57.252 } 00:19:57.252 ] 00:19:57.252 }' 00:19:57.252 15:59:02 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:57.252 15:59:02 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:57.819 15:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@308 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:19:58.078 [2024-06-10 15:59:03.401669] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:19:58.078 15:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:58.078 15:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:58.078 15:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:58.078 15:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:58.078 15:59:03 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:58.078 15:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:58.078 15:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:58.078 15:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:58.078 15:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:58.078 15:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:58.078 15:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.078 15:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:19:58.336 15:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:19:58.336 "name": "Existed_Raid", 00:19:58.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:58.336 "strip_size_kb": 0, 00:19:58.336 "state": "configuring", 00:19:58.336 "raid_level": "raid1", 00:19:58.336 "superblock": false, 00:19:58.336 "num_base_bdevs": 4, 00:19:58.336 "num_base_bdevs_discovered": 2, 00:19:58.336 "num_base_bdevs_operational": 4, 00:19:58.336 "base_bdevs_list": [ 00:19:58.336 { 00:19:58.336 "name": "BaseBdev1", 00:19:58.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:19:58.336 "is_configured": false, 00:19:58.336 "data_offset": 0, 00:19:58.336 "data_size": 0 00:19:58.336 }, 00:19:58.336 { 00:19:58.336 "name": null, 00:19:58.336 "uuid": "0da96fb9-4995-4527-8dba-76716c67daf3", 00:19:58.336 "is_configured": false, 00:19:58.336 "data_offset": 0, 00:19:58.336 "data_size": 65536 00:19:58.336 }, 00:19:58.336 { 00:19:58.336 "name": "BaseBdev3", 00:19:58.336 "uuid": 
"08e014ca-ca88-4f34-b226-e7c6516ba7a4", 00:19:58.336 "is_configured": true, 00:19:58.336 "data_offset": 0, 00:19:58.336 "data_size": 65536 00:19:58.336 }, 00:19:58.336 { 00:19:58.336 "name": "BaseBdev4", 00:19:58.336 "uuid": "fc545f3f-33ef-4fc6-a136-b623074c6ca8", 00:19:58.336 "is_configured": true, 00:19:58.336 "data_offset": 0, 00:19:58.336 "data_size": 65536 00:19:58.336 } 00:19:58.336 ] 00:19:58.336 }' 00:19:58.336 15:59:03 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:19:58.336 15:59:03 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:19:58.903 15:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:58.903 15:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:19:59.161 15:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:19:59.161 15:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@312 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:19:59.419 [2024-06-10 15:59:04.800986] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:19:59.419 BaseBdev1 00:19:59.419 15:59:04 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:19:59.419 15:59:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:19:59.419 15:59:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:19:59.419 15:59:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:19:59.419 15:59:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:19:59.419 15:59:04 
bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:19:59.419 15:59:04 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:19:59.678 15:59:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:19:59.942 [ 00:19:59.942 { 00:19:59.942 "name": "BaseBdev1", 00:19:59.942 "aliases": [ 00:19:59.942 "d413264b-383f-420c-bdcb-cbbf20421cf5" 00:19:59.942 ], 00:19:59.942 "product_name": "Malloc disk", 00:19:59.942 "block_size": 512, 00:19:59.942 "num_blocks": 65536, 00:19:59.942 "uuid": "d413264b-383f-420c-bdcb-cbbf20421cf5", 00:19:59.942 "assigned_rate_limits": { 00:19:59.942 "rw_ios_per_sec": 0, 00:19:59.942 "rw_mbytes_per_sec": 0, 00:19:59.942 "r_mbytes_per_sec": 0, 00:19:59.942 "w_mbytes_per_sec": 0 00:19:59.942 }, 00:19:59.942 "claimed": true, 00:19:59.942 "claim_type": "exclusive_write", 00:19:59.942 "zoned": false, 00:19:59.942 "supported_io_types": { 00:19:59.942 "read": true, 00:19:59.942 "write": true, 00:19:59.942 "unmap": true, 00:19:59.942 "write_zeroes": true, 00:19:59.942 "flush": true, 00:19:59.942 "reset": true, 00:19:59.942 "compare": false, 00:19:59.942 "compare_and_write": false, 00:19:59.942 "abort": true, 00:19:59.942 "nvme_admin": false, 00:19:59.942 "nvme_io": false 00:19:59.942 }, 00:19:59.942 "memory_domains": [ 00:19:59.942 { 00:19:59.942 "dma_device_id": "system", 00:19:59.942 "dma_device_type": 1 00:19:59.942 }, 00:19:59.942 { 00:19:59.942 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:19:59.942 "dma_device_type": 2 00:19:59.942 } 00:19:59.942 ], 00:19:59.942 "driver_specific": {} 00:19:59.942 } 00:19:59.943 ] 00:19:59.943 15:59:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:19:59.943 
15:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:19:59.943 15:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:19:59.943 15:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:19:59.943 15:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:19:59.943 15:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:19:59.943 15:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:19:59.943 15:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:19:59.943 15:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:19:59.943 15:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:19:59.943 15:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:19:59.943 15:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:19:59.943 15:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:00.201 15:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:00.201 "name": "Existed_Raid", 00:20:00.201 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:00.201 "strip_size_kb": 0, 00:20:00.201 "state": "configuring", 00:20:00.201 "raid_level": "raid1", 00:20:00.201 "superblock": false, 00:20:00.201 "num_base_bdevs": 4, 00:20:00.201 "num_base_bdevs_discovered": 3, 00:20:00.201 "num_base_bdevs_operational": 4, 00:20:00.201 "base_bdevs_list": [ 00:20:00.201 { 
00:20:00.201 "name": "BaseBdev1", 00:20:00.201 "uuid": "d413264b-383f-420c-bdcb-cbbf20421cf5", 00:20:00.201 "is_configured": true, 00:20:00.201 "data_offset": 0, 00:20:00.201 "data_size": 65536 00:20:00.201 }, 00:20:00.201 { 00:20:00.201 "name": null, 00:20:00.201 "uuid": "0da96fb9-4995-4527-8dba-76716c67daf3", 00:20:00.201 "is_configured": false, 00:20:00.201 "data_offset": 0, 00:20:00.201 "data_size": 65536 00:20:00.201 }, 00:20:00.201 { 00:20:00.201 "name": "BaseBdev3", 00:20:00.201 "uuid": "08e014ca-ca88-4f34-b226-e7c6516ba7a4", 00:20:00.201 "is_configured": true, 00:20:00.201 "data_offset": 0, 00:20:00.201 "data_size": 65536 00:20:00.201 }, 00:20:00.201 { 00:20:00.201 "name": "BaseBdev4", 00:20:00.201 "uuid": "fc545f3f-33ef-4fc6-a136-b623074c6ca8", 00:20:00.201 "is_configured": true, 00:20:00.201 "data_offset": 0, 00:20:00.201 "data_size": 65536 00:20:00.201 } 00:20:00.201 ] 00:20:00.201 }' 00:20:00.201 15:59:05 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:00.201 15:59:05 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:00.768 15:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:00.768 15:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:01.027 15:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:01.027 15:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:01.285 [2024-06-10 15:59:06.589786] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:01.285 15:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid 
configuring raid1 0 4 00:20:01.285 15:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:01.285 15:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:01.285 15:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:01.285 15:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:01.285 15:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:01.285 15:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:01.285 15:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:01.285 15:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:01.285 15:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:01.285 15:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:01.285 15:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:01.543 15:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:01.544 "name": "Existed_Raid", 00:20:01.544 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:01.544 "strip_size_kb": 0, 00:20:01.544 "state": "configuring", 00:20:01.544 "raid_level": "raid1", 00:20:01.544 "superblock": false, 00:20:01.544 "num_base_bdevs": 4, 00:20:01.544 "num_base_bdevs_discovered": 2, 00:20:01.544 "num_base_bdevs_operational": 4, 00:20:01.544 "base_bdevs_list": [ 00:20:01.544 { 00:20:01.544 "name": "BaseBdev1", 00:20:01.544 "uuid": "d413264b-383f-420c-bdcb-cbbf20421cf5", 00:20:01.544 
"is_configured": true, 00:20:01.544 "data_offset": 0, 00:20:01.544 "data_size": 65536 00:20:01.544 }, 00:20:01.544 { 00:20:01.544 "name": null, 00:20:01.544 "uuid": "0da96fb9-4995-4527-8dba-76716c67daf3", 00:20:01.544 "is_configured": false, 00:20:01.544 "data_offset": 0, 00:20:01.544 "data_size": 65536 00:20:01.544 }, 00:20:01.544 { 00:20:01.544 "name": null, 00:20:01.544 "uuid": "08e014ca-ca88-4f34-b226-e7c6516ba7a4", 00:20:01.544 "is_configured": false, 00:20:01.544 "data_offset": 0, 00:20:01.544 "data_size": 65536 00:20:01.544 }, 00:20:01.544 { 00:20:01.544 "name": "BaseBdev4", 00:20:01.544 "uuid": "fc545f3f-33ef-4fc6-a136-b623074c6ca8", 00:20:01.544 "is_configured": true, 00:20:01.544 "data_offset": 0, 00:20:01.544 "data_size": 65536 00:20:01.544 } 00:20:01.544 ] 00:20:01.544 }' 00:20:01.544 15:59:06 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:01.544 15:59:06 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:02.111 15:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.111 15:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:02.369 15:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:02.369 15:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:02.628 [2024-06-10 15:59:07.961471] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:02.628 15:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:02.628 15:59:07 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:02.628 15:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:02.628 15:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:02.628 15:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:02.628 15:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:02.628 15:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:02.628 15:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:02.628 15:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:02.628 15:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:02.628 15:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:02.628 15:59:07 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:02.887 15:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:02.887 "name": "Existed_Raid", 00:20:02.887 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:02.887 "strip_size_kb": 0, 00:20:02.887 "state": "configuring", 00:20:02.887 "raid_level": "raid1", 00:20:02.887 "superblock": false, 00:20:02.887 "num_base_bdevs": 4, 00:20:02.887 "num_base_bdevs_discovered": 3, 00:20:02.887 "num_base_bdevs_operational": 4, 00:20:02.887 "base_bdevs_list": [ 00:20:02.887 { 00:20:02.887 "name": "BaseBdev1", 00:20:02.887 "uuid": "d413264b-383f-420c-bdcb-cbbf20421cf5", 00:20:02.887 "is_configured": true, 00:20:02.887 "data_offset": 0, 00:20:02.887 "data_size": 65536 
00:20:02.887 }, 00:20:02.887 { 00:20:02.887 "name": null, 00:20:02.887 "uuid": "0da96fb9-4995-4527-8dba-76716c67daf3", 00:20:02.887 "is_configured": false, 00:20:02.887 "data_offset": 0, 00:20:02.887 "data_size": 65536 00:20:02.887 }, 00:20:02.887 { 00:20:02.887 "name": "BaseBdev3", 00:20:02.887 "uuid": "08e014ca-ca88-4f34-b226-e7c6516ba7a4", 00:20:02.887 "is_configured": true, 00:20:02.887 "data_offset": 0, 00:20:02.887 "data_size": 65536 00:20:02.887 }, 00:20:02.887 { 00:20:02.887 "name": "BaseBdev4", 00:20:02.887 "uuid": "fc545f3f-33ef-4fc6-a136-b623074c6ca8", 00:20:02.887 "is_configured": true, 00:20:02.887 "data_offset": 0, 00:20:02.887 "data_size": 65536 00:20:02.887 } 00:20:02.887 ] 00:20:02.887 }' 00:20:02.887 15:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:02.887 15:59:08 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:03.456 15:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.456 15:59:08 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:03.714 15:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:03.714 15:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:03.973 [2024-06-10 15:59:09.369271] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:03.973 15:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:03.973 15:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:03.973 15:59:09 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:03.973 15:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:03.973 15:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:03.973 15:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:03.973 15:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:03.973 15:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:03.973 15:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:03.973 15:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:03.973 15:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:03.973 15:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:04.232 15:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:04.232 "name": "Existed_Raid", 00:20:04.232 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:04.232 "strip_size_kb": 0, 00:20:04.232 "state": "configuring", 00:20:04.232 "raid_level": "raid1", 00:20:04.232 "superblock": false, 00:20:04.232 "num_base_bdevs": 4, 00:20:04.232 "num_base_bdevs_discovered": 2, 00:20:04.232 "num_base_bdevs_operational": 4, 00:20:04.232 "base_bdevs_list": [ 00:20:04.232 { 00:20:04.232 "name": null, 00:20:04.232 "uuid": "d413264b-383f-420c-bdcb-cbbf20421cf5", 00:20:04.232 "is_configured": false, 00:20:04.232 "data_offset": 0, 00:20:04.232 "data_size": 65536 00:20:04.232 }, 00:20:04.232 { 00:20:04.232 "name": null, 00:20:04.232 "uuid": "0da96fb9-4995-4527-8dba-76716c67daf3", 
00:20:04.232 "is_configured": false, 00:20:04.232 "data_offset": 0, 00:20:04.232 "data_size": 65536 00:20:04.232 }, 00:20:04.232 { 00:20:04.232 "name": "BaseBdev3", 00:20:04.232 "uuid": "08e014ca-ca88-4f34-b226-e7c6516ba7a4", 00:20:04.232 "is_configured": true, 00:20:04.232 "data_offset": 0, 00:20:04.232 "data_size": 65536 00:20:04.232 }, 00:20:04.232 { 00:20:04.232 "name": "BaseBdev4", 00:20:04.232 "uuid": "fc545f3f-33ef-4fc6-a136-b623074c6ca8", 00:20:04.232 "is_configured": true, 00:20:04.232 "data_offset": 0, 00:20:04.232 "data_size": 65536 00:20:04.232 } 00:20:04.232 ] 00:20:04.232 }' 00:20:04.232 15:59:09 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:04.232 15:59:09 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:04.799 15:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:04.799 15:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:05.058 15:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:05.058 15:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:05.317 [2024-06-10 15:59:10.767427] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:05.317 15:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:05.317 15:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:05.317 15:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:05.317 15:59:10 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:05.317 15:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:05.317 15:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:05.317 15:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:05.317 15:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:05.317 15:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:05.317 15:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:05.317 15:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:05.317 15:59:10 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:05.576 15:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:05.576 "name": "Existed_Raid", 00:20:05.576 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:05.576 "strip_size_kb": 0, 00:20:05.576 "state": "configuring", 00:20:05.576 "raid_level": "raid1", 00:20:05.576 "superblock": false, 00:20:05.576 "num_base_bdevs": 4, 00:20:05.576 "num_base_bdevs_discovered": 3, 00:20:05.576 "num_base_bdevs_operational": 4, 00:20:05.576 "base_bdevs_list": [ 00:20:05.576 { 00:20:05.576 "name": null, 00:20:05.576 "uuid": "d413264b-383f-420c-bdcb-cbbf20421cf5", 00:20:05.576 "is_configured": false, 00:20:05.576 "data_offset": 0, 00:20:05.576 "data_size": 65536 00:20:05.576 }, 00:20:05.576 { 00:20:05.576 "name": "BaseBdev2", 00:20:05.576 "uuid": "0da96fb9-4995-4527-8dba-76716c67daf3", 00:20:05.576 "is_configured": true, 00:20:05.576 "data_offset": 0, 00:20:05.576 
"data_size": 65536 00:20:05.576 }, 00:20:05.576 { 00:20:05.576 "name": "BaseBdev3", 00:20:05.576 "uuid": "08e014ca-ca88-4f34-b226-e7c6516ba7a4", 00:20:05.576 "is_configured": true, 00:20:05.576 "data_offset": 0, 00:20:05.576 "data_size": 65536 00:20:05.576 }, 00:20:05.576 { 00:20:05.576 "name": "BaseBdev4", 00:20:05.576 "uuid": "fc545f3f-33ef-4fc6-a136-b623074c6ca8", 00:20:05.576 "is_configured": true, 00:20:05.576 "data_offset": 0, 00:20:05.576 "data_size": 65536 00:20:05.576 } 00:20:05.576 ] 00:20:05.576 }' 00:20:05.576 15:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:05.576 15:59:11 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:06.514 15:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.514 15:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:06.514 15:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:06.514 15:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:06.514 15:59:11 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:06.774 15:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u d413264b-383f-420c-bdcb-cbbf20421cf5 00:20:07.033 [2024-06-10 15:59:12.431050] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:07.033 [2024-06-10 15:59:12.431089] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x2360960 
00:20:07.033 [2024-06-10 15:59:12.431096] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:20:07.033 [2024-06-10 15:59:12.431295] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x23656a0 00:20:07.033 [2024-06-10 15:59:12.431425] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x2360960 00:20:07.033 [2024-06-10 15:59:12.431433] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x2360960 00:20:07.033 [2024-06-10 15:59:12.431589] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:07.033 NewBaseBdev 00:20:07.033 15:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:07.033 15:59:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:20:07.033 15:59:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:20:07.033 15:59:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@900 -- # local i 00:20:07.033 15:59:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:20:07.033 15:59:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:20:07.033 15:59:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:07.292 15:59:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:07.583 [ 00:20:07.583 { 00:20:07.583 "name": "NewBaseBdev", 00:20:07.583 "aliases": [ 00:20:07.583 "d413264b-383f-420c-bdcb-cbbf20421cf5" 00:20:07.583 ], 00:20:07.583 "product_name": "Malloc disk", 00:20:07.583 "block_size": 512, 
00:20:07.583 "num_blocks": 65536, 00:20:07.583 "uuid": "d413264b-383f-420c-bdcb-cbbf20421cf5", 00:20:07.583 "assigned_rate_limits": { 00:20:07.583 "rw_ios_per_sec": 0, 00:20:07.583 "rw_mbytes_per_sec": 0, 00:20:07.583 "r_mbytes_per_sec": 0, 00:20:07.583 "w_mbytes_per_sec": 0 00:20:07.583 }, 00:20:07.583 "claimed": true, 00:20:07.583 "claim_type": "exclusive_write", 00:20:07.583 "zoned": false, 00:20:07.583 "supported_io_types": { 00:20:07.583 "read": true, 00:20:07.583 "write": true, 00:20:07.583 "unmap": true, 00:20:07.583 "write_zeroes": true, 00:20:07.583 "flush": true, 00:20:07.583 "reset": true, 00:20:07.583 "compare": false, 00:20:07.583 "compare_and_write": false, 00:20:07.583 "abort": true, 00:20:07.583 "nvme_admin": false, 00:20:07.583 "nvme_io": false 00:20:07.583 }, 00:20:07.583 "memory_domains": [ 00:20:07.583 { 00:20:07.583 "dma_device_id": "system", 00:20:07.583 "dma_device_type": 1 00:20:07.583 }, 00:20:07.583 { 00:20:07.583 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:07.583 "dma_device_type": 2 00:20:07.583 } 00:20:07.583 ], 00:20:07.583 "driver_specific": {} 00:20:07.583 } 00:20:07.583 ] 00:20:07.583 15:59:12 bdev_raid.raid_state_function_test -- common/autotest_common.sh@906 -- # return 0 00:20:07.583 15:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:20:07.583 15:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:07.583 15:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:07.583 15:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:07.583 15:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:07.583 15:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:07.583 15:59:12 
bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:07.583 15:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:07.583 15:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:07.583 15:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:07.583 15:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:07.583 15:59:12 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:07.843 15:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:07.843 "name": "Existed_Raid", 00:20:07.843 "uuid": "c6f5216c-9799-425d-9a0f-76243703d83f", 00:20:07.843 "strip_size_kb": 0, 00:20:07.843 "state": "online", 00:20:07.843 "raid_level": "raid1", 00:20:07.843 "superblock": false, 00:20:07.843 "num_base_bdevs": 4, 00:20:07.843 "num_base_bdevs_discovered": 4, 00:20:07.843 "num_base_bdevs_operational": 4, 00:20:07.843 "base_bdevs_list": [ 00:20:07.843 { 00:20:07.843 "name": "NewBaseBdev", 00:20:07.843 "uuid": "d413264b-383f-420c-bdcb-cbbf20421cf5", 00:20:07.843 "is_configured": true, 00:20:07.843 "data_offset": 0, 00:20:07.843 "data_size": 65536 00:20:07.843 }, 00:20:07.843 { 00:20:07.843 "name": "BaseBdev2", 00:20:07.843 "uuid": "0da96fb9-4995-4527-8dba-76716c67daf3", 00:20:07.843 "is_configured": true, 00:20:07.843 "data_offset": 0, 00:20:07.843 "data_size": 65536 00:20:07.843 }, 00:20:07.843 { 00:20:07.843 "name": "BaseBdev3", 00:20:07.843 "uuid": "08e014ca-ca88-4f34-b226-e7c6516ba7a4", 00:20:07.843 "is_configured": true, 00:20:07.843 "data_offset": 0, 00:20:07.843 "data_size": 65536 00:20:07.843 }, 00:20:07.843 { 00:20:07.843 "name": "BaseBdev4", 00:20:07.843 "uuid": 
"fc545f3f-33ef-4fc6-a136-b623074c6ca8", 00:20:07.843 "is_configured": true, 00:20:07.843 "data_offset": 0, 00:20:07.843 "data_size": 65536 00:20:07.843 } 00:20:07.843 ] 00:20:07.843 }' 00:20:07.843 15:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:07.843 15:59:13 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:08.410 15:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:08.411 15:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:08.411 15:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:08.411 15:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:08.411 15:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:08.411 15:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:08.411 15:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:08.411 15:59:13 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:08.669 [2024-06-10 15:59:14.083807] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:08.670 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:08.670 "name": "Existed_Raid", 00:20:08.670 "aliases": [ 00:20:08.670 "c6f5216c-9799-425d-9a0f-76243703d83f" 00:20:08.670 ], 00:20:08.670 "product_name": "Raid Volume", 00:20:08.670 "block_size": 512, 00:20:08.670 "num_blocks": 65536, 00:20:08.670 "uuid": "c6f5216c-9799-425d-9a0f-76243703d83f", 00:20:08.670 "assigned_rate_limits": { 00:20:08.670 "rw_ios_per_sec": 0, 00:20:08.670 "rw_mbytes_per_sec": 
0, 00:20:08.670 "r_mbytes_per_sec": 0, 00:20:08.670 "w_mbytes_per_sec": 0 00:20:08.670 }, 00:20:08.670 "claimed": false, 00:20:08.670 "zoned": false, 00:20:08.670 "supported_io_types": { 00:20:08.670 "read": true, 00:20:08.670 "write": true, 00:20:08.670 "unmap": false, 00:20:08.670 "write_zeroes": true, 00:20:08.670 "flush": false, 00:20:08.670 "reset": true, 00:20:08.670 "compare": false, 00:20:08.670 "compare_and_write": false, 00:20:08.670 "abort": false, 00:20:08.670 "nvme_admin": false, 00:20:08.670 "nvme_io": false 00:20:08.670 }, 00:20:08.670 "memory_domains": [ 00:20:08.670 { 00:20:08.670 "dma_device_id": "system", 00:20:08.670 "dma_device_type": 1 00:20:08.670 }, 00:20:08.670 { 00:20:08.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:08.670 "dma_device_type": 2 00:20:08.670 }, 00:20:08.670 { 00:20:08.670 "dma_device_id": "system", 00:20:08.670 "dma_device_type": 1 00:20:08.670 }, 00:20:08.670 { 00:20:08.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:08.670 "dma_device_type": 2 00:20:08.670 }, 00:20:08.670 { 00:20:08.670 "dma_device_id": "system", 00:20:08.670 "dma_device_type": 1 00:20:08.670 }, 00:20:08.670 { 00:20:08.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:08.670 "dma_device_type": 2 00:20:08.670 }, 00:20:08.670 { 00:20:08.670 "dma_device_id": "system", 00:20:08.670 "dma_device_type": 1 00:20:08.670 }, 00:20:08.670 { 00:20:08.670 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:08.670 "dma_device_type": 2 00:20:08.670 } 00:20:08.670 ], 00:20:08.670 "driver_specific": { 00:20:08.670 "raid": { 00:20:08.670 "uuid": "c6f5216c-9799-425d-9a0f-76243703d83f", 00:20:08.670 "strip_size_kb": 0, 00:20:08.670 "state": "online", 00:20:08.670 "raid_level": "raid1", 00:20:08.670 "superblock": false, 00:20:08.670 "num_base_bdevs": 4, 00:20:08.670 "num_base_bdevs_discovered": 4, 00:20:08.670 "num_base_bdevs_operational": 4, 00:20:08.670 "base_bdevs_list": [ 00:20:08.670 { 00:20:08.670 "name": "NewBaseBdev", 00:20:08.670 "uuid": 
"d413264b-383f-420c-bdcb-cbbf20421cf5", 00:20:08.670 "is_configured": true, 00:20:08.670 "data_offset": 0, 00:20:08.670 "data_size": 65536 00:20:08.670 }, 00:20:08.670 { 00:20:08.670 "name": "BaseBdev2", 00:20:08.670 "uuid": "0da96fb9-4995-4527-8dba-76716c67daf3", 00:20:08.670 "is_configured": true, 00:20:08.670 "data_offset": 0, 00:20:08.670 "data_size": 65536 00:20:08.670 }, 00:20:08.670 { 00:20:08.670 "name": "BaseBdev3", 00:20:08.670 "uuid": "08e014ca-ca88-4f34-b226-e7c6516ba7a4", 00:20:08.670 "is_configured": true, 00:20:08.670 "data_offset": 0, 00:20:08.670 "data_size": 65536 00:20:08.670 }, 00:20:08.670 { 00:20:08.670 "name": "BaseBdev4", 00:20:08.670 "uuid": "fc545f3f-33ef-4fc6-a136-b623074c6ca8", 00:20:08.670 "is_configured": true, 00:20:08.670 "data_offset": 0, 00:20:08.670 "data_size": 65536 00:20:08.670 } 00:20:08.670 ] 00:20:08.670 } 00:20:08.670 } 00:20:08.670 }' 00:20:08.670 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:08.670 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:08.670 BaseBdev2 00:20:08.670 BaseBdev3 00:20:08.670 BaseBdev4' 00:20:08.670 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:08.670 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:08.670 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:08.929 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:08.929 "name": "NewBaseBdev", 00:20:08.929 "aliases": [ 00:20:08.929 "d413264b-383f-420c-bdcb-cbbf20421cf5" 00:20:08.929 ], 00:20:08.929 "product_name": "Malloc disk", 00:20:08.929 "block_size": 512, 00:20:08.929 
"num_blocks": 65536, 00:20:08.929 "uuid": "d413264b-383f-420c-bdcb-cbbf20421cf5", 00:20:08.929 "assigned_rate_limits": { 00:20:08.929 "rw_ios_per_sec": 0, 00:20:08.929 "rw_mbytes_per_sec": 0, 00:20:08.929 "r_mbytes_per_sec": 0, 00:20:08.929 "w_mbytes_per_sec": 0 00:20:08.929 }, 00:20:08.929 "claimed": true, 00:20:08.929 "claim_type": "exclusive_write", 00:20:08.929 "zoned": false, 00:20:08.929 "supported_io_types": { 00:20:08.929 "read": true, 00:20:08.929 "write": true, 00:20:08.929 "unmap": true, 00:20:08.929 "write_zeroes": true, 00:20:08.929 "flush": true, 00:20:08.929 "reset": true, 00:20:08.929 "compare": false, 00:20:08.929 "compare_and_write": false, 00:20:08.929 "abort": true, 00:20:08.929 "nvme_admin": false, 00:20:08.929 "nvme_io": false 00:20:08.929 }, 00:20:08.929 "memory_domains": [ 00:20:08.929 { 00:20:08.929 "dma_device_id": "system", 00:20:08.929 "dma_device_type": 1 00:20:08.929 }, 00:20:08.929 { 00:20:08.929 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:08.929 "dma_device_type": 2 00:20:08.929 } 00:20:08.929 ], 00:20:08.929 "driver_specific": {} 00:20:08.929 }' 00:20:08.929 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:09.188 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:09.188 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:09.188 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:09.188 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:09.188 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:09.188 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:09.188 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:09.447 15:59:14 bdev_raid.raid_state_function_test -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:09.447 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:09.447 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:09.447 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:09.447 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:09.447 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:09.447 15:59:14 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:09.706 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:09.706 "name": "BaseBdev2", 00:20:09.706 "aliases": [ 00:20:09.706 "0da96fb9-4995-4527-8dba-76716c67daf3" 00:20:09.706 ], 00:20:09.706 "product_name": "Malloc disk", 00:20:09.706 "block_size": 512, 00:20:09.706 "num_blocks": 65536, 00:20:09.706 "uuid": "0da96fb9-4995-4527-8dba-76716c67daf3", 00:20:09.706 "assigned_rate_limits": { 00:20:09.706 "rw_ios_per_sec": 0, 00:20:09.706 "rw_mbytes_per_sec": 0, 00:20:09.706 "r_mbytes_per_sec": 0, 00:20:09.706 "w_mbytes_per_sec": 0 00:20:09.706 }, 00:20:09.706 "claimed": true, 00:20:09.706 "claim_type": "exclusive_write", 00:20:09.706 "zoned": false, 00:20:09.706 "supported_io_types": { 00:20:09.706 "read": true, 00:20:09.706 "write": true, 00:20:09.706 "unmap": true, 00:20:09.706 "write_zeroes": true, 00:20:09.706 "flush": true, 00:20:09.706 "reset": true, 00:20:09.706 "compare": false, 00:20:09.706 "compare_and_write": false, 00:20:09.706 "abort": true, 00:20:09.706 "nvme_admin": false, 00:20:09.706 "nvme_io": false 00:20:09.706 }, 00:20:09.706 "memory_domains": [ 00:20:09.706 { 00:20:09.706 "dma_device_id": "system", 00:20:09.706 "dma_device_type": 1 
00:20:09.706 }, 00:20:09.706 { 00:20:09.706 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:09.706 "dma_device_type": 2 00:20:09.706 } 00:20:09.706 ], 00:20:09.706 "driver_specific": {} 00:20:09.706 }' 00:20:09.706 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:09.706 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:09.706 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:09.706 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:09.706 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:09.965 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:09.965 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:09.965 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:09.965 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:09.965 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:09.965 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:09.965 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:09.965 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:09.965 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:09.965 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:10.223 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:10.223 "name": "BaseBdev3", 
00:20:10.223 "aliases": [ 00:20:10.223 "08e014ca-ca88-4f34-b226-e7c6516ba7a4" 00:20:10.223 ], 00:20:10.223 "product_name": "Malloc disk", 00:20:10.223 "block_size": 512, 00:20:10.223 "num_blocks": 65536, 00:20:10.223 "uuid": "08e014ca-ca88-4f34-b226-e7c6516ba7a4", 00:20:10.223 "assigned_rate_limits": { 00:20:10.223 "rw_ios_per_sec": 0, 00:20:10.223 "rw_mbytes_per_sec": 0, 00:20:10.223 "r_mbytes_per_sec": 0, 00:20:10.223 "w_mbytes_per_sec": 0 00:20:10.223 }, 00:20:10.223 "claimed": true, 00:20:10.223 "claim_type": "exclusive_write", 00:20:10.223 "zoned": false, 00:20:10.223 "supported_io_types": { 00:20:10.223 "read": true, 00:20:10.223 "write": true, 00:20:10.223 "unmap": true, 00:20:10.223 "write_zeroes": true, 00:20:10.223 "flush": true, 00:20:10.223 "reset": true, 00:20:10.223 "compare": false, 00:20:10.223 "compare_and_write": false, 00:20:10.223 "abort": true, 00:20:10.223 "nvme_admin": false, 00:20:10.223 "nvme_io": false 00:20:10.223 }, 00:20:10.223 "memory_domains": [ 00:20:10.223 { 00:20:10.223 "dma_device_id": "system", 00:20:10.223 "dma_device_type": 1 00:20:10.223 }, 00:20:10.223 { 00:20:10.223 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:10.223 "dma_device_type": 2 00:20:10.223 } 00:20:10.223 ], 00:20:10.223 "driver_specific": {} 00:20:10.223 }' 00:20:10.223 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:10.223 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:10.482 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:10.482 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:10.482 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:10.482 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:10.482 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:20:10.482 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:10.482 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:10.482 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:10.482 15:59:15 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:10.740 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:10.740 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:10.740 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:10.740 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:10.999 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:10.999 "name": "BaseBdev4", 00:20:10.999 "aliases": [ 00:20:10.999 "fc545f3f-33ef-4fc6-a136-b623074c6ca8" 00:20:10.999 ], 00:20:10.999 "product_name": "Malloc disk", 00:20:10.999 "block_size": 512, 00:20:10.999 "num_blocks": 65536, 00:20:10.999 "uuid": "fc545f3f-33ef-4fc6-a136-b623074c6ca8", 00:20:10.999 "assigned_rate_limits": { 00:20:10.999 "rw_ios_per_sec": 0, 00:20:10.999 "rw_mbytes_per_sec": 0, 00:20:10.999 "r_mbytes_per_sec": 0, 00:20:10.999 "w_mbytes_per_sec": 0 00:20:10.999 }, 00:20:10.999 "claimed": true, 00:20:10.999 "claim_type": "exclusive_write", 00:20:10.999 "zoned": false, 00:20:10.999 "supported_io_types": { 00:20:10.999 "read": true, 00:20:10.999 "write": true, 00:20:10.999 "unmap": true, 00:20:10.999 "write_zeroes": true, 00:20:10.999 "flush": true, 00:20:10.999 "reset": true, 00:20:10.999 "compare": false, 00:20:10.999 "compare_and_write": false, 00:20:10.999 "abort": true, 00:20:10.999 "nvme_admin": false, 00:20:10.999 
"nvme_io": false 00:20:10.999 }, 00:20:10.999 "memory_domains": [ 00:20:10.999 { 00:20:10.999 "dma_device_id": "system", 00:20:10.999 "dma_device_type": 1 00:20:10.999 }, 00:20:10.999 { 00:20:10.999 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:10.999 "dma_device_type": 2 00:20:10.999 } 00:20:10.999 ], 00:20:10.999 "driver_specific": {} 00:20:10.999 }' 00:20:10.999 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:10.999 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:10.999 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:10.999 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:10.999 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:10.999 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:10.999 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:11.258 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:11.258 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:11.258 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:11.258 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:11.258 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:11.258 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:11.517 [2024-06-10 15:59:16.891028] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:11.517 [2024-06-10 15:59:16.891053] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:20:11.517 [2024-06-10 15:59:16.891105] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:11.517 [2024-06-10 15:59:16.891384] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:11.517 [2024-06-10 15:59:16.891393] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x2360960 name Existed_Raid, state offline 00:20:11.517 15:59:16 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@341 -- # killprocess 2742713 00:20:11.517 15:59:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@949 -- # '[' -z 2742713 ']' 00:20:11.517 15:59:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@953 -- # kill -0 2742713 00:20:11.517 15:59:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # uname 00:20:11.518 15:59:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:20:11.518 15:59:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2742713 00:20:11.518 15:59:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:20:11.518 15:59:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:20:11.518 15:59:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2742713' 00:20:11.518 killing process with pid 2742713 00:20:11.518 15:59:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@968 -- # kill 2742713 00:20:11.518 [2024-06-10 15:59:16.957602] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:11.518 15:59:16 bdev_raid.raid_state_function_test -- common/autotest_common.sh@973 -- # wait 2742713 00:20:11.518 [2024-06-10 15:59:16.991559] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 
00:20:11.777 15:59:17 bdev_raid.raid_state_function_test -- bdev/bdev_raid.sh@343 -- # return 0 00:20:11.777 00:20:11.777 real 0m33.721s 00:20:11.777 user 1m3.224s 00:20:11.777 sys 0m4.770s 00:20:11.777 15:59:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:20:11.777 15:59:17 bdev_raid.raid_state_function_test -- common/autotest_common.sh@10 -- # set +x 00:20:11.777 ************************************ 00:20:11.777 END TEST raid_state_function_test 00:20:11.777 ************************************ 00:20:11.777 15:59:17 bdev_raid -- bdev/bdev_raid.sh@868 -- # run_test raid_state_function_test_sb raid_state_function_test raid1 4 true 00:20:11.777 15:59:17 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:20:11.777 15:59:17 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:20:11.777 15:59:17 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:11.777 ************************************ 00:20:11.777 START TEST raid_state_function_test_sb 00:20:11.777 ************************************ 00:20:11.777 15:59:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 4 true 00:20:11.777 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:20:11.777 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=4 00:20:11.777 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:20:11.777 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:20:11.777 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:20:11.777 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:11.777 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 
00:20:11.777 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:11.777 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:11.777 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:20:11.777 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:11.777 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:11.777 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev3 00:20:11.777 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:11.777 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # echo BaseBdev4 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@226 -- # local strip_size 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@230 -- # '[' 
raid1 '!=' raid1 ']' 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@244 -- # raid_pid=2748934 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2748934' 00:20:11.778 Process raid pid: 2748934 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@246 -- # waitforlisten 2748934 /var/tmp/spdk-raid.sock 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@830 -- # '[' -z 2748934 ']' 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:11.778 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:20:11.778 15:59:17 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:12.036 [2024-06-10 15:59:17.329157] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:20:12.036 [2024-06-10 15:59:17.329210] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:20:12.036 [2024-06-10 15:59:17.427158] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:12.036 [2024-06-10 15:59:17.521455] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:20:12.295 [2024-06-10 15:59:17.580292] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:12.295 [2024-06-10 15:59:17.580322] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:12.862 15:59:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:20:12.862 15:59:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@863 -- # return 0 00:20:12.862 15:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:13.122 [2024-06-10 15:59:18.519259] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:13.122 [2024-06-10 15:59:18.519295] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:13.122 [2024-06-10 15:59:18.519304] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:13.122 [2024-06-10 15:59:18.519313] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:13.122 [2024-06-10 15:59:18.519320] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:13.122 [2024-06-10 15:59:18.519328] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:13.122 [2024-06-10 
15:59:18.519335] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:13.122 [2024-06-10 15:59:18.519342] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:13.122 15:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:13.122 15:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:13.122 15:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:13.122 15:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:13.122 15:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:13.122 15:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:13.122 15:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:13.122 15:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:13.122 15:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:13.122 15:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:13.122 15:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:13.122 15:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:13.381 15:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:13.381 "name": "Existed_Raid", 00:20:13.381 "uuid": "af404af2-c973-42b7-bf19-7029b7cd4bf8", 00:20:13.381 "strip_size_kb": 
0, 00:20:13.381 "state": "configuring", 00:20:13.381 "raid_level": "raid1", 00:20:13.381 "superblock": true, 00:20:13.381 "num_base_bdevs": 4, 00:20:13.381 "num_base_bdevs_discovered": 0, 00:20:13.381 "num_base_bdevs_operational": 4, 00:20:13.381 "base_bdevs_list": [ 00:20:13.381 { 00:20:13.381 "name": "BaseBdev1", 00:20:13.381 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.381 "is_configured": false, 00:20:13.381 "data_offset": 0, 00:20:13.381 "data_size": 0 00:20:13.381 }, 00:20:13.381 { 00:20:13.381 "name": "BaseBdev2", 00:20:13.381 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.381 "is_configured": false, 00:20:13.381 "data_offset": 0, 00:20:13.381 "data_size": 0 00:20:13.381 }, 00:20:13.381 { 00:20:13.381 "name": "BaseBdev3", 00:20:13.381 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.381 "is_configured": false, 00:20:13.381 "data_offset": 0, 00:20:13.381 "data_size": 0 00:20:13.381 }, 00:20:13.381 { 00:20:13.381 "name": "BaseBdev4", 00:20:13.381 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:13.381 "is_configured": false, 00:20:13.381 "data_offset": 0, 00:20:13.381 "data_size": 0 00:20:13.381 } 00:20:13.381 ] 00:20:13.381 }' 00:20:13.381 15:59:18 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:13.381 15:59:18 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:13.977 15:59:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:14.236 [2024-06-10 15:59:19.622067] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:14.236 [2024-06-10 15:59:19.622096] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8e1140 name Existed_Raid, state configuring 00:20:14.236 15:59:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@256 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:14.495 [2024-06-10 15:59:19.878778] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:14.495 [2024-06-10 15:59:19.878809] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:14.496 [2024-06-10 15:59:19.878817] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:14.496 [2024-06-10 15:59:19.878826] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:14.496 [2024-06-10 15:59:19.878832] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:14.496 [2024-06-10 15:59:19.878840] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:14.496 [2024-06-10 15:59:19.878847] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:14.496 [2024-06-10 15:59:19.878855] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:14.496 15:59:19 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:14.755 [2024-06-10 15:59:20.153053] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:14.755 BaseBdev1 00:20:14.755 15:59:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:20:14.755 15:59:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:20:14.755 15:59:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:20:14.755 15:59:20 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:20:14.755 15:59:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:20:14.755 15:59:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:20:14.755 15:59:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:15.013 15:59:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:15.272 [ 00:20:15.272 { 00:20:15.272 "name": "BaseBdev1", 00:20:15.272 "aliases": [ 00:20:15.272 "1aa1873b-22e9-4a94-90ee-334cbcf703e2" 00:20:15.272 ], 00:20:15.272 "product_name": "Malloc disk", 00:20:15.272 "block_size": 512, 00:20:15.272 "num_blocks": 65536, 00:20:15.272 "uuid": "1aa1873b-22e9-4a94-90ee-334cbcf703e2", 00:20:15.272 "assigned_rate_limits": { 00:20:15.272 "rw_ios_per_sec": 0, 00:20:15.272 "rw_mbytes_per_sec": 0, 00:20:15.272 "r_mbytes_per_sec": 0, 00:20:15.272 "w_mbytes_per_sec": 0 00:20:15.272 }, 00:20:15.272 "claimed": true, 00:20:15.272 "claim_type": "exclusive_write", 00:20:15.272 "zoned": false, 00:20:15.272 "supported_io_types": { 00:20:15.272 "read": true, 00:20:15.272 "write": true, 00:20:15.272 "unmap": true, 00:20:15.272 "write_zeroes": true, 00:20:15.272 "flush": true, 00:20:15.272 "reset": true, 00:20:15.272 "compare": false, 00:20:15.272 "compare_and_write": false, 00:20:15.272 "abort": true, 00:20:15.272 "nvme_admin": false, 00:20:15.272 "nvme_io": false 00:20:15.272 }, 00:20:15.272 "memory_domains": [ 00:20:15.272 { 00:20:15.272 "dma_device_id": "system", 00:20:15.272 "dma_device_type": 1 00:20:15.272 }, 00:20:15.272 { 00:20:15.272 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:15.272 
"dma_device_type": 2 00:20:15.272 } 00:20:15.272 ], 00:20:15.273 "driver_specific": {} 00:20:15.273 } 00:20:15.273 ] 00:20:15.273 15:59:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:20:15.273 15:59:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:15.273 15:59:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:15.273 15:59:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:15.273 15:59:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:15.273 15:59:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:15.273 15:59:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:15.273 15:59:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:15.273 15:59:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:15.273 15:59:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:15.273 15:59:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:15.273 15:59:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:15.273 15:59:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:15.532 15:59:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:15.532 "name": "Existed_Raid", 00:20:15.532 "uuid": "b241e640-3d3d-4abc-a5b0-08637e59c1bb", 00:20:15.532 "strip_size_kb": 0, 
00:20:15.532 "state": "configuring", 00:20:15.532 "raid_level": "raid1", 00:20:15.532 "superblock": true, 00:20:15.532 "num_base_bdevs": 4, 00:20:15.532 "num_base_bdevs_discovered": 1, 00:20:15.532 "num_base_bdevs_operational": 4, 00:20:15.532 "base_bdevs_list": [ 00:20:15.532 { 00:20:15.532 "name": "BaseBdev1", 00:20:15.532 "uuid": "1aa1873b-22e9-4a94-90ee-334cbcf703e2", 00:20:15.532 "is_configured": true, 00:20:15.532 "data_offset": 2048, 00:20:15.532 "data_size": 63488 00:20:15.532 }, 00:20:15.532 { 00:20:15.532 "name": "BaseBdev2", 00:20:15.532 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:15.532 "is_configured": false, 00:20:15.532 "data_offset": 0, 00:20:15.532 "data_size": 0 00:20:15.532 }, 00:20:15.532 { 00:20:15.532 "name": "BaseBdev3", 00:20:15.532 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:15.532 "is_configured": false, 00:20:15.532 "data_offset": 0, 00:20:15.532 "data_size": 0 00:20:15.532 }, 00:20:15.532 { 00:20:15.532 "name": "BaseBdev4", 00:20:15.532 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:15.532 "is_configured": false, 00:20:15.532 "data_offset": 0, 00:20:15.532 "data_size": 0 00:20:15.532 } 00:20:15.532 ] 00:20:15.532 }' 00:20:15.532 15:59:20 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:15.532 15:59:20 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:16.099 15:59:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:16.358 [2024-06-10 15:59:21.805475] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:16.358 [2024-06-10 15:59:21.805515] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8e09b0 name Existed_Raid, state configuring 00:20:16.358 15:59:21 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@264 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:16.617 [2024-06-10 15:59:22.058187] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:16.617 [2024-06-10 15:59:22.059717] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:20:16.617 [2024-06-10 15:59:22.059748] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:20:16.617 [2024-06-10 15:59:22.059756] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev3 00:20:16.617 [2024-06-10 15:59:22.059765] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev3 doesn't exist now 00:20:16.617 [2024-06-10 15:59:22.059776] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev4 00:20:16.617 [2024-06-10 15:59:22.059785] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev4 doesn't exist now 00:20:16.617 15:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:20:16.617 15:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:16.617 15:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:16.617 15:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:16.617 15:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:16.617 15:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:16.617 15:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:16.617 15:59:22 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:16.617 15:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:16.617 15:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:16.618 15:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:16.618 15:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:16.618 15:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:16.618 15:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:16.876 15:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:16.876 "name": "Existed_Raid", 00:20:16.876 "uuid": "88f9dd78-ee66-40ba-ae50-fba051f340b5", 00:20:16.876 "strip_size_kb": 0, 00:20:16.876 "state": "configuring", 00:20:16.876 "raid_level": "raid1", 00:20:16.876 "superblock": true, 00:20:16.876 "num_base_bdevs": 4, 00:20:16.876 "num_base_bdevs_discovered": 1, 00:20:16.876 "num_base_bdevs_operational": 4, 00:20:16.876 "base_bdevs_list": [ 00:20:16.876 { 00:20:16.876 "name": "BaseBdev1", 00:20:16.876 "uuid": "1aa1873b-22e9-4a94-90ee-334cbcf703e2", 00:20:16.876 "is_configured": true, 00:20:16.876 "data_offset": 2048, 00:20:16.876 "data_size": 63488 00:20:16.876 }, 00:20:16.876 { 00:20:16.876 "name": "BaseBdev2", 00:20:16.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:16.876 "is_configured": false, 00:20:16.876 "data_offset": 0, 00:20:16.876 "data_size": 0 00:20:16.876 }, 00:20:16.876 { 00:20:16.876 "name": "BaseBdev3", 00:20:16.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:16.876 "is_configured": false, 00:20:16.876 "data_offset": 0, 00:20:16.876 
"data_size": 0 00:20:16.876 }, 00:20:16.876 { 00:20:16.876 "name": "BaseBdev4", 00:20:16.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:16.876 "is_configured": false, 00:20:16.876 "data_offset": 0, 00:20:16.876 "data_size": 0 00:20:16.876 } 00:20:16.876 ] 00:20:16.876 }' 00:20:16.876 15:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:16.876 15:59:22 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:17.811 15:59:22 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:17.811 [2024-06-10 15:59:23.204429] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:17.811 BaseBdev2 00:20:17.811 15:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:20:17.811 15:59:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:20:17.811 15:59:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:20:17.811 15:59:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:20:17.811 15:59:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:20:17.811 15:59:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:20:17.811 15:59:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:18.071 15:59:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:18.330 [ 
00:20:18.330 { 00:20:18.330 "name": "BaseBdev2", 00:20:18.330 "aliases": [ 00:20:18.330 "92fe35a0-ee37-4d52-9450-8daef206c33b" 00:20:18.330 ], 00:20:18.330 "product_name": "Malloc disk", 00:20:18.330 "block_size": 512, 00:20:18.330 "num_blocks": 65536, 00:20:18.330 "uuid": "92fe35a0-ee37-4d52-9450-8daef206c33b", 00:20:18.330 "assigned_rate_limits": { 00:20:18.330 "rw_ios_per_sec": 0, 00:20:18.330 "rw_mbytes_per_sec": 0, 00:20:18.330 "r_mbytes_per_sec": 0, 00:20:18.330 "w_mbytes_per_sec": 0 00:20:18.330 }, 00:20:18.330 "claimed": true, 00:20:18.330 "claim_type": "exclusive_write", 00:20:18.330 "zoned": false, 00:20:18.330 "supported_io_types": { 00:20:18.330 "read": true, 00:20:18.330 "write": true, 00:20:18.330 "unmap": true, 00:20:18.330 "write_zeroes": true, 00:20:18.330 "flush": true, 00:20:18.330 "reset": true, 00:20:18.330 "compare": false, 00:20:18.330 "compare_and_write": false, 00:20:18.330 "abort": true, 00:20:18.330 "nvme_admin": false, 00:20:18.330 "nvme_io": false 00:20:18.330 }, 00:20:18.330 "memory_domains": [ 00:20:18.330 { 00:20:18.330 "dma_device_id": "system", 00:20:18.330 "dma_device_type": 1 00:20:18.330 }, 00:20:18.330 { 00:20:18.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:18.330 "dma_device_type": 2 00:20:18.330 } 00:20:18.330 ], 00:20:18.330 "driver_specific": {} 00:20:18.330 } 00:20:18.330 ] 00:20:18.330 15:59:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:20:18.330 15:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:18.330 15:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:18.330 15:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:18.330 15:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:18.330 15:59:23 bdev_raid.raid_state_function_test_sb 
-- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:18.330 15:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:18.330 15:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:18.330 15:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:18.330 15:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:18.330 15:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:18.330 15:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:18.330 15:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:18.330 15:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:18.330 15:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:18.589 15:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:18.589 "name": "Existed_Raid", 00:20:18.589 "uuid": "88f9dd78-ee66-40ba-ae50-fba051f340b5", 00:20:18.589 "strip_size_kb": 0, 00:20:18.589 "state": "configuring", 00:20:18.589 "raid_level": "raid1", 00:20:18.589 "superblock": true, 00:20:18.589 "num_base_bdevs": 4, 00:20:18.589 "num_base_bdevs_discovered": 2, 00:20:18.589 "num_base_bdevs_operational": 4, 00:20:18.589 "base_bdevs_list": [ 00:20:18.589 { 00:20:18.589 "name": "BaseBdev1", 00:20:18.589 "uuid": "1aa1873b-22e9-4a94-90ee-334cbcf703e2", 00:20:18.589 "is_configured": true, 00:20:18.589 "data_offset": 2048, 00:20:18.589 "data_size": 63488 00:20:18.589 }, 00:20:18.589 { 00:20:18.589 "name": "BaseBdev2", 00:20:18.589 "uuid": 
"92fe35a0-ee37-4d52-9450-8daef206c33b", 00:20:18.589 "is_configured": true, 00:20:18.589 "data_offset": 2048, 00:20:18.589 "data_size": 63488 00:20:18.589 }, 00:20:18.589 { 00:20:18.589 "name": "BaseBdev3", 00:20:18.589 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:18.589 "is_configured": false, 00:20:18.589 "data_offset": 0, 00:20:18.589 "data_size": 0 00:20:18.589 }, 00:20:18.589 { 00:20:18.589 "name": "BaseBdev4", 00:20:18.589 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:18.589 "is_configured": false, 00:20:18.589 "data_offset": 0, 00:20:18.589 "data_size": 0 00:20:18.589 } 00:20:18.589 ] 00:20:18.589 }' 00:20:18.589 15:59:23 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:18.589 15:59:23 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:19.157 15:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:19.417 [2024-06-10 15:59:24.868263] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:19.417 BaseBdev3 00:20:19.417 15:59:24 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev3 00:20:19.417 15:59:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:20:19.417 15:59:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:20:19.417 15:59:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:20:19.417 15:59:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:20:19.417 15:59:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:20:19.417 15:59:24 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:19.676 15:59:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:19.935 [ 00:20:19.935 { 00:20:19.935 "name": "BaseBdev3", 00:20:19.935 "aliases": [ 00:20:19.935 "3a8aa135-a09d-4c26-9fa3-d77ee5243a6b" 00:20:19.935 ], 00:20:19.935 "product_name": "Malloc disk", 00:20:19.935 "block_size": 512, 00:20:19.935 "num_blocks": 65536, 00:20:19.935 "uuid": "3a8aa135-a09d-4c26-9fa3-d77ee5243a6b", 00:20:19.935 "assigned_rate_limits": { 00:20:19.935 "rw_ios_per_sec": 0, 00:20:19.935 "rw_mbytes_per_sec": 0, 00:20:19.935 "r_mbytes_per_sec": 0, 00:20:19.935 "w_mbytes_per_sec": 0 00:20:19.935 }, 00:20:19.935 "claimed": true, 00:20:19.935 "claim_type": "exclusive_write", 00:20:19.935 "zoned": false, 00:20:19.935 "supported_io_types": { 00:20:19.935 "read": true, 00:20:19.935 "write": true, 00:20:19.935 "unmap": true, 00:20:19.935 "write_zeroes": true, 00:20:19.935 "flush": true, 00:20:19.935 "reset": true, 00:20:19.935 "compare": false, 00:20:19.935 "compare_and_write": false, 00:20:19.935 "abort": true, 00:20:19.935 "nvme_admin": false, 00:20:19.935 "nvme_io": false 00:20:19.935 }, 00:20:19.935 "memory_domains": [ 00:20:19.935 { 00:20:19.935 "dma_device_id": "system", 00:20:19.935 "dma_device_type": 1 00:20:19.935 }, 00:20:19.935 { 00:20:19.935 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:19.935 "dma_device_type": 2 00:20:19.935 } 00:20:19.935 ], 00:20:19.935 "driver_specific": {} 00:20:19.935 } 00:20:19.935 ] 00:20:19.935 15:59:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:20:19.935 15:59:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:19.935 15:59:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < 
num_base_bdevs )) 00:20:19.935 15:59:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:19.935 15:59:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:19.935 15:59:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:19.935 15:59:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:19.935 15:59:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:19.935 15:59:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:19.935 15:59:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:19.935 15:59:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:19.935 15:59:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:19.935 15:59:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:19.935 15:59:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:19.935 15:59:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:20.194 15:59:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:20.194 "name": "Existed_Raid", 00:20:20.194 "uuid": "88f9dd78-ee66-40ba-ae50-fba051f340b5", 00:20:20.194 "strip_size_kb": 0, 00:20:20.194 "state": "configuring", 00:20:20.194 "raid_level": "raid1", 00:20:20.194 "superblock": true, 00:20:20.194 "num_base_bdevs": 4, 00:20:20.194 "num_base_bdevs_discovered": 3, 00:20:20.194 
"num_base_bdevs_operational": 4, 00:20:20.194 "base_bdevs_list": [ 00:20:20.194 { 00:20:20.194 "name": "BaseBdev1", 00:20:20.194 "uuid": "1aa1873b-22e9-4a94-90ee-334cbcf703e2", 00:20:20.194 "is_configured": true, 00:20:20.194 "data_offset": 2048, 00:20:20.194 "data_size": 63488 00:20:20.194 }, 00:20:20.194 { 00:20:20.194 "name": "BaseBdev2", 00:20:20.194 "uuid": "92fe35a0-ee37-4d52-9450-8daef206c33b", 00:20:20.194 "is_configured": true, 00:20:20.195 "data_offset": 2048, 00:20:20.195 "data_size": 63488 00:20:20.195 }, 00:20:20.195 { 00:20:20.195 "name": "BaseBdev3", 00:20:20.195 "uuid": "3a8aa135-a09d-4c26-9fa3-d77ee5243a6b", 00:20:20.195 "is_configured": true, 00:20:20.195 "data_offset": 2048, 00:20:20.195 "data_size": 63488 00:20:20.195 }, 00:20:20.195 { 00:20:20.195 "name": "BaseBdev4", 00:20:20.195 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:20.195 "is_configured": false, 00:20:20.195 "data_offset": 0, 00:20:20.195 "data_size": 0 00:20:20.195 } 00:20:20.195 ] 00:20:20.195 }' 00:20:20.195 15:59:25 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:20.195 15:59:25 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:21.131 15:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:21.132 [2024-06-10 15:59:26.528062] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:21.132 [2024-06-10 15:59:26.528240] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x8e1ac0 00:20:21.132 [2024-06-10 15:59:26.528254] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:21.132 [2024-06-10 15:59:26.528442] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa86ba0 00:20:21.132 [2024-06-10 15:59:26.528579] 
bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x8e1ac0 00:20:21.132 [2024-06-10 15:59:26.528588] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x8e1ac0 00:20:21.132 [2024-06-10 15:59:26.528683] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:21.132 BaseBdev4 00:20:21.132 15:59:26 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev4 00:20:21.132 15:59:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:20:21.132 15:59:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:20:21.132 15:59:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:20:21.132 15:59:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:20:21.132 15:59:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:20:21.132 15:59:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:21.390 15:59:26 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:21.649 [ 00:20:21.649 { 00:20:21.649 "name": "BaseBdev4", 00:20:21.649 "aliases": [ 00:20:21.649 "4e91245b-169c-48bd-9ad9-0dbd49945299" 00:20:21.649 ], 00:20:21.649 "product_name": "Malloc disk", 00:20:21.649 "block_size": 512, 00:20:21.649 "num_blocks": 65536, 00:20:21.649 "uuid": "4e91245b-169c-48bd-9ad9-0dbd49945299", 00:20:21.649 "assigned_rate_limits": { 00:20:21.649 "rw_ios_per_sec": 0, 00:20:21.649 "rw_mbytes_per_sec": 0, 00:20:21.649 "r_mbytes_per_sec": 0, 00:20:21.649 "w_mbytes_per_sec": 
0 00:20:21.649 }, 00:20:21.649 "claimed": true, 00:20:21.649 "claim_type": "exclusive_write", 00:20:21.649 "zoned": false, 00:20:21.649 "supported_io_types": { 00:20:21.649 "read": true, 00:20:21.649 "write": true, 00:20:21.649 "unmap": true, 00:20:21.649 "write_zeroes": true, 00:20:21.649 "flush": true, 00:20:21.649 "reset": true, 00:20:21.649 "compare": false, 00:20:21.649 "compare_and_write": false, 00:20:21.649 "abort": true, 00:20:21.649 "nvme_admin": false, 00:20:21.649 "nvme_io": false 00:20:21.649 }, 00:20:21.649 "memory_domains": [ 00:20:21.649 { 00:20:21.649 "dma_device_id": "system", 00:20:21.649 "dma_device_type": 1 00:20:21.649 }, 00:20:21.649 { 00:20:21.649 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:21.649 "dma_device_type": 2 00:20:21.649 } 00:20:21.649 ], 00:20:21.649 "driver_specific": {} 00:20:21.649 } 00:20:21.649 ] 00:20:21.649 15:59:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:20:21.649 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:20:21.649 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:20:21.649 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:20:21.649 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:21.649 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:21.649 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:21.649 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:21.649 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:21.649 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # 
local raid_bdev_info 00:20:21.649 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:21.649 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:21.649 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:21.649 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:21.649 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:21.908 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:21.908 "name": "Existed_Raid", 00:20:21.908 "uuid": "88f9dd78-ee66-40ba-ae50-fba051f340b5", 00:20:21.908 "strip_size_kb": 0, 00:20:21.908 "state": "online", 00:20:21.908 "raid_level": "raid1", 00:20:21.908 "superblock": true, 00:20:21.908 "num_base_bdevs": 4, 00:20:21.908 "num_base_bdevs_discovered": 4, 00:20:21.908 "num_base_bdevs_operational": 4, 00:20:21.908 "base_bdevs_list": [ 00:20:21.908 { 00:20:21.908 "name": "BaseBdev1", 00:20:21.908 "uuid": "1aa1873b-22e9-4a94-90ee-334cbcf703e2", 00:20:21.908 "is_configured": true, 00:20:21.908 "data_offset": 2048, 00:20:21.908 "data_size": 63488 00:20:21.908 }, 00:20:21.908 { 00:20:21.908 "name": "BaseBdev2", 00:20:21.908 "uuid": "92fe35a0-ee37-4d52-9450-8daef206c33b", 00:20:21.908 "is_configured": true, 00:20:21.908 "data_offset": 2048, 00:20:21.908 "data_size": 63488 00:20:21.908 }, 00:20:21.908 { 00:20:21.908 "name": "BaseBdev3", 00:20:21.908 "uuid": "3a8aa135-a09d-4c26-9fa3-d77ee5243a6b", 00:20:21.908 "is_configured": true, 00:20:21.908 "data_offset": 2048, 00:20:21.908 "data_size": 63488 00:20:21.908 }, 00:20:21.908 { 00:20:21.908 "name": "BaseBdev4", 00:20:21.908 "uuid": "4e91245b-169c-48bd-9ad9-0dbd49945299", 00:20:21.908 
"is_configured": true, 00:20:21.908 "data_offset": 2048, 00:20:21.908 "data_size": 63488 00:20:21.908 } 00:20:21.908 ] 00:20:21.908 }' 00:20:21.908 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:21.908 15:59:27 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:22.476 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:20:22.476 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:20:22.476 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:22.476 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:22.476 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:22.476 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:22.476 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:22.476 15:59:27 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:22.734 [2024-06-10 15:59:28.168776] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:22.734 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:22.734 "name": "Existed_Raid", 00:20:22.734 "aliases": [ 00:20:22.734 "88f9dd78-ee66-40ba-ae50-fba051f340b5" 00:20:22.734 ], 00:20:22.735 "product_name": "Raid Volume", 00:20:22.735 "block_size": 512, 00:20:22.735 "num_blocks": 63488, 00:20:22.735 "uuid": "88f9dd78-ee66-40ba-ae50-fba051f340b5", 00:20:22.735 "assigned_rate_limits": { 00:20:22.735 "rw_ios_per_sec": 0, 00:20:22.735 "rw_mbytes_per_sec": 0, 00:20:22.735 
"r_mbytes_per_sec": 0, 00:20:22.735 "w_mbytes_per_sec": 0 00:20:22.735 }, 00:20:22.735 "claimed": false, 00:20:22.735 "zoned": false, 00:20:22.735 "supported_io_types": { 00:20:22.735 "read": true, 00:20:22.735 "write": true, 00:20:22.735 "unmap": false, 00:20:22.735 "write_zeroes": true, 00:20:22.735 "flush": false, 00:20:22.735 "reset": true, 00:20:22.735 "compare": false, 00:20:22.735 "compare_and_write": false, 00:20:22.735 "abort": false, 00:20:22.735 "nvme_admin": false, 00:20:22.735 "nvme_io": false 00:20:22.735 }, 00:20:22.735 "memory_domains": [ 00:20:22.735 { 00:20:22.735 "dma_device_id": "system", 00:20:22.735 "dma_device_type": 1 00:20:22.735 }, 00:20:22.735 { 00:20:22.735 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.735 "dma_device_type": 2 00:20:22.735 }, 00:20:22.735 { 00:20:22.735 "dma_device_id": "system", 00:20:22.735 "dma_device_type": 1 00:20:22.735 }, 00:20:22.735 { 00:20:22.735 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.735 "dma_device_type": 2 00:20:22.735 }, 00:20:22.735 { 00:20:22.735 "dma_device_id": "system", 00:20:22.735 "dma_device_type": 1 00:20:22.735 }, 00:20:22.735 { 00:20:22.735 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.735 "dma_device_type": 2 00:20:22.735 }, 00:20:22.735 { 00:20:22.735 "dma_device_id": "system", 00:20:22.735 "dma_device_type": 1 00:20:22.735 }, 00:20:22.735 { 00:20:22.735 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:22.735 "dma_device_type": 2 00:20:22.735 } 00:20:22.735 ], 00:20:22.735 "driver_specific": { 00:20:22.735 "raid": { 00:20:22.735 "uuid": "88f9dd78-ee66-40ba-ae50-fba051f340b5", 00:20:22.735 "strip_size_kb": 0, 00:20:22.735 "state": "online", 00:20:22.735 "raid_level": "raid1", 00:20:22.735 "superblock": true, 00:20:22.735 "num_base_bdevs": 4, 00:20:22.735 "num_base_bdevs_discovered": 4, 00:20:22.735 "num_base_bdevs_operational": 4, 00:20:22.735 "base_bdevs_list": [ 00:20:22.735 { 00:20:22.735 "name": "BaseBdev1", 00:20:22.735 "uuid": "1aa1873b-22e9-4a94-90ee-334cbcf703e2", 
00:20:22.735 "is_configured": true, 00:20:22.735 "data_offset": 2048, 00:20:22.735 "data_size": 63488 00:20:22.735 }, 00:20:22.735 { 00:20:22.735 "name": "BaseBdev2", 00:20:22.735 "uuid": "92fe35a0-ee37-4d52-9450-8daef206c33b", 00:20:22.735 "is_configured": true, 00:20:22.735 "data_offset": 2048, 00:20:22.735 "data_size": 63488 00:20:22.735 }, 00:20:22.735 { 00:20:22.735 "name": "BaseBdev3", 00:20:22.735 "uuid": "3a8aa135-a09d-4c26-9fa3-d77ee5243a6b", 00:20:22.735 "is_configured": true, 00:20:22.735 "data_offset": 2048, 00:20:22.735 "data_size": 63488 00:20:22.735 }, 00:20:22.735 { 00:20:22.735 "name": "BaseBdev4", 00:20:22.735 "uuid": "4e91245b-169c-48bd-9ad9-0dbd49945299", 00:20:22.735 "is_configured": true, 00:20:22.735 "data_offset": 2048, 00:20:22.735 "data_size": 63488 00:20:22.735 } 00:20:22.735 ] 00:20:22.735 } 00:20:22.735 } 00:20:22.735 }' 00:20:22.735 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:22.735 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:20:22.735 BaseBdev2 00:20:22.735 BaseBdev3 00:20:22.735 BaseBdev4' 00:20:22.735 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:22.994 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:20:22.994 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:23.252 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:23.252 "name": "BaseBdev1", 00:20:23.252 "aliases": [ 00:20:23.252 "1aa1873b-22e9-4a94-90ee-334cbcf703e2" 00:20:23.252 ], 00:20:23.252 "product_name": "Malloc disk", 00:20:23.252 "block_size": 512, 00:20:23.252 "num_blocks": 65536, 
00:20:23.252 "uuid": "1aa1873b-22e9-4a94-90ee-334cbcf703e2", 00:20:23.252 "assigned_rate_limits": { 00:20:23.252 "rw_ios_per_sec": 0, 00:20:23.252 "rw_mbytes_per_sec": 0, 00:20:23.252 "r_mbytes_per_sec": 0, 00:20:23.252 "w_mbytes_per_sec": 0 00:20:23.252 }, 00:20:23.252 "claimed": true, 00:20:23.252 "claim_type": "exclusive_write", 00:20:23.252 "zoned": false, 00:20:23.252 "supported_io_types": { 00:20:23.252 "read": true, 00:20:23.252 "write": true, 00:20:23.252 "unmap": true, 00:20:23.252 "write_zeroes": true, 00:20:23.252 "flush": true, 00:20:23.252 "reset": true, 00:20:23.252 "compare": false, 00:20:23.252 "compare_and_write": false, 00:20:23.252 "abort": true, 00:20:23.252 "nvme_admin": false, 00:20:23.252 "nvme_io": false 00:20:23.252 }, 00:20:23.252 "memory_domains": [ 00:20:23.252 { 00:20:23.252 "dma_device_id": "system", 00:20:23.252 "dma_device_type": 1 00:20:23.252 }, 00:20:23.252 { 00:20:23.252 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.252 "dma_device_type": 2 00:20:23.252 } 00:20:23.252 ], 00:20:23.252 "driver_specific": {} 00:20:23.252 }' 00:20:23.252 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.252 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.252 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:23.252 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.252 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:23.252 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:23.252 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.253 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:23.511 15:59:28 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:23.511 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.511 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:23.511 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:23.511 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:23.511 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:23.511 15:59:28 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:23.769 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:23.769 "name": "BaseBdev2", 00:20:23.769 "aliases": [ 00:20:23.769 "92fe35a0-ee37-4d52-9450-8daef206c33b" 00:20:23.769 ], 00:20:23.769 "product_name": "Malloc disk", 00:20:23.769 "block_size": 512, 00:20:23.769 "num_blocks": 65536, 00:20:23.769 "uuid": "92fe35a0-ee37-4d52-9450-8daef206c33b", 00:20:23.769 "assigned_rate_limits": { 00:20:23.769 "rw_ios_per_sec": 0, 00:20:23.769 "rw_mbytes_per_sec": 0, 00:20:23.769 "r_mbytes_per_sec": 0, 00:20:23.769 "w_mbytes_per_sec": 0 00:20:23.769 }, 00:20:23.769 "claimed": true, 00:20:23.769 "claim_type": "exclusive_write", 00:20:23.769 "zoned": false, 00:20:23.769 "supported_io_types": { 00:20:23.769 "read": true, 00:20:23.769 "write": true, 00:20:23.769 "unmap": true, 00:20:23.769 "write_zeroes": true, 00:20:23.769 "flush": true, 00:20:23.769 "reset": true, 00:20:23.769 "compare": false, 00:20:23.769 "compare_and_write": false, 00:20:23.769 "abort": true, 00:20:23.769 "nvme_admin": false, 00:20:23.769 "nvme_io": false 00:20:23.769 }, 00:20:23.769 "memory_domains": [ 00:20:23.769 { 00:20:23.769 "dma_device_id": "system", 00:20:23.769 
"dma_device_type": 1 00:20:23.769 }, 00:20:23.769 { 00:20:23.769 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:23.769 "dma_device_type": 2 00:20:23.769 } 00:20:23.769 ], 00:20:23.769 "driver_specific": {} 00:20:23.769 }' 00:20:23.769 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.769 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:23.769 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:23.769 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:24.028 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:24.028 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:24.028 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:24.028 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:24.028 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:24.028 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:24.028 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:24.028 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:24.028 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:24.028 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:24.028 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:24.322 15:59:29 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:24.322 "name": "BaseBdev3", 00:20:24.322 "aliases": [ 00:20:24.322 "3a8aa135-a09d-4c26-9fa3-d77ee5243a6b" 00:20:24.322 ], 00:20:24.322 "product_name": "Malloc disk", 00:20:24.322 "block_size": 512, 00:20:24.322 "num_blocks": 65536, 00:20:24.322 "uuid": "3a8aa135-a09d-4c26-9fa3-d77ee5243a6b", 00:20:24.322 "assigned_rate_limits": { 00:20:24.322 "rw_ios_per_sec": 0, 00:20:24.322 "rw_mbytes_per_sec": 0, 00:20:24.322 "r_mbytes_per_sec": 0, 00:20:24.322 "w_mbytes_per_sec": 0 00:20:24.322 }, 00:20:24.322 "claimed": true, 00:20:24.322 "claim_type": "exclusive_write", 00:20:24.322 "zoned": false, 00:20:24.322 "supported_io_types": { 00:20:24.322 "read": true, 00:20:24.322 "write": true, 00:20:24.322 "unmap": true, 00:20:24.322 "write_zeroes": true, 00:20:24.322 "flush": true, 00:20:24.322 "reset": true, 00:20:24.322 "compare": false, 00:20:24.322 "compare_and_write": false, 00:20:24.322 "abort": true, 00:20:24.322 "nvme_admin": false, 00:20:24.322 "nvme_io": false 00:20:24.322 }, 00:20:24.322 "memory_domains": [ 00:20:24.322 { 00:20:24.322 "dma_device_id": "system", 00:20:24.322 "dma_device_type": 1 00:20:24.322 }, 00:20:24.322 { 00:20:24.322 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:24.322 "dma_device_type": 2 00:20:24.322 } 00:20:24.322 ], 00:20:24.322 "driver_specific": {} 00:20:24.322 }' 00:20:24.322 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:24.322 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:24.581 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:24.581 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:24.581 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:24.581 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 
00:20:24.581 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:24.581 15:59:29 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:24.581 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:24.581 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:24.581 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:24.840 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:24.840 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:24.840 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:24.840 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:25.099 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:25.099 "name": "BaseBdev4", 00:20:25.099 "aliases": [ 00:20:25.099 "4e91245b-169c-48bd-9ad9-0dbd49945299" 00:20:25.099 ], 00:20:25.099 "product_name": "Malloc disk", 00:20:25.099 "block_size": 512, 00:20:25.099 "num_blocks": 65536, 00:20:25.099 "uuid": "4e91245b-169c-48bd-9ad9-0dbd49945299", 00:20:25.099 "assigned_rate_limits": { 00:20:25.099 "rw_ios_per_sec": 0, 00:20:25.099 "rw_mbytes_per_sec": 0, 00:20:25.099 "r_mbytes_per_sec": 0, 00:20:25.099 "w_mbytes_per_sec": 0 00:20:25.099 }, 00:20:25.099 "claimed": true, 00:20:25.099 "claim_type": "exclusive_write", 00:20:25.099 "zoned": false, 00:20:25.099 "supported_io_types": { 00:20:25.099 "read": true, 00:20:25.099 "write": true, 00:20:25.099 "unmap": true, 00:20:25.099 "write_zeroes": true, 00:20:25.099 "flush": true, 00:20:25.099 "reset": true, 00:20:25.099 
"compare": false, 00:20:25.099 "compare_and_write": false, 00:20:25.099 "abort": true, 00:20:25.099 "nvme_admin": false, 00:20:25.099 "nvme_io": false 00:20:25.099 }, 00:20:25.099 "memory_domains": [ 00:20:25.099 { 00:20:25.099 "dma_device_id": "system", 00:20:25.099 "dma_device_type": 1 00:20:25.099 }, 00:20:25.099 { 00:20:25.099 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:25.099 "dma_device_type": 2 00:20:25.099 } 00:20:25.099 ], 00:20:25.099 "driver_specific": {} 00:20:25.099 }' 00:20:25.099 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:25.099 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:25.099 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:25.099 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:25.099 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:25.099 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:25.099 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:25.099 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:25.358 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:25.358 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:25.358 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:25.358 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:25.358 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:25.617 [2024-06-10 
15:59:30.964035] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:25.617 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@275 -- # local expected_state 00:20:25.617 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:20:25.617 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@213 -- # case $1 in 00:20:25.617 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@214 -- # return 0 00:20:25.617 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:20:25.617 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 3 00:20:25.617 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:25.617 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:25.617 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:25.617 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:25.617 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:20:25.617 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:25.617 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:25.617 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:25.617 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:25.617 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:20:25.617 15:59:30 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:25.876 15:59:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:25.876 "name": "Existed_Raid", 00:20:25.876 "uuid": "88f9dd78-ee66-40ba-ae50-fba051f340b5", 00:20:25.876 "strip_size_kb": 0, 00:20:25.876 "state": "online", 00:20:25.876 "raid_level": "raid1", 00:20:25.876 "superblock": true, 00:20:25.876 "num_base_bdevs": 4, 00:20:25.876 "num_base_bdevs_discovered": 3, 00:20:25.876 "num_base_bdevs_operational": 3, 00:20:25.876 "base_bdevs_list": [ 00:20:25.876 { 00:20:25.876 "name": null, 00:20:25.876 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:25.876 "is_configured": false, 00:20:25.876 "data_offset": 2048, 00:20:25.876 "data_size": 63488 00:20:25.876 }, 00:20:25.876 { 00:20:25.876 "name": "BaseBdev2", 00:20:25.876 "uuid": "92fe35a0-ee37-4d52-9450-8daef206c33b", 00:20:25.876 "is_configured": true, 00:20:25.877 "data_offset": 2048, 00:20:25.877 "data_size": 63488 00:20:25.877 }, 00:20:25.877 { 00:20:25.877 "name": "BaseBdev3", 00:20:25.877 "uuid": "3a8aa135-a09d-4c26-9fa3-d77ee5243a6b", 00:20:25.877 "is_configured": true, 00:20:25.877 "data_offset": 2048, 00:20:25.877 "data_size": 63488 00:20:25.877 }, 00:20:25.877 { 00:20:25.877 "name": "BaseBdev4", 00:20:25.877 "uuid": "4e91245b-169c-48bd-9ad9-0dbd49945299", 00:20:25.877 "is_configured": true, 00:20:25.877 "data_offset": 2048, 00:20:25.877 "data_size": 63488 00:20:25.877 } 00:20:25.877 ] 00:20:25.877 }' 00:20:25.877 15:59:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:25.877 15:59:31 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:26.444 15:59:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:20:26.444 15:59:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 
00:20:26.444 15:59:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.444 15:59:31 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:26.709 15:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:26.709 15:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:26.709 15:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:20:26.970 [2024-06-10 15:59:32.300757] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:26.970 15:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:26.970 15:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:26.970 15:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:26.970 15:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:27.230 15:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:27.230 15:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:27.230 15:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev3 00:20:27.489 [2024-06-10 15:59:32.824501] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:27.489 15:59:32 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:27.489 15:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:27.489 15:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:27.489 15:59:32 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:20:27.748 15:59:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:20:27.748 15:59:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:20:27.748 15:59:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev4 00:20:28.007 [2024-06-10 15:59:33.340387] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev4 00:20:28.007 [2024-06-10 15:59:33.340472] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:28.007 [2024-06-10 15:59:33.351022] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:28.007 [2024-06-10 15:59:33.351054] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:28.007 [2024-06-10 15:59:33.351062] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8e1ac0 name Existed_Raid, state offline 00:20:28.007 15:59:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:20:28.007 15:59:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:20:28.007 15:59:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:28.007 15:59:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:20:28.266 15:59:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:20:28.266 15:59:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:20:28.266 15:59:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@299 -- # '[' 4 -gt 2 ']' 00:20:28.266 15:59:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i = 1 )) 00:20:28.266 15:59:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:28.266 15:59:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2 00:20:28.525 BaseBdev2 00:20:28.525 15:59:33 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev2 00:20:28.525 15:59:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:20:28.525 15:59:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:20:28.525 15:59:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:20:28.525 15:59:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:20:28.525 15:59:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:20:28.525 15:59:33 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:28.783 15:59:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:20:29.042 [ 00:20:29.042 { 00:20:29.042 "name": "BaseBdev2", 00:20:29.042 "aliases": [ 00:20:29.042 "d80941a2-6fc9-46b1-885a-d26e915d696d" 00:20:29.042 ], 00:20:29.042 "product_name": "Malloc disk", 00:20:29.042 "block_size": 512, 00:20:29.042 "num_blocks": 65536, 00:20:29.042 "uuid": "d80941a2-6fc9-46b1-885a-d26e915d696d", 00:20:29.042 "assigned_rate_limits": { 00:20:29.042 "rw_ios_per_sec": 0, 00:20:29.042 "rw_mbytes_per_sec": 0, 00:20:29.042 "r_mbytes_per_sec": 0, 00:20:29.042 "w_mbytes_per_sec": 0 00:20:29.042 }, 00:20:29.042 "claimed": false, 00:20:29.042 "zoned": false, 00:20:29.042 "supported_io_types": { 00:20:29.042 "read": true, 00:20:29.042 "write": true, 00:20:29.042 "unmap": true, 00:20:29.042 "write_zeroes": true, 00:20:29.042 "flush": true, 00:20:29.042 "reset": true, 00:20:29.042 "compare": false, 00:20:29.042 "compare_and_write": false, 00:20:29.042 "abort": true, 00:20:29.042 "nvme_admin": false, 00:20:29.042 "nvme_io": false 00:20:29.042 }, 00:20:29.042 "memory_domains": [ 00:20:29.042 { 00:20:29.042 "dma_device_id": "system", 00:20:29.042 "dma_device_type": 1 00:20:29.042 }, 00:20:29.042 { 00:20:29.042 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:29.042 "dma_device_type": 2 00:20:29.042 } 00:20:29.042 ], 00:20:29.042 "driver_specific": {} 00:20:29.042 } 00:20:29.042 ] 00:20:29.042 15:59:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:20:29.042 15:59:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:29.042 15:59:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:29.042 15:59:34 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3 00:20:29.306 BaseBdev3 00:20:29.306 15:59:34 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev3 00:20:29.306 15:59:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev3 00:20:29.306 15:59:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:20:29.306 15:59:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:20:29.306 15:59:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:20:29.306 15:59:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:20:29.306 15:59:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:29.566 15:59:34 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 -t 2000 00:20:29.825 [ 00:20:29.825 { 00:20:29.825 "name": "BaseBdev3", 00:20:29.825 "aliases": [ 00:20:29.825 "7d2d7dca-469b-48e1-ae57-e1bd571c4e42" 00:20:29.825 ], 00:20:29.825 "product_name": "Malloc disk", 00:20:29.825 "block_size": 512, 00:20:29.825 "num_blocks": 65536, 00:20:29.825 "uuid": "7d2d7dca-469b-48e1-ae57-e1bd571c4e42", 00:20:29.825 "assigned_rate_limits": { 00:20:29.825 "rw_ios_per_sec": 0, 00:20:29.825 "rw_mbytes_per_sec": 0, 00:20:29.825 "r_mbytes_per_sec": 0, 00:20:29.825 "w_mbytes_per_sec": 0 00:20:29.825 }, 00:20:29.825 "claimed": false, 00:20:29.825 "zoned": false, 00:20:29.825 "supported_io_types": { 00:20:29.825 "read": true, 00:20:29.825 "write": true, 00:20:29.825 "unmap": true, 00:20:29.825 "write_zeroes": true, 00:20:29.825 "flush": true, 00:20:29.825 "reset": true, 00:20:29.825 "compare": false, 00:20:29.825 "compare_and_write": false, 00:20:29.825 "abort": true, 
00:20:29.825 "nvme_admin": false, 00:20:29.825 "nvme_io": false 00:20:29.825 }, 00:20:29.825 "memory_domains": [ 00:20:29.825 { 00:20:29.825 "dma_device_id": "system", 00:20:29.825 "dma_device_type": 1 00:20:29.825 }, 00:20:29.825 { 00:20:29.825 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:29.825 "dma_device_type": 2 00:20:29.825 } 00:20:29.825 ], 00:20:29.825 "driver_specific": {} 00:20:29.825 } 00:20:29.825 ] 00:20:29.825 15:59:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:20:29.825 15:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:29.825 15:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:29.825 15:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@302 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4 00:20:30.088 BaseBdev4 00:20:30.088 15:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@303 -- # waitforbdev BaseBdev4 00:20:30.088 15:59:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev4 00:20:30.088 15:59:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:20:30.088 15:59:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:20:30.088 15:59:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:20:30.088 15:59:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:20:30.088 15:59:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:30.347 15:59:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 -t 2000 00:20:30.606 [ 00:20:30.606 { 00:20:30.606 "name": "BaseBdev4", 00:20:30.606 "aliases": [ 00:20:30.606 "6d2f403c-9a37-4356-881b-5c785b6aa1a2" 00:20:30.606 ], 00:20:30.606 "product_name": "Malloc disk", 00:20:30.606 "block_size": 512, 00:20:30.606 "num_blocks": 65536, 00:20:30.606 "uuid": "6d2f403c-9a37-4356-881b-5c785b6aa1a2", 00:20:30.606 "assigned_rate_limits": { 00:20:30.606 "rw_ios_per_sec": 0, 00:20:30.606 "rw_mbytes_per_sec": 0, 00:20:30.606 "r_mbytes_per_sec": 0, 00:20:30.606 "w_mbytes_per_sec": 0 00:20:30.606 }, 00:20:30.606 "claimed": false, 00:20:30.606 "zoned": false, 00:20:30.606 "supported_io_types": { 00:20:30.606 "read": true, 00:20:30.606 "write": true, 00:20:30.606 "unmap": true, 00:20:30.606 "write_zeroes": true, 00:20:30.606 "flush": true, 00:20:30.606 "reset": true, 00:20:30.606 "compare": false, 00:20:30.606 "compare_and_write": false, 00:20:30.606 "abort": true, 00:20:30.606 "nvme_admin": false, 00:20:30.606 "nvme_io": false 00:20:30.606 }, 00:20:30.606 "memory_domains": [ 00:20:30.606 { 00:20:30.606 "dma_device_id": "system", 00:20:30.606 "dma_device_type": 1 00:20:30.606 }, 00:20:30.606 { 00:20:30.606 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:30.606 "dma_device_type": 2 00:20:30.606 } 00:20:30.606 ], 00:20:30.606 "driver_specific": {} 00:20:30.606 } 00:20:30.606 ] 00:20:30.606 15:59:35 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:20:30.606 15:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i++ )) 00:20:30.606 15:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@301 -- # (( i < num_base_bdevs )) 00:20:30.606 15:59:35 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@305 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 
BaseBdev3 BaseBdev4' -n Existed_Raid 00:20:30.864 [2024-06-10 15:59:36.155555] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:20:30.865 [2024-06-10 15:59:36.155593] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:20:30.865 [2024-06-10 15:59:36.155609] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:30.865 [2024-06-10 15:59:36.157020] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:30.865 [2024-06-10 15:59:36.157062] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:20:30.865 15:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@306 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:30.865 15:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:30.865 15:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:30.865 15:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:30.865 15:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:30.865 15:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:30.865 15:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:30.865 15:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:30.865 15:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:30.865 15:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:30.865 15:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:30.865 15:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:31.123 15:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:31.123 "name": "Existed_Raid", 00:20:31.124 "uuid": "18a4c504-216d-4203-bb31-7c029f283b9c", 00:20:31.124 "strip_size_kb": 0, 00:20:31.124 "state": "configuring", 00:20:31.124 "raid_level": "raid1", 00:20:31.124 "superblock": true, 00:20:31.124 "num_base_bdevs": 4, 00:20:31.124 "num_base_bdevs_discovered": 3, 00:20:31.124 "num_base_bdevs_operational": 4, 00:20:31.124 "base_bdevs_list": [ 00:20:31.124 { 00:20:31.124 "name": "BaseBdev1", 00:20:31.124 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:31.124 "is_configured": false, 00:20:31.124 "data_offset": 0, 00:20:31.124 "data_size": 0 00:20:31.124 }, 00:20:31.124 { 00:20:31.124 "name": "BaseBdev2", 00:20:31.124 "uuid": "d80941a2-6fc9-46b1-885a-d26e915d696d", 00:20:31.124 "is_configured": true, 00:20:31.124 "data_offset": 2048, 00:20:31.124 "data_size": 63488 00:20:31.124 }, 00:20:31.124 { 00:20:31.124 "name": "BaseBdev3", 00:20:31.124 "uuid": "7d2d7dca-469b-48e1-ae57-e1bd571c4e42", 00:20:31.124 "is_configured": true, 00:20:31.124 "data_offset": 2048, 00:20:31.124 "data_size": 63488 00:20:31.124 }, 00:20:31.124 { 00:20:31.124 "name": "BaseBdev4", 00:20:31.124 "uuid": "6d2f403c-9a37-4356-881b-5c785b6aa1a2", 00:20:31.124 "is_configured": true, 00:20:31.124 "data_offset": 2048, 00:20:31.124 "data_size": 63488 00:20:31.124 } 00:20:31.124 ] 00:20:31.124 }' 00:20:31.124 15:59:36 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:31.124 15:59:36 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:31.690 15:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@308 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:20:31.949 [2024-06-10 15:59:37.290578] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:20:31.949 15:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@309 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:31.949 15:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:31.949 15:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:31.949 15:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:31.949 15:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:31.949 15:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:31.949 15:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:31.949 15:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:31.949 15:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:31.949 15:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:31.949 15:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:31.949 15:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:32.208 15:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:32.208 "name": "Existed_Raid", 00:20:32.208 "uuid": "18a4c504-216d-4203-bb31-7c029f283b9c", 00:20:32.208 "strip_size_kb": 0, 
00:20:32.208 "state": "configuring", 00:20:32.208 "raid_level": "raid1", 00:20:32.208 "superblock": true, 00:20:32.208 "num_base_bdevs": 4, 00:20:32.208 "num_base_bdevs_discovered": 2, 00:20:32.208 "num_base_bdevs_operational": 4, 00:20:32.208 "base_bdevs_list": [ 00:20:32.208 { 00:20:32.208 "name": "BaseBdev1", 00:20:32.208 "uuid": "00000000-0000-0000-0000-000000000000", 00:20:32.208 "is_configured": false, 00:20:32.208 "data_offset": 0, 00:20:32.208 "data_size": 0 00:20:32.208 }, 00:20:32.208 { 00:20:32.208 "name": null, 00:20:32.208 "uuid": "d80941a2-6fc9-46b1-885a-d26e915d696d", 00:20:32.208 "is_configured": false, 00:20:32.208 "data_offset": 2048, 00:20:32.208 "data_size": 63488 00:20:32.208 }, 00:20:32.208 { 00:20:32.208 "name": "BaseBdev3", 00:20:32.208 "uuid": "7d2d7dca-469b-48e1-ae57-e1bd571c4e42", 00:20:32.208 "is_configured": true, 00:20:32.208 "data_offset": 2048, 00:20:32.208 "data_size": 63488 00:20:32.208 }, 00:20:32.208 { 00:20:32.208 "name": "BaseBdev4", 00:20:32.208 "uuid": "6d2f403c-9a37-4356-881b-5c785b6aa1a2", 00:20:32.208 "is_configured": true, 00:20:32.208 "data_offset": 2048, 00:20:32.208 "data_size": 63488 00:20:32.208 } 00:20:32.208 ] 00:20:32.208 }' 00:20:32.208 15:59:37 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:32.208 15:59:37 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:32.775 15:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:32.775 15:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:33.033 15:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@310 -- # [[ false == \f\a\l\s\e ]] 00:20:33.033 15:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@312 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1 00:20:33.292 [2024-06-10 15:59:38.605391] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:20:33.292 BaseBdev1 00:20:33.292 15:59:38 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@313 -- # waitforbdev BaseBdev1 00:20:33.292 15:59:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:20:33.292 15:59:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:20:33.292 15:59:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:20:33.292 15:59:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:20:33.292 15:59:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:20:33.292 15:59:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:33.550 15:59:38 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:20:33.809 [ 00:20:33.809 { 00:20:33.809 "name": "BaseBdev1", 00:20:33.809 "aliases": [ 00:20:33.809 "18aff003-3b26-4a25-9068-dd56d2b11de2" 00:20:33.809 ], 00:20:33.809 "product_name": "Malloc disk", 00:20:33.809 "block_size": 512, 00:20:33.809 "num_blocks": 65536, 00:20:33.809 "uuid": "18aff003-3b26-4a25-9068-dd56d2b11de2", 00:20:33.809 "assigned_rate_limits": { 00:20:33.809 "rw_ios_per_sec": 0, 00:20:33.809 "rw_mbytes_per_sec": 0, 00:20:33.809 "r_mbytes_per_sec": 0, 00:20:33.809 "w_mbytes_per_sec": 0 00:20:33.809 }, 00:20:33.809 "claimed": true, 00:20:33.809 "claim_type": "exclusive_write", 
00:20:33.809 "zoned": false, 00:20:33.809 "supported_io_types": { 00:20:33.809 "read": true, 00:20:33.809 "write": true, 00:20:33.809 "unmap": true, 00:20:33.809 "write_zeroes": true, 00:20:33.809 "flush": true, 00:20:33.809 "reset": true, 00:20:33.809 "compare": false, 00:20:33.809 "compare_and_write": false, 00:20:33.809 "abort": true, 00:20:33.809 "nvme_admin": false, 00:20:33.809 "nvme_io": false 00:20:33.809 }, 00:20:33.809 "memory_domains": [ 00:20:33.809 { 00:20:33.809 "dma_device_id": "system", 00:20:33.809 "dma_device_type": 1 00:20:33.809 }, 00:20:33.809 { 00:20:33.809 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:33.809 "dma_device_type": 2 00:20:33.809 } 00:20:33.809 ], 00:20:33.809 "driver_specific": {} 00:20:33.809 } 00:20:33.809 ] 00:20:33.809 15:59:39 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:20:33.809 15:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@314 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:33.809 15:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:33.809 15:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:33.809 15:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:33.809 15:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:33.809 15:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:33.809 15:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:33.809 15:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:33.809 15:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:33.809 15:59:39 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:33.809 15:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:33.809 15:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:34.067 15:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:34.067 "name": "Existed_Raid", 00:20:34.067 "uuid": "18a4c504-216d-4203-bb31-7c029f283b9c", 00:20:34.067 "strip_size_kb": 0, 00:20:34.067 "state": "configuring", 00:20:34.067 "raid_level": "raid1", 00:20:34.067 "superblock": true, 00:20:34.067 "num_base_bdevs": 4, 00:20:34.067 "num_base_bdevs_discovered": 3, 00:20:34.067 "num_base_bdevs_operational": 4, 00:20:34.067 "base_bdevs_list": [ 00:20:34.067 { 00:20:34.067 "name": "BaseBdev1", 00:20:34.067 "uuid": "18aff003-3b26-4a25-9068-dd56d2b11de2", 00:20:34.067 "is_configured": true, 00:20:34.067 "data_offset": 2048, 00:20:34.067 "data_size": 63488 00:20:34.067 }, 00:20:34.067 { 00:20:34.067 "name": null, 00:20:34.067 "uuid": "d80941a2-6fc9-46b1-885a-d26e915d696d", 00:20:34.067 "is_configured": false, 00:20:34.067 "data_offset": 2048, 00:20:34.067 "data_size": 63488 00:20:34.067 }, 00:20:34.067 { 00:20:34.067 "name": "BaseBdev3", 00:20:34.067 "uuid": "7d2d7dca-469b-48e1-ae57-e1bd571c4e42", 00:20:34.067 "is_configured": true, 00:20:34.067 "data_offset": 2048, 00:20:34.067 "data_size": 63488 00:20:34.067 }, 00:20:34.067 { 00:20:34.067 "name": "BaseBdev4", 00:20:34.067 "uuid": "6d2f403c-9a37-4356-881b-5c785b6aa1a2", 00:20:34.067 "is_configured": true, 00:20:34.067 "data_offset": 2048, 00:20:34.067 "data_size": 63488 00:20:34.067 } 00:20:34.067 ] 00:20:34.068 }' 00:20:34.068 15:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:34.068 15:59:39 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:34.634 15:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:34.634 15:59:39 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:34.892 15:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@315 -- # [[ true == \t\r\u\e ]] 00:20:34.892 15:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev3 00:20:35.151 [2024-06-10 15:59:40.470424] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev3 00:20:35.151 15:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@318 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:35.151 15:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:35.151 15:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:35.151 15:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:35.151 15:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:35.151 15:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:35.151 15:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:35.151 15:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:35.151 15:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:35.151 15:59:40 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:35.151 15:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:35.151 15:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:35.409 15:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:35.409 "name": "Existed_Raid", 00:20:35.409 "uuid": "18a4c504-216d-4203-bb31-7c029f283b9c", 00:20:35.409 "strip_size_kb": 0, 00:20:35.409 "state": "configuring", 00:20:35.409 "raid_level": "raid1", 00:20:35.409 "superblock": true, 00:20:35.409 "num_base_bdevs": 4, 00:20:35.409 "num_base_bdevs_discovered": 2, 00:20:35.409 "num_base_bdevs_operational": 4, 00:20:35.409 "base_bdevs_list": [ 00:20:35.409 { 00:20:35.409 "name": "BaseBdev1", 00:20:35.409 "uuid": "18aff003-3b26-4a25-9068-dd56d2b11de2", 00:20:35.409 "is_configured": true, 00:20:35.409 "data_offset": 2048, 00:20:35.409 "data_size": 63488 00:20:35.409 }, 00:20:35.409 { 00:20:35.409 "name": null, 00:20:35.409 "uuid": "d80941a2-6fc9-46b1-885a-d26e915d696d", 00:20:35.409 "is_configured": false, 00:20:35.409 "data_offset": 2048, 00:20:35.409 "data_size": 63488 00:20:35.409 }, 00:20:35.409 { 00:20:35.409 "name": null, 00:20:35.409 "uuid": "7d2d7dca-469b-48e1-ae57-e1bd571c4e42", 00:20:35.409 "is_configured": false, 00:20:35.409 "data_offset": 2048, 00:20:35.409 "data_size": 63488 00:20:35.409 }, 00:20:35.409 { 00:20:35.409 "name": "BaseBdev4", 00:20:35.409 "uuid": "6d2f403c-9a37-4356-881b-5c785b6aa1a2", 00:20:35.409 "is_configured": true, 00:20:35.409 "data_offset": 2048, 00:20:35.409 "data_size": 63488 00:20:35.409 } 00:20:35.409 ] 00:20:35.409 }' 00:20:35.409 15:59:40 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:35.409 15:59:40 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:35.974 15:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:35.974 15:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:36.232 15:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@319 -- # [[ false == \f\a\l\s\e ]] 00:20:36.232 15:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@321 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev3 00:20:36.491 [2024-06-10 15:59:41.826066] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:20:36.491 15:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@322 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:36.491 15:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:36.491 15:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:36.491 15:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:36.491 15:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:36.491 15:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:36.491 15:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:36.491 15:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:36.491 15:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:36.491 15:59:41 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:36.491 15:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:36.491 15:59:41 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:36.750 15:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:36.750 "name": "Existed_Raid", 00:20:36.750 "uuid": "18a4c504-216d-4203-bb31-7c029f283b9c", 00:20:36.750 "strip_size_kb": 0, 00:20:36.750 "state": "configuring", 00:20:36.750 "raid_level": "raid1", 00:20:36.750 "superblock": true, 00:20:36.750 "num_base_bdevs": 4, 00:20:36.750 "num_base_bdevs_discovered": 3, 00:20:36.750 "num_base_bdevs_operational": 4, 00:20:36.750 "base_bdevs_list": [ 00:20:36.750 { 00:20:36.750 "name": "BaseBdev1", 00:20:36.750 "uuid": "18aff003-3b26-4a25-9068-dd56d2b11de2", 00:20:36.750 "is_configured": true, 00:20:36.750 "data_offset": 2048, 00:20:36.750 "data_size": 63488 00:20:36.750 }, 00:20:36.750 { 00:20:36.750 "name": null, 00:20:36.750 "uuid": "d80941a2-6fc9-46b1-885a-d26e915d696d", 00:20:36.750 "is_configured": false, 00:20:36.750 "data_offset": 2048, 00:20:36.750 "data_size": 63488 00:20:36.750 }, 00:20:36.750 { 00:20:36.750 "name": "BaseBdev3", 00:20:36.750 "uuid": "7d2d7dca-469b-48e1-ae57-e1bd571c4e42", 00:20:36.750 "is_configured": true, 00:20:36.750 "data_offset": 2048, 00:20:36.750 "data_size": 63488 00:20:36.750 }, 00:20:36.750 { 00:20:36.750 "name": "BaseBdev4", 00:20:36.750 "uuid": "6d2f403c-9a37-4356-881b-5c785b6aa1a2", 00:20:36.750 "is_configured": true, 00:20:36.750 "data_offset": 2048, 00:20:36.750 "data_size": 63488 00:20:36.750 } 00:20:36.750 ] 00:20:36.750 }' 00:20:36.750 15:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:36.750 15:59:42 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:37.318 15:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.318 15:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # jq '.[0].base_bdevs_list[2].is_configured' 00:20:37.576 15:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@323 -- # [[ true == \t\r\u\e ]] 00:20:37.577 15:59:42 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@325 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:20:37.835 [2024-06-10 15:59:43.189726] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:20:37.835 15:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@326 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:37.835 15:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:37.835 15:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:37.835 15:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:37.835 15:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:37.835 15:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:37.835 15:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:37.835 15:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:37.835 15:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:37.835 15:59:43 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:20:37.835 15:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:37.835 15:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:38.094 15:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:38.094 "name": "Existed_Raid", 00:20:38.094 "uuid": "18a4c504-216d-4203-bb31-7c029f283b9c", 00:20:38.094 "strip_size_kb": 0, 00:20:38.094 "state": "configuring", 00:20:38.094 "raid_level": "raid1", 00:20:38.094 "superblock": true, 00:20:38.094 "num_base_bdevs": 4, 00:20:38.094 "num_base_bdevs_discovered": 2, 00:20:38.094 "num_base_bdevs_operational": 4, 00:20:38.094 "base_bdevs_list": [ 00:20:38.094 { 00:20:38.094 "name": null, 00:20:38.094 "uuid": "18aff003-3b26-4a25-9068-dd56d2b11de2", 00:20:38.094 "is_configured": false, 00:20:38.094 "data_offset": 2048, 00:20:38.094 "data_size": 63488 00:20:38.094 }, 00:20:38.094 { 00:20:38.094 "name": null, 00:20:38.094 "uuid": "d80941a2-6fc9-46b1-885a-d26e915d696d", 00:20:38.094 "is_configured": false, 00:20:38.094 "data_offset": 2048, 00:20:38.094 "data_size": 63488 00:20:38.094 }, 00:20:38.094 { 00:20:38.094 "name": "BaseBdev3", 00:20:38.094 "uuid": "7d2d7dca-469b-48e1-ae57-e1bd571c4e42", 00:20:38.094 "is_configured": true, 00:20:38.094 "data_offset": 2048, 00:20:38.094 "data_size": 63488 00:20:38.094 }, 00:20:38.094 { 00:20:38.094 "name": "BaseBdev4", 00:20:38.094 "uuid": "6d2f403c-9a37-4356-881b-5c785b6aa1a2", 00:20:38.094 "is_configured": true, 00:20:38.094 "data_offset": 2048, 00:20:38.095 "data_size": 63488 00:20:38.095 } 00:20:38.095 ] 00:20:38.095 }' 00:20:38.095 15:59:43 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:38.095 15:59:43 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:20:38.662 15:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # jq '.[0].base_bdevs_list[0].is_configured' 00:20:38.662 15:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:38.921 15:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@327 -- # [[ false == \f\a\l\s\e ]] 00:20:38.921 15:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@329 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev Existed_Raid BaseBdev2 00:20:39.180 [2024-06-10 15:59:44.583808] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:20:39.180 15:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@330 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 4 00:20:39.180 15:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:39.180 15:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:39.180 15:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:39.180 15:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:39.180 15:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:39.180 15:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:39.180 15:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:39.181 15:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:39.181 15:59:44 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@124 -- # local tmp 00:20:39.181 15:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:39.181 15:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:39.439 15:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:39.440 "name": "Existed_Raid", 00:20:39.440 "uuid": "18a4c504-216d-4203-bb31-7c029f283b9c", 00:20:39.440 "strip_size_kb": 0, 00:20:39.440 "state": "configuring", 00:20:39.440 "raid_level": "raid1", 00:20:39.440 "superblock": true, 00:20:39.440 "num_base_bdevs": 4, 00:20:39.440 "num_base_bdevs_discovered": 3, 00:20:39.440 "num_base_bdevs_operational": 4, 00:20:39.440 "base_bdevs_list": [ 00:20:39.440 { 00:20:39.440 "name": null, 00:20:39.440 "uuid": "18aff003-3b26-4a25-9068-dd56d2b11de2", 00:20:39.440 "is_configured": false, 00:20:39.440 "data_offset": 2048, 00:20:39.440 "data_size": 63488 00:20:39.440 }, 00:20:39.440 { 00:20:39.440 "name": "BaseBdev2", 00:20:39.440 "uuid": "d80941a2-6fc9-46b1-885a-d26e915d696d", 00:20:39.440 "is_configured": true, 00:20:39.440 "data_offset": 2048, 00:20:39.440 "data_size": 63488 00:20:39.440 }, 00:20:39.440 { 00:20:39.440 "name": "BaseBdev3", 00:20:39.440 "uuid": "7d2d7dca-469b-48e1-ae57-e1bd571c4e42", 00:20:39.440 "is_configured": true, 00:20:39.440 "data_offset": 2048, 00:20:39.440 "data_size": 63488 00:20:39.440 }, 00:20:39.440 { 00:20:39.440 "name": "BaseBdev4", 00:20:39.440 "uuid": "6d2f403c-9a37-4356-881b-5c785b6aa1a2", 00:20:39.440 "is_configured": true, 00:20:39.440 "data_offset": 2048, 00:20:39.440 "data_size": 63488 00:20:39.440 } 00:20:39.440 ] 00:20:39.440 }' 00:20:39.440 15:59:44 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:39.440 15:59:44 bdev_raid.raid_state_function_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:20:40.007 15:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.007 15:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # jq '.[0].base_bdevs_list[1].is_configured' 00:20:40.266 15:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@331 -- # [[ true == \t\r\u\e ]] 00:20:40.266 15:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:40.266 15:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # jq -r '.[0].base_bdevs_list[0].uuid' 00:20:40.524 15:59:45 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@333 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b NewBaseBdev -u 18aff003-3b26-4a25-9068-dd56d2b11de2 00:20:40.827 [2024-06-10 15:59:46.111170] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev NewBaseBdev is claimed 00:20:40.827 [2024-06-10 15:59:46.111326] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa85510 00:20:40.827 [2024-06-10 15:59:46.111338] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:40.827 [2024-06-10 15:59:46.111526] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa90b40 00:20:40.827 [2024-06-10 15:59:46.111662] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa85510 00:20:40.827 [2024-06-10 15:59:46.111670] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0xa85510 00:20:40.827 [2024-06-10 15:59:46.111769] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:40.827 NewBaseBdev 
00:20:40.827 15:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@334 -- # waitforbdev NewBaseBdev 00:20:40.827 15:59:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@898 -- # local bdev_name=NewBaseBdev 00:20:40.827 15:59:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:20:40.827 15:59:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@900 -- # local i 00:20:40.827 15:59:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:20:40.827 15:59:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:20:40.827 15:59:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:20:41.087 15:59:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev -t 2000 00:20:41.347 [ 00:20:41.347 { 00:20:41.347 "name": "NewBaseBdev", 00:20:41.347 "aliases": [ 00:20:41.347 "18aff003-3b26-4a25-9068-dd56d2b11de2" 00:20:41.347 ], 00:20:41.347 "product_name": "Malloc disk", 00:20:41.347 "block_size": 512, 00:20:41.347 "num_blocks": 65536, 00:20:41.347 "uuid": "18aff003-3b26-4a25-9068-dd56d2b11de2", 00:20:41.347 "assigned_rate_limits": { 00:20:41.347 "rw_ios_per_sec": 0, 00:20:41.347 "rw_mbytes_per_sec": 0, 00:20:41.347 "r_mbytes_per_sec": 0, 00:20:41.347 "w_mbytes_per_sec": 0 00:20:41.347 }, 00:20:41.347 "claimed": true, 00:20:41.347 "claim_type": "exclusive_write", 00:20:41.347 "zoned": false, 00:20:41.347 "supported_io_types": { 00:20:41.347 "read": true, 00:20:41.347 "write": true, 00:20:41.347 "unmap": true, 00:20:41.347 "write_zeroes": true, 00:20:41.347 "flush": true, 00:20:41.347 "reset": true, 00:20:41.347 "compare": 
false, 00:20:41.347 "compare_and_write": false, 00:20:41.347 "abort": true, 00:20:41.347 "nvme_admin": false, 00:20:41.347 "nvme_io": false 00:20:41.347 }, 00:20:41.347 "memory_domains": [ 00:20:41.347 { 00:20:41.347 "dma_device_id": "system", 00:20:41.347 "dma_device_type": 1 00:20:41.347 }, 00:20:41.347 { 00:20:41.347 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:41.347 "dma_device_type": 2 00:20:41.347 } 00:20:41.347 ], 00:20:41.347 "driver_specific": {} 00:20:41.347 } 00:20:41.347 ] 00:20:41.347 15:59:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@906 -- # return 0 00:20:41.347 15:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@335 -- # verify_raid_bdev_state Existed_Raid online raid1 0 4 00:20:41.347 15:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:20:41.347 15:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:41.347 15:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:41.347 15:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:41.347 15:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:41.347 15:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:41.347 15:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:41.347 15:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:41.347 15:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:41.347 15:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:41.347 
15:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:20:41.606 15:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:41.606 "name": "Existed_Raid", 00:20:41.606 "uuid": "18a4c504-216d-4203-bb31-7c029f283b9c", 00:20:41.606 "strip_size_kb": 0, 00:20:41.606 "state": "online", 00:20:41.606 "raid_level": "raid1", 00:20:41.606 "superblock": true, 00:20:41.606 "num_base_bdevs": 4, 00:20:41.606 "num_base_bdevs_discovered": 4, 00:20:41.606 "num_base_bdevs_operational": 4, 00:20:41.606 "base_bdevs_list": [ 00:20:41.606 { 00:20:41.606 "name": "NewBaseBdev", 00:20:41.606 "uuid": "18aff003-3b26-4a25-9068-dd56d2b11de2", 00:20:41.606 "is_configured": true, 00:20:41.606 "data_offset": 2048, 00:20:41.606 "data_size": 63488 00:20:41.606 }, 00:20:41.606 { 00:20:41.606 "name": "BaseBdev2", 00:20:41.606 "uuid": "d80941a2-6fc9-46b1-885a-d26e915d696d", 00:20:41.606 "is_configured": true, 00:20:41.606 "data_offset": 2048, 00:20:41.606 "data_size": 63488 00:20:41.606 }, 00:20:41.606 { 00:20:41.606 "name": "BaseBdev3", 00:20:41.606 "uuid": "7d2d7dca-469b-48e1-ae57-e1bd571c4e42", 00:20:41.606 "is_configured": true, 00:20:41.606 "data_offset": 2048, 00:20:41.606 "data_size": 63488 00:20:41.606 }, 00:20:41.606 { 00:20:41.606 "name": "BaseBdev4", 00:20:41.606 "uuid": "6d2f403c-9a37-4356-881b-5c785b6aa1a2", 00:20:41.606 "is_configured": true, 00:20:41.606 "data_offset": 2048, 00:20:41.606 "data_size": 63488 00:20:41.606 } 00:20:41.606 ] 00:20:41.606 }' 00:20:41.606 15:59:46 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:41.606 15:59:46 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:42.174 15:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@336 -- # verify_raid_bdev_properties Existed_Raid 00:20:42.174 15:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@194 -- # 
local raid_bdev_name=Existed_Raid 00:20:42.174 15:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:42.174 15:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:42.174 15:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:42.174 15:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@198 -- # local name 00:20:42.174 15:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:20:42.174 15:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:42.433 [2024-06-10 15:59:47.703743] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:42.433 15:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:42.433 "name": "Existed_Raid", 00:20:42.433 "aliases": [ 00:20:42.433 "18a4c504-216d-4203-bb31-7c029f283b9c" 00:20:42.433 ], 00:20:42.433 "product_name": "Raid Volume", 00:20:42.433 "block_size": 512, 00:20:42.433 "num_blocks": 63488, 00:20:42.433 "uuid": "18a4c504-216d-4203-bb31-7c029f283b9c", 00:20:42.433 "assigned_rate_limits": { 00:20:42.433 "rw_ios_per_sec": 0, 00:20:42.433 "rw_mbytes_per_sec": 0, 00:20:42.433 "r_mbytes_per_sec": 0, 00:20:42.433 "w_mbytes_per_sec": 0 00:20:42.433 }, 00:20:42.433 "claimed": false, 00:20:42.434 "zoned": false, 00:20:42.434 "supported_io_types": { 00:20:42.434 "read": true, 00:20:42.434 "write": true, 00:20:42.434 "unmap": false, 00:20:42.434 "write_zeroes": true, 00:20:42.434 "flush": false, 00:20:42.434 "reset": true, 00:20:42.434 "compare": false, 00:20:42.434 "compare_and_write": false, 00:20:42.434 "abort": false, 00:20:42.434 "nvme_admin": false, 00:20:42.434 "nvme_io": false 00:20:42.434 }, 00:20:42.434 "memory_domains": [ 
00:20:42.434 { 00:20:42.434 "dma_device_id": "system", 00:20:42.434 "dma_device_type": 1 00:20:42.434 }, 00:20:42.434 { 00:20:42.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:42.434 "dma_device_type": 2 00:20:42.434 }, 00:20:42.434 { 00:20:42.434 "dma_device_id": "system", 00:20:42.434 "dma_device_type": 1 00:20:42.434 }, 00:20:42.434 { 00:20:42.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:42.434 "dma_device_type": 2 00:20:42.434 }, 00:20:42.434 { 00:20:42.434 "dma_device_id": "system", 00:20:42.434 "dma_device_type": 1 00:20:42.434 }, 00:20:42.434 { 00:20:42.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:42.434 "dma_device_type": 2 00:20:42.434 }, 00:20:42.434 { 00:20:42.434 "dma_device_id": "system", 00:20:42.434 "dma_device_type": 1 00:20:42.434 }, 00:20:42.434 { 00:20:42.434 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:42.434 "dma_device_type": 2 00:20:42.434 } 00:20:42.434 ], 00:20:42.434 "driver_specific": { 00:20:42.434 "raid": { 00:20:42.434 "uuid": "18a4c504-216d-4203-bb31-7c029f283b9c", 00:20:42.434 "strip_size_kb": 0, 00:20:42.434 "state": "online", 00:20:42.434 "raid_level": "raid1", 00:20:42.434 "superblock": true, 00:20:42.434 "num_base_bdevs": 4, 00:20:42.434 "num_base_bdevs_discovered": 4, 00:20:42.434 "num_base_bdevs_operational": 4, 00:20:42.434 "base_bdevs_list": [ 00:20:42.434 { 00:20:42.434 "name": "NewBaseBdev", 00:20:42.434 "uuid": "18aff003-3b26-4a25-9068-dd56d2b11de2", 00:20:42.434 "is_configured": true, 00:20:42.434 "data_offset": 2048, 00:20:42.434 "data_size": 63488 00:20:42.434 }, 00:20:42.434 { 00:20:42.434 "name": "BaseBdev2", 00:20:42.434 "uuid": "d80941a2-6fc9-46b1-885a-d26e915d696d", 00:20:42.434 "is_configured": true, 00:20:42.434 "data_offset": 2048, 00:20:42.434 "data_size": 63488 00:20:42.434 }, 00:20:42.434 { 00:20:42.434 "name": "BaseBdev3", 00:20:42.434 "uuid": "7d2d7dca-469b-48e1-ae57-e1bd571c4e42", 00:20:42.434 "is_configured": true, 00:20:42.434 "data_offset": 2048, 00:20:42.434 "data_size": 63488 
00:20:42.434 }, 00:20:42.434 { 00:20:42.434 "name": "BaseBdev4", 00:20:42.434 "uuid": "6d2f403c-9a37-4356-881b-5c785b6aa1a2", 00:20:42.434 "is_configured": true, 00:20:42.434 "data_offset": 2048, 00:20:42.434 "data_size": 63488 00:20:42.434 } 00:20:42.434 ] 00:20:42.434 } 00:20:42.434 } 00:20:42.434 }' 00:20:42.434 15:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:42.434 15:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@201 -- # base_bdev_names='NewBaseBdev 00:20:42.434 BaseBdev2 00:20:42.434 BaseBdev3 00:20:42.434 BaseBdev4' 00:20:42.434 15:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:42.434 15:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b NewBaseBdev 00:20:42.434 15:59:47 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:42.693 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:42.693 "name": "NewBaseBdev", 00:20:42.693 "aliases": [ 00:20:42.693 "18aff003-3b26-4a25-9068-dd56d2b11de2" 00:20:42.693 ], 00:20:42.693 "product_name": "Malloc disk", 00:20:42.693 "block_size": 512, 00:20:42.693 "num_blocks": 65536, 00:20:42.693 "uuid": "18aff003-3b26-4a25-9068-dd56d2b11de2", 00:20:42.693 "assigned_rate_limits": { 00:20:42.693 "rw_ios_per_sec": 0, 00:20:42.693 "rw_mbytes_per_sec": 0, 00:20:42.693 "r_mbytes_per_sec": 0, 00:20:42.693 "w_mbytes_per_sec": 0 00:20:42.693 }, 00:20:42.693 "claimed": true, 00:20:42.693 "claim_type": "exclusive_write", 00:20:42.693 "zoned": false, 00:20:42.693 "supported_io_types": { 00:20:42.693 "read": true, 00:20:42.693 "write": true, 00:20:42.693 "unmap": true, 00:20:42.693 "write_zeroes": true, 00:20:42.693 "flush": true, 
00:20:42.693 "reset": true, 00:20:42.693 "compare": false, 00:20:42.693 "compare_and_write": false, 00:20:42.693 "abort": true, 00:20:42.693 "nvme_admin": false, 00:20:42.693 "nvme_io": false 00:20:42.693 }, 00:20:42.693 "memory_domains": [ 00:20:42.693 { 00:20:42.693 "dma_device_id": "system", 00:20:42.693 "dma_device_type": 1 00:20:42.693 }, 00:20:42.693 { 00:20:42.693 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:42.693 "dma_device_type": 2 00:20:42.693 } 00:20:42.693 ], 00:20:42.693 "driver_specific": {} 00:20:42.693 }' 00:20:42.693 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:42.693 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:42.693 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:42.693 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:42.693 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:42.693 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:42.693 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:42.952 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:42.952 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:42.953 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:42.953 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:42.953 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:42.953 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:42.953 15:59:48 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:20:42.953 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:43.211 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:43.211 "name": "BaseBdev2", 00:20:43.211 "aliases": [ 00:20:43.211 "d80941a2-6fc9-46b1-885a-d26e915d696d" 00:20:43.211 ], 00:20:43.211 "product_name": "Malloc disk", 00:20:43.211 "block_size": 512, 00:20:43.211 "num_blocks": 65536, 00:20:43.211 "uuid": "d80941a2-6fc9-46b1-885a-d26e915d696d", 00:20:43.211 "assigned_rate_limits": { 00:20:43.211 "rw_ios_per_sec": 0, 00:20:43.211 "rw_mbytes_per_sec": 0, 00:20:43.212 "r_mbytes_per_sec": 0, 00:20:43.212 "w_mbytes_per_sec": 0 00:20:43.212 }, 00:20:43.212 "claimed": true, 00:20:43.212 "claim_type": "exclusive_write", 00:20:43.212 "zoned": false, 00:20:43.212 "supported_io_types": { 00:20:43.212 "read": true, 00:20:43.212 "write": true, 00:20:43.212 "unmap": true, 00:20:43.212 "write_zeroes": true, 00:20:43.212 "flush": true, 00:20:43.212 "reset": true, 00:20:43.212 "compare": false, 00:20:43.212 "compare_and_write": false, 00:20:43.212 "abort": true, 00:20:43.212 "nvme_admin": false, 00:20:43.212 "nvme_io": false 00:20:43.212 }, 00:20:43.212 "memory_domains": [ 00:20:43.212 { 00:20:43.212 "dma_device_id": "system", 00:20:43.212 "dma_device_type": 1 00:20:43.212 }, 00:20:43.212 { 00:20:43.212 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.212 "dma_device_type": 2 00:20:43.212 } 00:20:43.212 ], 00:20:43.212 "driver_specific": {} 00:20:43.212 }' 00:20:43.212 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:43.212 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:43.212 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:43.212 
15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:43.471 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:43.471 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:43.471 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:43.471 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:43.471 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:43.471 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:43.471 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:43.471 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:43.471 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:43.471 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:43.471 15:59:48 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev3 00:20:43.729 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:43.729 "name": "BaseBdev3", 00:20:43.729 "aliases": [ 00:20:43.729 "7d2d7dca-469b-48e1-ae57-e1bd571c4e42" 00:20:43.730 ], 00:20:43.730 "product_name": "Malloc disk", 00:20:43.730 "block_size": 512, 00:20:43.730 "num_blocks": 65536, 00:20:43.730 "uuid": "7d2d7dca-469b-48e1-ae57-e1bd571c4e42", 00:20:43.730 "assigned_rate_limits": { 00:20:43.730 "rw_ios_per_sec": 0, 00:20:43.730 "rw_mbytes_per_sec": 0, 00:20:43.730 "r_mbytes_per_sec": 0, 00:20:43.730 "w_mbytes_per_sec": 0 00:20:43.730 }, 00:20:43.730 "claimed": true, 
00:20:43.730 "claim_type": "exclusive_write", 00:20:43.730 "zoned": false, 00:20:43.730 "supported_io_types": { 00:20:43.730 "read": true, 00:20:43.730 "write": true, 00:20:43.730 "unmap": true, 00:20:43.730 "write_zeroes": true, 00:20:43.730 "flush": true, 00:20:43.730 "reset": true, 00:20:43.730 "compare": false, 00:20:43.730 "compare_and_write": false, 00:20:43.730 "abort": true, 00:20:43.730 "nvme_admin": false, 00:20:43.730 "nvme_io": false 00:20:43.730 }, 00:20:43.730 "memory_domains": [ 00:20:43.730 { 00:20:43.730 "dma_device_id": "system", 00:20:43.730 "dma_device_type": 1 00:20:43.730 }, 00:20:43.730 { 00:20:43.730 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:43.730 "dma_device_type": 2 00:20:43.730 } 00:20:43.730 ], 00:20:43.730 "driver_specific": {} 00:20:43.730 }' 00:20:43.730 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:43.989 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:43.989 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:43.989 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:43.989 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:43.989 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:43.989 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:43.989 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:43.989 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:43.989 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:44.248 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:44.248 15:59:49 
bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:44.248 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:44.248 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:44.248 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev4 00:20:44.507 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:44.507 "name": "BaseBdev4", 00:20:44.507 "aliases": [ 00:20:44.507 "6d2f403c-9a37-4356-881b-5c785b6aa1a2" 00:20:44.507 ], 00:20:44.507 "product_name": "Malloc disk", 00:20:44.507 "block_size": 512, 00:20:44.507 "num_blocks": 65536, 00:20:44.507 "uuid": "6d2f403c-9a37-4356-881b-5c785b6aa1a2", 00:20:44.507 "assigned_rate_limits": { 00:20:44.507 "rw_ios_per_sec": 0, 00:20:44.507 "rw_mbytes_per_sec": 0, 00:20:44.507 "r_mbytes_per_sec": 0, 00:20:44.507 "w_mbytes_per_sec": 0 00:20:44.507 }, 00:20:44.507 "claimed": true, 00:20:44.507 "claim_type": "exclusive_write", 00:20:44.507 "zoned": false, 00:20:44.507 "supported_io_types": { 00:20:44.507 "read": true, 00:20:44.507 "write": true, 00:20:44.507 "unmap": true, 00:20:44.507 "write_zeroes": true, 00:20:44.507 "flush": true, 00:20:44.507 "reset": true, 00:20:44.507 "compare": false, 00:20:44.507 "compare_and_write": false, 00:20:44.507 "abort": true, 00:20:44.507 "nvme_admin": false, 00:20:44.507 "nvme_io": false 00:20:44.507 }, 00:20:44.507 "memory_domains": [ 00:20:44.507 { 00:20:44.507 "dma_device_id": "system", 00:20:44.507 "dma_device_type": 1 00:20:44.507 }, 00:20:44.507 { 00:20:44.507 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:44.507 "dma_device_type": 2 00:20:44.507 } 00:20:44.507 ], 00:20:44.507 "driver_specific": {} 00:20:44.507 }' 00:20:44.507 15:59:49 bdev_raid.raid_state_function_test_sb -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:44.507 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:44.507 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:44.507 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:44.507 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:44.507 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:44.507 15:59:49 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:44.766 15:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:44.766 15:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:44.766 15:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:44.766 15:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:44.766 15:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:44.766 15:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@338 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:20:45.026 [2024-06-10 15:59:50.398667] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:20:45.026 [2024-06-10 15:59:50.398691] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:45.026 [2024-06-10 15:59:50.398738] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:45.026 [2024-06-10 15:59:50.399024] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:45.026 [2024-06-10 15:59:50.399034] bdev_raid.c: 
366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa85510 name Existed_Raid, state offline 00:20:45.026 15:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@341 -- # killprocess 2748934 00:20:45.026 15:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@949 -- # '[' -z 2748934 ']' 00:20:45.026 15:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@953 -- # kill -0 2748934 00:20:45.026 15:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # uname 00:20:45.026 15:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:20:45.026 15:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2748934 00:20:45.026 15:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:20:45.026 15:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:20:45.026 15:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2748934' 00:20:45.026 killing process with pid 2748934 00:20:45.026 15:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@968 -- # kill 2748934 00:20:45.026 [2024-06-10 15:59:50.465444] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:20:45.026 15:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@973 -- # wait 2748934 00:20:45.026 [2024-06-10 15:59:50.499166] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:20:45.286 15:59:50 bdev_raid.raid_state_function_test_sb -- bdev/bdev_raid.sh@343 -- # return 0 00:20:45.286 00:20:45.286 real 0m33.435s 00:20:45.286 user 1m2.687s 00:20:45.286 sys 0m4.662s 00:20:45.286 15:59:50 bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:20:45.286 15:59:50 
bdev_raid.raid_state_function_test_sb -- common/autotest_common.sh@10 -- # set +x 00:20:45.286 ************************************ 00:20:45.286 END TEST raid_state_function_test_sb 00:20:45.286 ************************************ 00:20:45.286 15:59:50 bdev_raid -- bdev/bdev_raid.sh@869 -- # run_test raid_superblock_test raid_superblock_test raid1 4 00:20:45.286 15:59:50 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:20:45.286 15:59:50 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:20:45.286 15:59:50 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:20:45.286 ************************************ 00:20:45.286 START TEST raid_superblock_test 00:20:45.286 ************************************ 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 4 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=4 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@398 -- # local strip_size 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@399 
-- # local strip_size_create_arg 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@411 -- # raid_pid=2754953 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@412 -- # waitforlisten 2754953 /var/tmp/spdk-raid.sock 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@830 -- # '[' -z 2754953 ']' 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:20:45.286 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:20:45.286 15:59:50 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:45.579 [2024-06-10 15:59:50.820775] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:20:45.579 [2024-06-10 15:59:50.820830] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2754953 ] 00:20:45.579 [2024-06-10 15:59:50.918845] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:20:45.579 [2024-06-10 15:59:51.014449] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:20:45.579 [2024-06-10 15:59:51.074670] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:45.579 [2024-06-10 15:59:51.074702] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:20:46.515 15:59:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:20:46.515 15:59:51 bdev_raid.raid_superblock_test -- common/autotest_common.sh@863 -- # return 0 00:20:46.515 15:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:20:46.515 15:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:46.515 15:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:20:46.515 15:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:20:46.515 15:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:20:46.515 15:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:46.515 15:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:46.515 15:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:46.515 15:59:51 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_malloc_create 32 512 -b malloc1 00:20:46.515 malloc1 00:20:46.773 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:46.773 [2024-06-10 15:59:52.268067] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:46.773 [2024-06-10 15:59:52.268110] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:46.773 [2024-06-10 15:59:52.268128] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ab80f0 00:20:46.773 [2024-06-10 15:59:52.268138] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:46.773 [2024-06-10 15:59:52.269886] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:46.773 [2024-06-10 15:59:52.269915] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:46.773 pt1 00:20:47.031 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:47.031 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:47.031 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:20:47.031 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:20:47.031 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:20:47.031 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:47.031 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:47.031 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:47.031 15:59:52 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc2 00:20:47.031 malloc2 00:20:47.290 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:47.290 [2024-06-10 15:59:52.685854] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:47.290 [2024-06-10 15:59:52.685896] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:47.290 [2024-06-10 15:59:52.685910] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ab9400 00:20:47.290 [2024-06-10 15:59:52.685919] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:47.290 [2024-06-10 15:59:52.687468] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:47.290 [2024-06-10 15:59:52.687494] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:47.290 pt2 00:20:47.290 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:47.290 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:47.290 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc3 00:20:47.290 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt3 00:20:47.290 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000003 00:20:47.290 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:47.290 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:47.290 15:59:52 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:47.290 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc3 00:20:47.548 malloc3 00:20:47.548 15:59:52 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:47.807 [2024-06-10 15:59:53.195659] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:47.807 [2024-06-10 15:59:53.195702] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:47.807 [2024-06-10 15:59:53.195717] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c65200 00:20:47.807 [2024-06-10 15:59:53.195726] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:47.807 [2024-06-10 15:59:53.197285] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:47.807 [2024-06-10 15:59:53.197312] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:47.807 pt3 00:20:47.807 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:47.807 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:47.807 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc4 00:20:47.807 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt4 00:20:47.807 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000004 00:20:47.807 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:20:47.807 15:59:53 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:20:47.807 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:20:47.807 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b malloc4 00:20:48.066 malloc4 00:20:48.066 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:48.325 [2024-06-10 15:59:53.705595] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:48.325 [2024-06-10 15:59:53.705636] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:48.325 [2024-06-10 15:59:53.705650] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c67320 00:20:48.325 [2024-06-10 15:59:53.705660] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:48.325 [2024-06-10 15:59:53.707181] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:48.325 [2024-06-10 15:59:53.707208] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:48.326 pt4 00:20:48.326 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:20:48.326 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:20:48.326 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2 pt3 pt4' -n raid_bdev1 -s 00:20:48.585 [2024-06-10 15:59:53.954273] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:48.585 
[2024-06-10 15:59:53.955627] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:48.585 [2024-06-10 15:59:53.955684] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:48.585 [2024-06-10 15:59:53.955730] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:48.585 [2024-06-10 15:59:53.955913] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c69660 00:20:48.585 [2024-06-10 15:59:53.955924] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:48.585 [2024-06-10 15:59:53.956141] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1acefe0 00:20:48.585 [2024-06-10 15:59:53.956296] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c69660 00:20:48.585 [2024-06-10 15:59:53.956304] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c69660 00:20:48.585 [2024-06-10 15:59:53.956407] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:48.585 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:48.585 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:48.585 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:48.585 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:48.585 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:48.585 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:48.585 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:48.585 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 
00:20:48.585 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:48.585 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:48.585 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:48.585 15:59:53 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:48.844 15:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:48.844 "name": "raid_bdev1", 00:20:48.844 "uuid": "035bf9bc-9051-4994-a30b-08c1df0a6979", 00:20:48.844 "strip_size_kb": 0, 00:20:48.844 "state": "online", 00:20:48.844 "raid_level": "raid1", 00:20:48.844 "superblock": true, 00:20:48.844 "num_base_bdevs": 4, 00:20:48.844 "num_base_bdevs_discovered": 4, 00:20:48.844 "num_base_bdevs_operational": 4, 00:20:48.844 "base_bdevs_list": [ 00:20:48.844 { 00:20:48.844 "name": "pt1", 00:20:48.844 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:48.844 "is_configured": true, 00:20:48.844 "data_offset": 2048, 00:20:48.844 "data_size": 63488 00:20:48.844 }, 00:20:48.844 { 00:20:48.844 "name": "pt2", 00:20:48.844 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:48.844 "is_configured": true, 00:20:48.844 "data_offset": 2048, 00:20:48.844 "data_size": 63488 00:20:48.844 }, 00:20:48.844 { 00:20:48.844 "name": "pt3", 00:20:48.844 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:48.844 "is_configured": true, 00:20:48.844 "data_offset": 2048, 00:20:48.844 "data_size": 63488 00:20:48.844 }, 00:20:48.844 { 00:20:48.844 "name": "pt4", 00:20:48.844 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:48.844 "is_configured": true, 00:20:48.844 "data_offset": 2048, 00:20:48.844 "data_size": 63488 00:20:48.844 } 00:20:48.844 ] 00:20:48.844 }' 00:20:48.844 15:59:54 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:48.844 15:59:54 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:49.411 15:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:20:49.411 15:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:49.411 15:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:49.411 15:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:49.411 15:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:49.411 15:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:49.411 15:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:49.412 15:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:49.671 [2024-06-10 15:59:54.945193] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:49.671 15:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:49.671 "name": "raid_bdev1", 00:20:49.671 "aliases": [ 00:20:49.671 "035bf9bc-9051-4994-a30b-08c1df0a6979" 00:20:49.671 ], 00:20:49.671 "product_name": "Raid Volume", 00:20:49.671 "block_size": 512, 00:20:49.671 "num_blocks": 63488, 00:20:49.671 "uuid": "035bf9bc-9051-4994-a30b-08c1df0a6979", 00:20:49.671 "assigned_rate_limits": { 00:20:49.671 "rw_ios_per_sec": 0, 00:20:49.671 "rw_mbytes_per_sec": 0, 00:20:49.671 "r_mbytes_per_sec": 0, 00:20:49.671 "w_mbytes_per_sec": 0 00:20:49.671 }, 00:20:49.671 "claimed": false, 00:20:49.671 "zoned": false, 00:20:49.671 "supported_io_types": { 00:20:49.671 "read": true, 00:20:49.671 "write": true, 00:20:49.671 "unmap": false, 00:20:49.671 
"write_zeroes": true, 00:20:49.671 "flush": false, 00:20:49.671 "reset": true, 00:20:49.671 "compare": false, 00:20:49.671 "compare_and_write": false, 00:20:49.671 "abort": false, 00:20:49.671 "nvme_admin": false, 00:20:49.671 "nvme_io": false 00:20:49.671 }, 00:20:49.671 "memory_domains": [ 00:20:49.671 { 00:20:49.671 "dma_device_id": "system", 00:20:49.671 "dma_device_type": 1 00:20:49.671 }, 00:20:49.671 { 00:20:49.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:49.671 "dma_device_type": 2 00:20:49.671 }, 00:20:49.671 { 00:20:49.671 "dma_device_id": "system", 00:20:49.671 "dma_device_type": 1 00:20:49.671 }, 00:20:49.671 { 00:20:49.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:49.671 "dma_device_type": 2 00:20:49.671 }, 00:20:49.671 { 00:20:49.671 "dma_device_id": "system", 00:20:49.671 "dma_device_type": 1 00:20:49.671 }, 00:20:49.671 { 00:20:49.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:49.671 "dma_device_type": 2 00:20:49.671 }, 00:20:49.671 { 00:20:49.671 "dma_device_id": "system", 00:20:49.671 "dma_device_type": 1 00:20:49.671 }, 00:20:49.671 { 00:20:49.671 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:49.671 "dma_device_type": 2 00:20:49.671 } 00:20:49.671 ], 00:20:49.671 "driver_specific": { 00:20:49.671 "raid": { 00:20:49.671 "uuid": "035bf9bc-9051-4994-a30b-08c1df0a6979", 00:20:49.671 "strip_size_kb": 0, 00:20:49.671 "state": "online", 00:20:49.671 "raid_level": "raid1", 00:20:49.671 "superblock": true, 00:20:49.671 "num_base_bdevs": 4, 00:20:49.671 "num_base_bdevs_discovered": 4, 00:20:49.671 "num_base_bdevs_operational": 4, 00:20:49.671 "base_bdevs_list": [ 00:20:49.671 { 00:20:49.671 "name": "pt1", 00:20:49.671 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:49.671 "is_configured": true, 00:20:49.671 "data_offset": 2048, 00:20:49.671 "data_size": 63488 00:20:49.671 }, 00:20:49.671 { 00:20:49.671 "name": "pt2", 00:20:49.671 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:49.671 "is_configured": true, 00:20:49.671 
"data_offset": 2048, 00:20:49.671 "data_size": 63488 00:20:49.671 }, 00:20:49.671 { 00:20:49.671 "name": "pt3", 00:20:49.671 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:49.671 "is_configured": true, 00:20:49.671 "data_offset": 2048, 00:20:49.671 "data_size": 63488 00:20:49.671 }, 00:20:49.671 { 00:20:49.671 "name": "pt4", 00:20:49.671 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:49.671 "is_configured": true, 00:20:49.671 "data_offset": 2048, 00:20:49.671 "data_size": 63488 00:20:49.671 } 00:20:49.671 ] 00:20:49.671 } 00:20:49.671 } 00:20:49.671 }' 00:20:49.671 15:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:49.671 15:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:49.671 pt2 00:20:49.671 pt3 00:20:49.671 pt4' 00:20:49.671 15:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:49.671 15:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:49.671 15:59:54 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:49.930 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:49.930 "name": "pt1", 00:20:49.930 "aliases": [ 00:20:49.930 "00000000-0000-0000-0000-000000000001" 00:20:49.930 ], 00:20:49.930 "product_name": "passthru", 00:20:49.930 "block_size": 512, 00:20:49.930 "num_blocks": 65536, 00:20:49.930 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:49.930 "assigned_rate_limits": { 00:20:49.930 "rw_ios_per_sec": 0, 00:20:49.930 "rw_mbytes_per_sec": 0, 00:20:49.930 "r_mbytes_per_sec": 0, 00:20:49.930 "w_mbytes_per_sec": 0 00:20:49.930 }, 00:20:49.930 "claimed": true, 00:20:49.930 "claim_type": "exclusive_write", 00:20:49.930 "zoned": false, 
00:20:49.930 "supported_io_types": { 00:20:49.930 "read": true, 00:20:49.930 "write": true, 00:20:49.930 "unmap": true, 00:20:49.930 "write_zeroes": true, 00:20:49.930 "flush": true, 00:20:49.930 "reset": true, 00:20:49.930 "compare": false, 00:20:49.930 "compare_and_write": false, 00:20:49.930 "abort": true, 00:20:49.930 "nvme_admin": false, 00:20:49.930 "nvme_io": false 00:20:49.930 }, 00:20:49.930 "memory_domains": [ 00:20:49.930 { 00:20:49.930 "dma_device_id": "system", 00:20:49.930 "dma_device_type": 1 00:20:49.930 }, 00:20:49.930 { 00:20:49.930 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:49.930 "dma_device_type": 2 00:20:49.930 } 00:20:49.930 ], 00:20:49.930 "driver_specific": { 00:20:49.930 "passthru": { 00:20:49.930 "name": "pt1", 00:20:49.930 "base_bdev_name": "malloc1" 00:20:49.930 } 00:20:49.930 } 00:20:49.931 }' 00:20:49.931 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:49.931 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:49.931 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:49.931 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:49.931 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:49.931 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:49.931 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:50.189 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:50.189 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:50.189 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:50.189 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:50.189 15:59:55 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:50.189 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:50.189 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:50.189 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:50.447 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:50.447 "name": "pt2", 00:20:50.447 "aliases": [ 00:20:50.447 "00000000-0000-0000-0000-000000000002" 00:20:50.447 ], 00:20:50.447 "product_name": "passthru", 00:20:50.447 "block_size": 512, 00:20:50.448 "num_blocks": 65536, 00:20:50.448 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:50.448 "assigned_rate_limits": { 00:20:50.448 "rw_ios_per_sec": 0, 00:20:50.448 "rw_mbytes_per_sec": 0, 00:20:50.448 "r_mbytes_per_sec": 0, 00:20:50.448 "w_mbytes_per_sec": 0 00:20:50.448 }, 00:20:50.448 "claimed": true, 00:20:50.448 "claim_type": "exclusive_write", 00:20:50.448 "zoned": false, 00:20:50.448 "supported_io_types": { 00:20:50.448 "read": true, 00:20:50.448 "write": true, 00:20:50.448 "unmap": true, 00:20:50.448 "write_zeroes": true, 00:20:50.448 "flush": true, 00:20:50.448 "reset": true, 00:20:50.448 "compare": false, 00:20:50.448 "compare_and_write": false, 00:20:50.448 "abort": true, 00:20:50.448 "nvme_admin": false, 00:20:50.448 "nvme_io": false 00:20:50.448 }, 00:20:50.448 "memory_domains": [ 00:20:50.448 { 00:20:50.448 "dma_device_id": "system", 00:20:50.448 "dma_device_type": 1 00:20:50.448 }, 00:20:50.448 { 00:20:50.448 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:50.448 "dma_device_type": 2 00:20:50.448 } 00:20:50.448 ], 00:20:50.448 "driver_specific": { 00:20:50.448 "passthru": { 00:20:50.448 "name": "pt2", 00:20:50.448 "base_bdev_name": "malloc2" 00:20:50.448 } 00:20:50.448 } 00:20:50.448 }' 00:20:50.448 15:59:55 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:50.448 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:50.448 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:50.448 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:50.707 15:59:55 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:50.707 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:50.707 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:50.707 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:50.707 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:50.707 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:50.707 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:50.707 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:50.707 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:50.707 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:50.707 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:50.966 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:50.966 "name": "pt3", 00:20:50.966 "aliases": [ 00:20:50.966 "00000000-0000-0000-0000-000000000003" 00:20:50.966 ], 00:20:50.966 "product_name": "passthru", 00:20:50.966 "block_size": 512, 00:20:50.966 "num_blocks": 65536, 00:20:50.966 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:50.966 "assigned_rate_limits": { 
00:20:50.966 "rw_ios_per_sec": 0, 00:20:50.966 "rw_mbytes_per_sec": 0, 00:20:50.966 "r_mbytes_per_sec": 0, 00:20:50.966 "w_mbytes_per_sec": 0 00:20:50.966 }, 00:20:50.966 "claimed": true, 00:20:50.966 "claim_type": "exclusive_write", 00:20:50.966 "zoned": false, 00:20:50.966 "supported_io_types": { 00:20:50.966 "read": true, 00:20:50.966 "write": true, 00:20:50.966 "unmap": true, 00:20:50.966 "write_zeroes": true, 00:20:50.966 "flush": true, 00:20:50.966 "reset": true, 00:20:50.966 "compare": false, 00:20:50.966 "compare_and_write": false, 00:20:50.966 "abort": true, 00:20:50.966 "nvme_admin": false, 00:20:50.966 "nvme_io": false 00:20:50.966 }, 00:20:50.966 "memory_domains": [ 00:20:50.966 { 00:20:50.966 "dma_device_id": "system", 00:20:50.966 "dma_device_type": 1 00:20:50.966 }, 00:20:50.966 { 00:20:50.966 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:50.966 "dma_device_type": 2 00:20:50.966 } 00:20:50.966 ], 00:20:50.966 "driver_specific": { 00:20:50.966 "passthru": { 00:20:50.966 "name": "pt3", 00:20:50.966 "base_bdev_name": "malloc3" 00:20:50.966 } 00:20:50.966 } 00:20:50.966 }' 00:20:50.966 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:50.966 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:50.966 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:50.966 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:50.966 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:51.225 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:51.225 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:51.225 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:51.225 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 
00:20:51.225 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:51.225 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:51.225 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:51.225 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:51.225 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt4 00:20:51.225 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:51.484 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:51.484 "name": "pt4", 00:20:51.484 "aliases": [ 00:20:51.484 "00000000-0000-0000-0000-000000000004" 00:20:51.484 ], 00:20:51.484 "product_name": "passthru", 00:20:51.484 "block_size": 512, 00:20:51.484 "num_blocks": 65536, 00:20:51.484 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:51.484 "assigned_rate_limits": { 00:20:51.484 "rw_ios_per_sec": 0, 00:20:51.484 "rw_mbytes_per_sec": 0, 00:20:51.484 "r_mbytes_per_sec": 0, 00:20:51.484 "w_mbytes_per_sec": 0 00:20:51.484 }, 00:20:51.484 "claimed": true, 00:20:51.484 "claim_type": "exclusive_write", 00:20:51.484 "zoned": false, 00:20:51.484 "supported_io_types": { 00:20:51.484 "read": true, 00:20:51.484 "write": true, 00:20:51.484 "unmap": true, 00:20:51.484 "write_zeroes": true, 00:20:51.484 "flush": true, 00:20:51.484 "reset": true, 00:20:51.484 "compare": false, 00:20:51.484 "compare_and_write": false, 00:20:51.484 "abort": true, 00:20:51.484 "nvme_admin": false, 00:20:51.484 "nvme_io": false 00:20:51.484 }, 00:20:51.484 "memory_domains": [ 00:20:51.484 { 00:20:51.484 "dma_device_id": "system", 00:20:51.484 "dma_device_type": 1 00:20:51.484 }, 00:20:51.484 { 00:20:51.484 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:51.484 
"dma_device_type": 2 00:20:51.484 } 00:20:51.484 ], 00:20:51.484 "driver_specific": { 00:20:51.484 "passthru": { 00:20:51.484 "name": "pt4", 00:20:51.484 "base_bdev_name": "malloc4" 00:20:51.484 } 00:20:51.484 } 00:20:51.484 }' 00:20:51.484 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:51.484 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:51.484 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:51.484 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:51.484 15:59:56 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:51.743 15:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:51.743 15:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:51.743 15:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:51.743 15:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:51.743 15:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:51.743 15:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:51.743 15:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:51.743 15:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:51.743 15:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:20:52.002 [2024-06-10 15:59:57.395857] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:52.002 15:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=035bf9bc-9051-4994-a30b-08c1df0a6979 00:20:52.002 15:59:57 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@435 -- # '[' -z 035bf9bc-9051-4994-a30b-08c1df0a6979 ']' 00:20:52.002 15:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:20:52.261 [2024-06-10 15:59:57.652404] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:52.261 [2024-06-10 15:59:57.652422] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:20:52.261 [2024-06-10 15:59:57.652470] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:20:52.261 [2024-06-10 15:59:57.652552] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:20:52.261 [2024-06-10 15:59:57.652561] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c69660 name raid_bdev1, state offline 00:20:52.261 15:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:52.261 15:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:20:52.520 15:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:20:52.520 15:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:20:52.520 15:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:52.520 15:59:57 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:20:52.779 15:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:52.779 15:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:20:53.038 15:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:53.038 15:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:20:53.297 15:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:20:53.297 15:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:20:53.557 15:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:20:53.557 15:59:58 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:20:53.816 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:20:53.816 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:53.816 15:59:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@649 -- # local es=0 00:20:53.816 15:59:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:53.816 15:59:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:53.816 15:59:59 
bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:20:53.816 15:59:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:53.816 15:59:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:20:53.816 15:59:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:53.816 15:59:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:20:53.816 15:59:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:20:53.816 15:59:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:20:53.816 15:59:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2 malloc3 malloc4' -n raid_bdev1 00:20:54.075 [2024-06-10 15:59:59.433074] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:20:54.075 [2024-06-10 15:59:59.434528] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:20:54.075 [2024-06-10 15:59:59.434572] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc3 is claimed 00:20:54.075 [2024-06-10 15:59:59.434607] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc4 is claimed 00:20:54.075 [2024-06-10 15:59:59.434650] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:20:54.075 [2024-06-10 15:59:59.434685] 
bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:20:54.075 [2024-06-10 15:59:59.434706] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc3 00:20:54.075 [2024-06-10 15:59:59.434725] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc4 00:20:54.075 [2024-06-10 15:59:59.434739] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:20:54.075 [2024-06-10 15:59:59.434748] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c67b70 name raid_bdev1, state configuring 00:20:54.075 request: 00:20:54.075 { 00:20:54.075 "name": "raid_bdev1", 00:20:54.075 "raid_level": "raid1", 00:20:54.075 "base_bdevs": [ 00:20:54.075 "malloc1", 00:20:54.075 "malloc2", 00:20:54.075 "malloc3", 00:20:54.075 "malloc4" 00:20:54.075 ], 00:20:54.075 "superblock": false, 00:20:54.075 "method": "bdev_raid_create", 00:20:54.075 "req_id": 1 00:20:54.075 } 00:20:54.075 Got JSON-RPC error response 00:20:54.075 response: 00:20:54.075 { 00:20:54.075 "code": -17, 00:20:54.075 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:20:54.075 } 00:20:54.075 15:59:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@652 -- # es=1 00:20:54.075 15:59:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:20:54.076 15:59:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:20:54.076 15:59:59 bdev_raid.raid_superblock_test -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:20:54.076 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:54.076 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 
00:20:54.334 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:20:54.334 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:20:54.334 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:20:54.334 [2024-06-10 15:59:59.842115] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:20:54.334 [2024-06-10 15:59:59.842155] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:54.334 [2024-06-10 15:59:59.842171] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c66d20 00:20:54.334 [2024-06-10 15:59:59.842186] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:54.334 [2024-06-10 15:59:59.843842] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:54.334 [2024-06-10 15:59:59.843869] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:20:54.334 [2024-06-10 15:59:59.843929] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:20:54.334 [2024-06-10 15:59:59.843966] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:20:54.593 pt1 00:20:54.593 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:20:54.593 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:54.593 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:54.593 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:54.593 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local 
strip_size=0 00:20:54.593 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:54.593 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:54.593 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:54.593 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:54.593 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:54.593 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:54.593 15:59:59 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:54.593 16:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:54.593 "name": "raid_bdev1", 00:20:54.593 "uuid": "035bf9bc-9051-4994-a30b-08c1df0a6979", 00:20:54.593 "strip_size_kb": 0, 00:20:54.593 "state": "configuring", 00:20:54.593 "raid_level": "raid1", 00:20:54.593 "superblock": true, 00:20:54.593 "num_base_bdevs": 4, 00:20:54.593 "num_base_bdevs_discovered": 1, 00:20:54.593 "num_base_bdevs_operational": 4, 00:20:54.593 "base_bdevs_list": [ 00:20:54.593 { 00:20:54.593 "name": "pt1", 00:20:54.593 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:54.593 "is_configured": true, 00:20:54.593 "data_offset": 2048, 00:20:54.593 "data_size": 63488 00:20:54.593 }, 00:20:54.593 { 00:20:54.593 "name": null, 00:20:54.593 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:54.593 "is_configured": false, 00:20:54.593 "data_offset": 2048, 00:20:54.593 "data_size": 63488 00:20:54.593 }, 00:20:54.593 { 00:20:54.593 "name": null, 00:20:54.593 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:54.593 "is_configured": false, 00:20:54.593 "data_offset": 2048, 00:20:54.593 
"data_size": 63488 00:20:54.593 }, 00:20:54.593 { 00:20:54.593 "name": null, 00:20:54.593 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:54.593 "is_configured": false, 00:20:54.593 "data_offset": 2048, 00:20:54.593 "data_size": 63488 00:20:54.593 } 00:20:54.593 ] 00:20:54.593 }' 00:20:54.593 16:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:54.593 16:00:00 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:55.161 16:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@469 -- # '[' 4 -gt 2 ']' 00:20:55.162 16:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@471 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:55.420 [2024-06-10 16:00:00.864866] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:55.420 [2024-06-10 16:00:00.864910] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:55.420 [2024-06-10 16:00:00.864925] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c62b50 00:20:55.420 [2024-06-10 16:00:00.864935] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:55.420 [2024-06-10 16:00:00.865278] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:55.420 [2024-06-10 16:00:00.865295] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:55.420 [2024-06-10 16:00:00.865356] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:55.420 [2024-06-10 16:00:00.865373] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:55.420 pt2 00:20:55.420 16:00:00 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@472 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_delete pt2 00:20:55.678 [2024-06-10 16:00:01.121574] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt2 00:20:55.678 16:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@473 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 4 00:20:55.678 16:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:55.678 16:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:20:55.678 16:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:55.678 16:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:55.678 16:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:55.678 16:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:55.678 16:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:55.678 16:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:55.678 16:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:55.678 16:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:55.678 16:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:55.936 16:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:20:55.936 "name": "raid_bdev1", 00:20:55.936 "uuid": "035bf9bc-9051-4994-a30b-08c1df0a6979", 00:20:55.936 "strip_size_kb": 0, 00:20:55.936 "state": "configuring", 00:20:55.936 "raid_level": "raid1", 00:20:55.936 "superblock": true, 00:20:55.936 "num_base_bdevs": 4, 00:20:55.936 "num_base_bdevs_discovered": 1, 00:20:55.936 
"num_base_bdevs_operational": 4, 00:20:55.936 "base_bdevs_list": [ 00:20:55.936 { 00:20:55.936 "name": "pt1", 00:20:55.936 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:55.936 "is_configured": true, 00:20:55.936 "data_offset": 2048, 00:20:55.936 "data_size": 63488 00:20:55.936 }, 00:20:55.936 { 00:20:55.936 "name": null, 00:20:55.936 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:55.936 "is_configured": false, 00:20:55.936 "data_offset": 2048, 00:20:55.936 "data_size": 63488 00:20:55.936 }, 00:20:55.936 { 00:20:55.936 "name": null, 00:20:55.936 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:55.936 "is_configured": false, 00:20:55.936 "data_offset": 2048, 00:20:55.936 "data_size": 63488 00:20:55.936 }, 00:20:55.936 { 00:20:55.936 "name": null, 00:20:55.936 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:55.936 "is_configured": false, 00:20:55.936 "data_offset": 2048, 00:20:55.936 "data_size": 63488 00:20:55.936 } 00:20:55.936 ] 00:20:55.936 }' 00:20:55.936 16:00:01 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:55.936 16:00:01 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:56.868 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:20:56.868 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:56.868 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:20:56.868 [2024-06-10 16:00:02.264756] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:20:56.868 [2024-06-10 16:00:02.264805] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:56.868 [2024-06-10 16:00:02.264821] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c663c0 
00:20:56.868 [2024-06-10 16:00:02.264831] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:56.868 [2024-06-10 16:00:02.265179] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:56.868 [2024-06-10 16:00:02.265196] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:20:56.868 [2024-06-10 16:00:02.265264] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:20:56.868 [2024-06-10 16:00:02.265281] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:20:56.868 pt2 00:20:56.868 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:56.868 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:56.868 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:20:57.128 [2024-06-10 16:00:02.525439] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:20:57.128 [2024-06-10 16:00:02.525466] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:57.128 [2024-06-10 16:00:02.525478] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c62510 00:20:57.128 [2024-06-10 16:00:02.525487] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:57.128 [2024-06-10 16:00:02.525774] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:57.128 [2024-06-10 16:00:02.525789] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:20:57.128 [2024-06-10 16:00:02.525838] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:20:57.128 [2024-06-10 16:00:02.525854] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:20:57.128 pt3 00:20:57.128 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:57.128 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:57.128 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:20:57.416 [2024-06-10 16:00:02.782124] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:20:57.416 [2024-06-10 16:00:02.782153] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:20:57.416 [2024-06-10 16:00:02.782165] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c68940 00:20:57.416 [2024-06-10 16:00:02.782174] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:20:57.416 [2024-06-10 16:00:02.782450] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:20:57.416 [2024-06-10 16:00:02.782465] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:20:57.416 [2024-06-10 16:00:02.782511] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:20:57.416 [2024-06-10 16:00:02.782526] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:20:57.416 [2024-06-10 16:00:02.782644] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c62e70 00:20:57.416 [2024-06-10 16:00:02.782653] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:20:57.416 [2024-06-10 16:00:02.782822] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ab8c70 00:20:57.416 [2024-06-10 16:00:02.782971] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c62e70 
00:20:57.416 [2024-06-10 16:00:02.782980] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c62e70 00:20:57.416 [2024-06-10 16:00:02.783078] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:20:57.416 pt4 00:20:57.416 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:20:57.416 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:20:57.416 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:20:57.416 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:20:57.416 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:20:57.416 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:20:57.416 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:20:57.416 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:20:57.416 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:20:57.416 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:20:57.416 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:20:57.416 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:20:57.416 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:20:57.416 16:00:02 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:20:57.675 16:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:20:57.675 "name": "raid_bdev1", 00:20:57.675 "uuid": "035bf9bc-9051-4994-a30b-08c1df0a6979", 00:20:57.675 "strip_size_kb": 0, 00:20:57.675 "state": "online", 00:20:57.675 "raid_level": "raid1", 00:20:57.675 "superblock": true, 00:20:57.675 "num_base_bdevs": 4, 00:20:57.675 "num_base_bdevs_discovered": 4, 00:20:57.675 "num_base_bdevs_operational": 4, 00:20:57.675 "base_bdevs_list": [ 00:20:57.675 { 00:20:57.675 "name": "pt1", 00:20:57.675 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:57.675 "is_configured": true, 00:20:57.675 "data_offset": 2048, 00:20:57.675 "data_size": 63488 00:20:57.675 }, 00:20:57.675 { 00:20:57.675 "name": "pt2", 00:20:57.675 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:57.675 "is_configured": true, 00:20:57.675 "data_offset": 2048, 00:20:57.675 "data_size": 63488 00:20:57.675 }, 00:20:57.675 { 00:20:57.675 "name": "pt3", 00:20:57.675 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:57.675 "is_configured": true, 00:20:57.675 "data_offset": 2048, 00:20:57.675 "data_size": 63488 00:20:57.675 }, 00:20:57.675 { 00:20:57.675 "name": "pt4", 00:20:57.675 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:57.675 "is_configured": true, 00:20:57.675 "data_offset": 2048, 00:20:57.675 "data_size": 63488 00:20:57.675 } 00:20:57.675 ] 00:20:57.675 }' 00:20:57.675 16:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:20:57.675 16:00:03 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:20:58.240 16:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:20:58.240 16:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:20:58.240 16:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:20:58.240 16:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:20:58.240 16:00:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:20:58.240 16:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@198 -- # local name 00:20:58.240 16:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:20:58.240 16:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:20:58.499 [2024-06-10 16:00:03.913450] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:20:58.499 16:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:20:58.499 "name": "raid_bdev1", 00:20:58.499 "aliases": [ 00:20:58.499 "035bf9bc-9051-4994-a30b-08c1df0a6979" 00:20:58.499 ], 00:20:58.499 "product_name": "Raid Volume", 00:20:58.499 "block_size": 512, 00:20:58.499 "num_blocks": 63488, 00:20:58.499 "uuid": "035bf9bc-9051-4994-a30b-08c1df0a6979", 00:20:58.499 "assigned_rate_limits": { 00:20:58.499 "rw_ios_per_sec": 0, 00:20:58.499 "rw_mbytes_per_sec": 0, 00:20:58.499 "r_mbytes_per_sec": 0, 00:20:58.499 "w_mbytes_per_sec": 0 00:20:58.499 }, 00:20:58.499 "claimed": false, 00:20:58.499 "zoned": false, 00:20:58.499 "supported_io_types": { 00:20:58.499 "read": true, 00:20:58.499 "write": true, 00:20:58.499 "unmap": false, 00:20:58.499 "write_zeroes": true, 00:20:58.499 "flush": false, 00:20:58.499 "reset": true, 00:20:58.499 "compare": false, 00:20:58.499 "compare_and_write": false, 00:20:58.499 "abort": false, 00:20:58.499 "nvme_admin": false, 00:20:58.499 "nvme_io": false 00:20:58.499 }, 00:20:58.499 "memory_domains": [ 00:20:58.499 { 00:20:58.499 "dma_device_id": "system", 00:20:58.499 "dma_device_type": 1 00:20:58.499 }, 00:20:58.499 { 00:20:58.499 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:58.499 "dma_device_type": 2 00:20:58.499 }, 00:20:58.499 { 00:20:58.499 "dma_device_id": "system", 00:20:58.499 "dma_device_type": 1 
00:20:58.499 }, 00:20:58.499 { 00:20:58.499 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:58.499 "dma_device_type": 2 00:20:58.499 }, 00:20:58.499 { 00:20:58.499 "dma_device_id": "system", 00:20:58.499 "dma_device_type": 1 00:20:58.499 }, 00:20:58.499 { 00:20:58.499 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:58.499 "dma_device_type": 2 00:20:58.499 }, 00:20:58.499 { 00:20:58.499 "dma_device_id": "system", 00:20:58.499 "dma_device_type": 1 00:20:58.499 }, 00:20:58.499 { 00:20:58.499 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:58.499 "dma_device_type": 2 00:20:58.499 } 00:20:58.499 ], 00:20:58.499 "driver_specific": { 00:20:58.499 "raid": { 00:20:58.499 "uuid": "035bf9bc-9051-4994-a30b-08c1df0a6979", 00:20:58.499 "strip_size_kb": 0, 00:20:58.499 "state": "online", 00:20:58.499 "raid_level": "raid1", 00:20:58.499 "superblock": true, 00:20:58.499 "num_base_bdevs": 4, 00:20:58.499 "num_base_bdevs_discovered": 4, 00:20:58.499 "num_base_bdevs_operational": 4, 00:20:58.499 "base_bdevs_list": [ 00:20:58.499 { 00:20:58.499 "name": "pt1", 00:20:58.499 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:58.499 "is_configured": true, 00:20:58.499 "data_offset": 2048, 00:20:58.499 "data_size": 63488 00:20:58.499 }, 00:20:58.499 { 00:20:58.499 "name": "pt2", 00:20:58.499 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:58.499 "is_configured": true, 00:20:58.499 "data_offset": 2048, 00:20:58.499 "data_size": 63488 00:20:58.499 }, 00:20:58.499 { 00:20:58.499 "name": "pt3", 00:20:58.499 "uuid": "00000000-0000-0000-0000-000000000003", 00:20:58.499 "is_configured": true, 00:20:58.499 "data_offset": 2048, 00:20:58.499 "data_size": 63488 00:20:58.499 }, 00:20:58.499 { 00:20:58.499 "name": "pt4", 00:20:58.499 "uuid": "00000000-0000-0000-0000-000000000004", 00:20:58.499 "is_configured": true, 00:20:58.499 "data_offset": 2048, 00:20:58.499 "data_size": 63488 00:20:58.499 } 00:20:58.499 ] 00:20:58.499 } 00:20:58.499 } 00:20:58.499 }' 00:20:58.499 16:00:03 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:20:58.499 16:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:20:58.499 pt2 00:20:58.499 pt3 00:20:58.499 pt4' 00:20:58.499 16:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:58.499 16:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:20:58.499 16:00:03 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:58.758 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:58.758 "name": "pt1", 00:20:58.758 "aliases": [ 00:20:58.758 "00000000-0000-0000-0000-000000000001" 00:20:58.758 ], 00:20:58.758 "product_name": "passthru", 00:20:58.758 "block_size": 512, 00:20:58.758 "num_blocks": 65536, 00:20:58.758 "uuid": "00000000-0000-0000-0000-000000000001", 00:20:58.758 "assigned_rate_limits": { 00:20:58.758 "rw_ios_per_sec": 0, 00:20:58.758 "rw_mbytes_per_sec": 0, 00:20:58.758 "r_mbytes_per_sec": 0, 00:20:58.758 "w_mbytes_per_sec": 0 00:20:58.758 }, 00:20:58.758 "claimed": true, 00:20:58.758 "claim_type": "exclusive_write", 00:20:58.758 "zoned": false, 00:20:58.758 "supported_io_types": { 00:20:58.758 "read": true, 00:20:58.758 "write": true, 00:20:58.758 "unmap": true, 00:20:58.758 "write_zeroes": true, 00:20:58.758 "flush": true, 00:20:58.758 "reset": true, 00:20:58.758 "compare": false, 00:20:58.758 "compare_and_write": false, 00:20:58.758 "abort": true, 00:20:58.758 "nvme_admin": false, 00:20:58.758 "nvme_io": false 00:20:58.758 }, 00:20:58.758 "memory_domains": [ 00:20:58.758 { 00:20:58.758 "dma_device_id": "system", 00:20:58.758 "dma_device_type": 1 00:20:58.758 }, 00:20:58.758 { 00:20:58.758 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 
00:20:58.758 "dma_device_type": 2 00:20:58.758 } 00:20:58.758 ], 00:20:58.758 "driver_specific": { 00:20:58.758 "passthru": { 00:20:58.758 "name": "pt1", 00:20:58.758 "base_bdev_name": "malloc1" 00:20:58.758 } 00:20:58.758 } 00:20:58.758 }' 00:20:58.758 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:58.758 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:59.017 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:59.017 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:59.017 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:59.017 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:59.017 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:59.017 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:59.017 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:59.017 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:59.017 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:59.275 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:59.275 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:59.275 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:20:59.275 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:20:59.533 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:20:59.533 "name": "pt2", 00:20:59.533 "aliases": [ 00:20:59.533 
"00000000-0000-0000-0000-000000000002" 00:20:59.533 ], 00:20:59.533 "product_name": "passthru", 00:20:59.533 "block_size": 512, 00:20:59.533 "num_blocks": 65536, 00:20:59.533 "uuid": "00000000-0000-0000-0000-000000000002", 00:20:59.534 "assigned_rate_limits": { 00:20:59.534 "rw_ios_per_sec": 0, 00:20:59.534 "rw_mbytes_per_sec": 0, 00:20:59.534 "r_mbytes_per_sec": 0, 00:20:59.534 "w_mbytes_per_sec": 0 00:20:59.534 }, 00:20:59.534 "claimed": true, 00:20:59.534 "claim_type": "exclusive_write", 00:20:59.534 "zoned": false, 00:20:59.534 "supported_io_types": { 00:20:59.534 "read": true, 00:20:59.534 "write": true, 00:20:59.534 "unmap": true, 00:20:59.534 "write_zeroes": true, 00:20:59.534 "flush": true, 00:20:59.534 "reset": true, 00:20:59.534 "compare": false, 00:20:59.534 "compare_and_write": false, 00:20:59.534 "abort": true, 00:20:59.534 "nvme_admin": false, 00:20:59.534 "nvme_io": false 00:20:59.534 }, 00:20:59.534 "memory_domains": [ 00:20:59.534 { 00:20:59.534 "dma_device_id": "system", 00:20:59.534 "dma_device_type": 1 00:20:59.534 }, 00:20:59.534 { 00:20:59.534 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:20:59.534 "dma_device_type": 2 00:20:59.534 } 00:20:59.534 ], 00:20:59.534 "driver_specific": { 00:20:59.534 "passthru": { 00:20:59.534 "name": "pt2", 00:20:59.534 "base_bdev_name": "malloc2" 00:20:59.534 } 00:20:59.534 } 00:20:59.534 }' 00:20:59.534 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:59.534 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:20:59.534 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:20:59.534 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:59.534 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:20:59.534 16:00:04 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:20:59.534 16:00:04 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:59.534 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:20:59.790 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:20:59.790 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:59.790 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:20:59.790 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:20:59.790 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:20:59.790 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt3 00:20:59.790 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:00.048 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:00.048 "name": "pt3", 00:21:00.048 "aliases": [ 00:21:00.048 "00000000-0000-0000-0000-000000000003" 00:21:00.048 ], 00:21:00.048 "product_name": "passthru", 00:21:00.048 "block_size": 512, 00:21:00.048 "num_blocks": 65536, 00:21:00.048 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:00.048 "assigned_rate_limits": { 00:21:00.048 "rw_ios_per_sec": 0, 00:21:00.048 "rw_mbytes_per_sec": 0, 00:21:00.048 "r_mbytes_per_sec": 0, 00:21:00.048 "w_mbytes_per_sec": 0 00:21:00.048 }, 00:21:00.048 "claimed": true, 00:21:00.048 "claim_type": "exclusive_write", 00:21:00.048 "zoned": false, 00:21:00.048 "supported_io_types": { 00:21:00.048 "read": true, 00:21:00.048 "write": true, 00:21:00.048 "unmap": true, 00:21:00.048 "write_zeroes": true, 00:21:00.048 "flush": true, 00:21:00.048 "reset": true, 00:21:00.048 "compare": false, 00:21:00.048 "compare_and_write": false, 00:21:00.048 "abort": true, 00:21:00.048 
"nvme_admin": false, 00:21:00.048 "nvme_io": false 00:21:00.048 }, 00:21:00.048 "memory_domains": [ 00:21:00.048 { 00:21:00.048 "dma_device_id": "system", 00:21:00.048 "dma_device_type": 1 00:21:00.048 }, 00:21:00.048 { 00:21:00.048 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:00.048 "dma_device_type": 2 00:21:00.048 } 00:21:00.048 ], 00:21:00.048 "driver_specific": { 00:21:00.048 "passthru": { 00:21:00.048 "name": "pt3", 00:21:00.048 "base_bdev_name": "malloc3" 00:21:00.048 } 00:21:00.048 } 00:21:00.048 }' 00:21:00.048 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:00.048 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:00.048 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:00.048 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:00.048 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:00.048 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:00.048 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:00.048 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:00.306 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:00.306 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:00.306 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:00.306 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:00.306 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:21:00.306 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_get_bdevs -b pt4 00:21:00.306 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:21:00.564 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:21:00.564 "name": "pt4", 00:21:00.564 "aliases": [ 00:21:00.564 "00000000-0000-0000-0000-000000000004" 00:21:00.564 ], 00:21:00.564 "product_name": "passthru", 00:21:00.564 "block_size": 512, 00:21:00.564 "num_blocks": 65536, 00:21:00.564 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:00.564 "assigned_rate_limits": { 00:21:00.564 "rw_ios_per_sec": 0, 00:21:00.564 "rw_mbytes_per_sec": 0, 00:21:00.564 "r_mbytes_per_sec": 0, 00:21:00.564 "w_mbytes_per_sec": 0 00:21:00.564 }, 00:21:00.564 "claimed": true, 00:21:00.564 "claim_type": "exclusive_write", 00:21:00.564 "zoned": false, 00:21:00.564 "supported_io_types": { 00:21:00.564 "read": true, 00:21:00.564 "write": true, 00:21:00.564 "unmap": true, 00:21:00.564 "write_zeroes": true, 00:21:00.564 "flush": true, 00:21:00.564 "reset": true, 00:21:00.564 "compare": false, 00:21:00.564 "compare_and_write": false, 00:21:00.564 "abort": true, 00:21:00.564 "nvme_admin": false, 00:21:00.564 "nvme_io": false 00:21:00.564 }, 00:21:00.564 "memory_domains": [ 00:21:00.564 { 00:21:00.564 "dma_device_id": "system", 00:21:00.564 "dma_device_type": 1 00:21:00.564 }, 00:21:00.564 { 00:21:00.564 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:21:00.564 "dma_device_type": 2 00:21:00.564 } 00:21:00.564 ], 00:21:00.564 "driver_specific": { 00:21:00.564 "passthru": { 00:21:00.564 "name": "pt4", 00:21:00.564 "base_bdev_name": "malloc4" 00:21:00.564 } 00:21:00.564 } 00:21:00.564 }' 00:21:00.564 16:00:05 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:00.564 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:21:00.564 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@205 -- # [[ 512 == 512 ]] 00:21:00.564 16:00:06 bdev_raid.raid_superblock_test -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:00.822 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:21:00.822 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:21:00.822 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:00.822 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:21:00.822 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:21:00.822 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:00.822 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:21:00.822 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:21:00.822 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:21:00.822 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:01.080 [2024-06-10 16:00:06.532449] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:01.080 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@486 -- # '[' 035bf9bc-9051-4994-a30b-08c1df0a6979 '!=' 035bf9bc-9051-4994-a30b-08c1df0a6979 ']' 00:21:01.080 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:21:01.080 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:01.080 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@214 -- # return 0 00:21:01.080 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:21:01.339 [2024-06-10 16:00:06.788877] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:21:01.339 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:01.339 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:01.339 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:01.339 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:01.339 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:01.339 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:01.339 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:01.339 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:01.339 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:01.339 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:01.339 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:01.339 16:00:06 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:01.597 16:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:01.597 "name": "raid_bdev1", 00:21:01.597 "uuid": "035bf9bc-9051-4994-a30b-08c1df0a6979", 00:21:01.597 "strip_size_kb": 0, 00:21:01.597 "state": "online", 00:21:01.597 "raid_level": "raid1", 00:21:01.597 "superblock": true, 00:21:01.597 "num_base_bdevs": 4, 00:21:01.597 "num_base_bdevs_discovered": 3, 00:21:01.597 "num_base_bdevs_operational": 3, 00:21:01.597 "base_bdevs_list": [ 00:21:01.597 { 
00:21:01.597 "name": null, 00:21:01.597 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:01.597 "is_configured": false, 00:21:01.597 "data_offset": 2048, 00:21:01.597 "data_size": 63488 00:21:01.597 }, 00:21:01.597 { 00:21:01.597 "name": "pt2", 00:21:01.597 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:01.597 "is_configured": true, 00:21:01.597 "data_offset": 2048, 00:21:01.597 "data_size": 63488 00:21:01.597 }, 00:21:01.597 { 00:21:01.597 "name": "pt3", 00:21:01.597 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:01.597 "is_configured": true, 00:21:01.597 "data_offset": 2048, 00:21:01.597 "data_size": 63488 00:21:01.597 }, 00:21:01.597 { 00:21:01.597 "name": "pt4", 00:21:01.597 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:01.597 "is_configured": true, 00:21:01.597 "data_offset": 2048, 00:21:01.597 "data_size": 63488 00:21:01.597 } 00:21:01.597 ] 00:21:01.597 }' 00:21:01.597 16:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:01.597 16:00:07 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:02.163 16:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:02.421 [2024-06-10 16:00:07.875784] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:02.421 [2024-06-10 16:00:07.875811] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:02.422 [2024-06-10 16:00:07.875863] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:02.422 [2024-06-10 16:00:07.875931] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:02.422 [2024-06-10 16:00:07.875940] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c62e70 name raid_bdev1, state offline 00:21:02.422 16:00:07 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:02.422 16:00:07 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:21:02.681 16:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:21:02.681 16:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:21:02.681 16:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:21:02.681 16:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:02.681 16:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:21:02.939 16:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:21:02.939 16:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:02.939 16:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt3 00:21:03.197 16:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:21:03.197 16:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:03.197 16:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:03.454 16:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:21:03.454 16:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:21:03.454 16:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:21:03.454 16:00:08 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:21:03.454 16:00:08 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:21:03.712 [2024-06-10 16:00:09.143086] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:21:03.712 [2024-06-10 16:00:09.143130] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:03.712 [2024-06-10 16:00:09.143145] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c613a0 00:21:03.712 [2024-06-10 16:00:09.143154] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:03.712 [2024-06-10 16:00:09.144863] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:03.712 [2024-06-10 16:00:09.144893] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:21:03.712 [2024-06-10 16:00:09.144968] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:21:03.712 [2024-06-10 16:00:09.144997] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:03.712 pt2 00:21:03.712 16:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:21:03.712 16:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:03.712 16:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:03.712 16:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:03.712 16:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:03.712 16:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=3 00:21:03.712 16:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:03.712 16:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:03.712 16:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:03.712 16:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:03.712 16:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:03.712 16:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:03.970 16:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:03.970 "name": "raid_bdev1", 00:21:03.970 "uuid": "035bf9bc-9051-4994-a30b-08c1df0a6979", 00:21:03.970 "strip_size_kb": 0, 00:21:03.970 "state": "configuring", 00:21:03.970 "raid_level": "raid1", 00:21:03.970 "superblock": true, 00:21:03.970 "num_base_bdevs": 4, 00:21:03.970 "num_base_bdevs_discovered": 1, 00:21:03.970 "num_base_bdevs_operational": 3, 00:21:03.970 "base_bdevs_list": [ 00:21:03.970 { 00:21:03.970 "name": null, 00:21:03.970 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:03.970 "is_configured": false, 00:21:03.970 "data_offset": 2048, 00:21:03.970 "data_size": 63488 00:21:03.970 }, 00:21:03.970 { 00:21:03.970 "name": "pt2", 00:21:03.970 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:03.970 "is_configured": true, 00:21:03.970 "data_offset": 2048, 00:21:03.970 "data_size": 63488 00:21:03.970 }, 00:21:03.970 { 00:21:03.970 "name": null, 00:21:03.970 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:03.970 "is_configured": false, 00:21:03.970 "data_offset": 2048, 00:21:03.970 "data_size": 63488 00:21:03.970 }, 00:21:03.970 { 00:21:03.970 "name": null, 00:21:03.970 "uuid": 
"00000000-0000-0000-0000-000000000004", 00:21:03.970 "is_configured": false, 00:21:03.970 "data_offset": 2048, 00:21:03.970 "data_size": 63488 00:21:03.970 } 00:21:03.970 ] 00:21:03.970 }' 00:21:03.970 16:00:09 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:03.970 16:00:09 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:04.534 16:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:21:04.534 16:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:21:04.534 16:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@511 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc3 -p pt3 -u 00000000-0000-0000-0000-000000000003 00:21:04.792 [2024-06-10 16:00:10.250070] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc3 00:21:04.792 [2024-06-10 16:00:10.250119] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:04.792 [2024-06-10 16:00:10.250137] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c62780 00:21:04.792 [2024-06-10 16:00:10.250147] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:04.792 [2024-06-10 16:00:10.250496] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:04.792 [2024-06-10 16:00:10.250512] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt3 00:21:04.792 [2024-06-10 16:00:10.250574] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt3 00:21:04.792 [2024-06-10 16:00:10.250591] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:04.792 pt3 00:21:04.792 16:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@514 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:21:04.792 16:00:10 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:04.792 16:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:04.792 16:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:04.792 16:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:04.792 16:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:04.792 16:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:04.792 16:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:04.792 16:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:04.792 16:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:04.792 16:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:04.792 16:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.051 16:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:05.051 "name": "raid_bdev1", 00:21:05.051 "uuid": "035bf9bc-9051-4994-a30b-08c1df0a6979", 00:21:05.051 "strip_size_kb": 0, 00:21:05.051 "state": "configuring", 00:21:05.051 "raid_level": "raid1", 00:21:05.051 "superblock": true, 00:21:05.051 "num_base_bdevs": 4, 00:21:05.051 "num_base_bdevs_discovered": 2, 00:21:05.051 "num_base_bdevs_operational": 3, 00:21:05.051 "base_bdevs_list": [ 00:21:05.051 { 00:21:05.051 "name": null, 00:21:05.051 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:05.051 "is_configured": false, 00:21:05.051 "data_offset": 2048, 00:21:05.051 "data_size": 63488 00:21:05.051 }, 
00:21:05.051 { 00:21:05.051 "name": "pt2", 00:21:05.051 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:05.051 "is_configured": true, 00:21:05.051 "data_offset": 2048, 00:21:05.051 "data_size": 63488 00:21:05.051 }, 00:21:05.051 { 00:21:05.051 "name": "pt3", 00:21:05.051 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:05.051 "is_configured": true, 00:21:05.051 "data_offset": 2048, 00:21:05.051 "data_size": 63488 00:21:05.051 }, 00:21:05.051 { 00:21:05.051 "name": null, 00:21:05.051 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:05.051 "is_configured": false, 00:21:05.051 "data_offset": 2048, 00:21:05.051 "data_size": 63488 00:21:05.051 } 00:21:05.051 ] 00:21:05.051 }' 00:21:05.051 16:00:10 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:05.051 16:00:10 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:05.618 16:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i++ )) 00:21:05.618 16:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:21:05.618 16:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@518 -- # i=3 00:21:05.618 16:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:05.876 [2024-06-10 16:00:11.284847] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:05.876 [2024-06-10 16:00:11.284894] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:05.876 [2024-06-10 16:00:11.284913] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ab6960 00:21:05.876 [2024-06-10 16:00:11.284923] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:05.876 [2024-06-10 16:00:11.285272] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:05.876 [2024-06-10 16:00:11.285287] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:05.876 [2024-06-10 16:00:11.285348] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:05.876 [2024-06-10 16:00:11.285366] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:05.876 [2024-06-10 16:00:11.285482] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c59ee0 00:21:05.876 [2024-06-10 16:00:11.285491] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:05.876 [2024-06-10 16:00:11.285668] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c5b770 00:21:05.876 [2024-06-10 16:00:11.285806] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c59ee0 00:21:05.876 [2024-06-10 16:00:11.285814] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c59ee0 00:21:05.876 [2024-06-10 16:00:11.285915] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:05.876 pt4 00:21:05.876 16:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:05.876 16:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:05.876 16:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:05.876 16:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:05.877 16:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:05.877 16:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:05.877 16:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:21:05.877 16:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:05.877 16:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:05.877 16:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:05.877 16:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:05.877 16:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:06.135 16:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:06.135 "name": "raid_bdev1", 00:21:06.135 "uuid": "035bf9bc-9051-4994-a30b-08c1df0a6979", 00:21:06.135 "strip_size_kb": 0, 00:21:06.135 "state": "online", 00:21:06.135 "raid_level": "raid1", 00:21:06.135 "superblock": true, 00:21:06.135 "num_base_bdevs": 4, 00:21:06.135 "num_base_bdevs_discovered": 3, 00:21:06.135 "num_base_bdevs_operational": 3, 00:21:06.135 "base_bdevs_list": [ 00:21:06.135 { 00:21:06.135 "name": null, 00:21:06.135 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:06.135 "is_configured": false, 00:21:06.135 "data_offset": 2048, 00:21:06.135 "data_size": 63488 00:21:06.135 }, 00:21:06.135 { 00:21:06.135 "name": "pt2", 00:21:06.135 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:06.135 "is_configured": true, 00:21:06.135 "data_offset": 2048, 00:21:06.135 "data_size": 63488 00:21:06.135 }, 00:21:06.135 { 00:21:06.135 "name": "pt3", 00:21:06.135 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:06.135 "is_configured": true, 00:21:06.135 "data_offset": 2048, 00:21:06.135 "data_size": 63488 00:21:06.135 }, 00:21:06.135 { 00:21:06.135 "name": "pt4", 00:21:06.135 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:06.135 "is_configured": true, 00:21:06.135 "data_offset": 2048, 00:21:06.135 "data_size": 63488 
00:21:06.135 } 00:21:06.135 ] 00:21:06.135 }' 00:21:06.135 16:00:11 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:06.135 16:00:11 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:06.702 16:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:06.961 [2024-06-10 16:00:12.439936] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:06.961 [2024-06-10 16:00:12.439964] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:06.961 [2024-06-10 16:00:12.440014] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:06.961 [2024-06-10 16:00:12.440079] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:06.961 [2024-06-10 16:00:12.440087] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c59ee0 name raid_bdev1, state offline 00:21:06.961 16:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:06.961 16:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:21:07.218 16:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:21:07.218 16:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:21:07.218 16:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@531 -- # '[' 4 -gt 2 ']' 00:21:07.218 16:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@533 -- # i=3 00:21:07.218 16:00:12 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@534 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt4 00:21:07.476 16:00:12 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:21:07.734 [2024-06-10 16:00:13.213975] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:21:07.734 [2024-06-10 16:00:13.214017] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:07.734 [2024-06-10 16:00:13.214032] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ab6960 00:21:07.734 [2024-06-10 16:00:13.214041] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:07.734 [2024-06-10 16:00:13.215716] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:07.734 [2024-06-10 16:00:13.215743] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:21:07.734 [2024-06-10 16:00:13.215801] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:21:07.734 [2024-06-10 16:00:13.215827] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:21:07.734 [2024-06-10 16:00:13.215928] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:21:07.734 [2024-06-10 16:00:13.215944] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:07.734 [2024-06-10 16:00:13.215966] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c68e50 name raid_bdev1, state configuring 00:21:07.734 [2024-06-10 16:00:13.215995] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:21:07.734 [2024-06-10 16:00:13.216078] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt3 is claimed 00:21:07.734 pt1 00:21:07.734 16:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@541 -- # '[' 4 -gt 2 ']' 
00:21:07.734 16:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@544 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 3 00:21:07.734 16:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:07.734 16:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:21:07.734 16:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:07.734 16:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:07.734 16:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:07.734 16:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:07.734 16:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:07.734 16:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:07.734 16:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:07.734 16:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:07.734 16:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:07.992 16:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:07.992 "name": "raid_bdev1", 00:21:07.992 "uuid": "035bf9bc-9051-4994-a30b-08c1df0a6979", 00:21:07.992 "strip_size_kb": 0, 00:21:07.992 "state": "configuring", 00:21:07.992 "raid_level": "raid1", 00:21:07.992 "superblock": true, 00:21:07.992 "num_base_bdevs": 4, 00:21:07.992 "num_base_bdevs_discovered": 2, 00:21:07.992 "num_base_bdevs_operational": 3, 00:21:07.992 "base_bdevs_list": [ 00:21:07.992 { 00:21:07.992 "name": null, 00:21:07.992 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:21:07.992 "is_configured": false, 00:21:07.992 "data_offset": 2048, 00:21:07.992 "data_size": 63488 00:21:07.992 }, 00:21:07.992 { 00:21:07.992 "name": "pt2", 00:21:07.992 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:07.992 "is_configured": true, 00:21:07.992 "data_offset": 2048, 00:21:07.992 "data_size": 63488 00:21:07.992 }, 00:21:07.992 { 00:21:07.992 "name": "pt3", 00:21:07.992 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:07.992 "is_configured": true, 00:21:07.992 "data_offset": 2048, 00:21:07.992 "data_size": 63488 00:21:07.992 }, 00:21:07.992 { 00:21:07.992 "name": null, 00:21:07.992 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:07.992 "is_configured": false, 00:21:07.992 "data_offset": 2048, 00:21:07.992 "data_size": 63488 00:21:07.992 } 00:21:07.992 ] 00:21:07.992 }' 00:21:07.992 16:00:13 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:07.992 16:00:13 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:08.926 16:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs configuring 00:21:08.926 16:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:21:08.926 16:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@545 -- # [[ false == \f\a\l\s\e ]] 00:21:08.926 16:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@548 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc4 -p pt4 -u 00000000-0000-0000-0000-000000000004 00:21:09.184 [2024-06-10 16:00:14.601677] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc4 00:21:09.184 [2024-06-10 16:00:14.601727] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:09.184 
[2024-06-10 16:00:14.601745] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c623f0 00:21:09.184 [2024-06-10 16:00:14.601755] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:09.184 [2024-06-10 16:00:14.602121] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:09.184 [2024-06-10 16:00:14.602137] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt4 00:21:09.184 [2024-06-10 16:00:14.602196] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt4 00:21:09.184 [2024-06-10 16:00:14.602215] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt4 is claimed 00:21:09.185 [2024-06-10 16:00:14.602329] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1ab7310 00:21:09.185 [2024-06-10 16:00:14.602338] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:09.185 [2024-06-10 16:00:14.602516] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c666b0 00:21:09.185 [2024-06-10 16:00:14.602655] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1ab7310 00:21:09.185 [2024-06-10 16:00:14.602664] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1ab7310 00:21:09.185 [2024-06-10 16:00:14.602764] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:09.185 pt4 00:21:09.185 16:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:09.185 16:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:09.185 16:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:09.185 16:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:09.185 16:00:14 
bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:09.185 16:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:09.185 16:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:09.185 16:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:09.185 16:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:09.185 16:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:09.185 16:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:09.185 16:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:09.444 16:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:09.444 "name": "raid_bdev1", 00:21:09.444 "uuid": "035bf9bc-9051-4994-a30b-08c1df0a6979", 00:21:09.444 "strip_size_kb": 0, 00:21:09.444 "state": "online", 00:21:09.444 "raid_level": "raid1", 00:21:09.444 "superblock": true, 00:21:09.444 "num_base_bdevs": 4, 00:21:09.444 "num_base_bdevs_discovered": 3, 00:21:09.444 "num_base_bdevs_operational": 3, 00:21:09.444 "base_bdevs_list": [ 00:21:09.444 { 00:21:09.444 "name": null, 00:21:09.444 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:09.444 "is_configured": false, 00:21:09.444 "data_offset": 2048, 00:21:09.444 "data_size": 63488 00:21:09.444 }, 00:21:09.444 { 00:21:09.444 "name": "pt2", 00:21:09.444 "uuid": "00000000-0000-0000-0000-000000000002", 00:21:09.444 "is_configured": true, 00:21:09.444 "data_offset": 2048, 00:21:09.444 "data_size": 63488 00:21:09.444 }, 00:21:09.444 { 00:21:09.444 "name": "pt3", 00:21:09.444 "uuid": "00000000-0000-0000-0000-000000000003", 00:21:09.444 
"is_configured": true, 00:21:09.444 "data_offset": 2048, 00:21:09.444 "data_size": 63488 00:21:09.444 }, 00:21:09.444 { 00:21:09.444 "name": "pt4", 00:21:09.444 "uuid": "00000000-0000-0000-0000-000000000004", 00:21:09.444 "is_configured": true, 00:21:09.444 "data_offset": 2048, 00:21:09.444 "data_size": 63488 00:21:09.444 } 00:21:09.444 ] 00:21:09.444 }' 00:21:09.444 16:00:14 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:09.444 16:00:14 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:10.013 16:00:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:21:10.013 16:00:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:21:10.272 16:00:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:21:10.272 16:00:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:10.272 16:00:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:21:10.531 [2024-06-10 16:00:15.881383] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:10.531 16:00:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@557 -- # '[' 035bf9bc-9051-4994-a30b-08c1df0a6979 '!=' 035bf9bc-9051-4994-a30b-08c1df0a6979 ']' 00:21:10.531 16:00:15 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@562 -- # killprocess 2754953 00:21:10.531 16:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@949 -- # '[' -z 2754953 ']' 00:21:10.531 16:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@953 -- # kill -0 2754953 00:21:10.531 16:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # 
uname 00:21:10.531 16:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:21:10.531 16:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2754953 00:21:10.531 16:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:21:10.531 16:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:21:10.531 16:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2754953' 00:21:10.531 killing process with pid 2754953 00:21:10.531 16:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@968 -- # kill 2754953 00:21:10.531 [2024-06-10 16:00:15.948383] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:10.531 [2024-06-10 16:00:15.948441] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:10.531 [2024-06-10 16:00:15.948510] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:10.531 [2024-06-10 16:00:15.948521] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1ab7310 name raid_bdev1, state offline 00:21:10.531 16:00:15 bdev_raid.raid_superblock_test -- common/autotest_common.sh@973 -- # wait 2754953 00:21:10.531 [2024-06-10 16:00:15.982972] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:10.789 16:00:16 bdev_raid.raid_superblock_test -- bdev/bdev_raid.sh@564 -- # return 0 00:21:10.789 00:21:10.789 real 0m25.417s 00:21:10.789 user 0m47.568s 00:21:10.789 sys 0m3.514s 00:21:10.789 16:00:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:10.789 16:00:16 bdev_raid.raid_superblock_test -- common/autotest_common.sh@10 -- # set +x 00:21:10.789 ************************************ 00:21:10.789 END TEST raid_superblock_test 00:21:10.789 
************************************ 00:21:10.789 16:00:16 bdev_raid -- bdev/bdev_raid.sh@870 -- # run_test raid_read_error_test raid_io_error_test raid1 4 read 00:21:10.789 16:00:16 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:21:10.789 16:00:16 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:21:10.789 16:00:16 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:10.789 ************************************ 00:21:10.789 START TEST raid_read_error_test 00:21:10.789 ************************************ 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 4 read 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=read 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 
00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.9lE0FNsmyf 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2760204 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2760204 /var/tmp/spdk-raid.sock 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r 
/var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@830 -- # '[' -z 2760204 ']' 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:10.789 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:21:10.789 16:00:16 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:11.047 [2024-06-10 16:00:16.316501] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:21:11.047 [2024-06-10 16:00:16.316557] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2760204 ] 00:21:11.047 [2024-06-10 16:00:16.416162] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:11.047 [2024-06-10 16:00:16.516174] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:21:11.306 [2024-06-10 16:00:16.579340] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:11.306 [2024-06-10 16:00:16.579371] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:11.873 16:00:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:21:11.873 16:00:17 bdev_raid.raid_read_error_test -- common/autotest_common.sh@863 -- # return 0 00:21:11.873 16:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:11.873 16:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:12.130 BaseBdev1_malloc 00:21:12.130 16:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:12.388 true 00:21:12.388 16:00:17 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:12.647 [2024-06-10 16:00:18.029281] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:12.647 [2024-06-10 16:00:18.029323] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:12.647 
[2024-06-10 16:00:18.029342] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x127d150 00:21:12.647 [2024-06-10 16:00:18.029351] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:12.647 [2024-06-10 16:00:18.031243] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:12.647 [2024-06-10 16:00:18.031274] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:12.647 BaseBdev1 00:21:12.647 16:00:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:12.647 16:00:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:12.907 BaseBdev2_malloc 00:21:12.907 16:00:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:13.165 true 00:21:13.165 16:00:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:13.424 [2024-06-10 16:00:18.795936] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:13.424 [2024-06-10 16:00:18.795982] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:13.424 [2024-06-10 16:00:18.795999] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1281b50 00:21:13.424 [2024-06-10 16:00:18.796008] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:13.424 [2024-06-10 16:00:18.797600] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:13.424 [2024-06-10 16:00:18.797627] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:13.424 BaseBdev2 00:21:13.424 16:00:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:13.424 16:00:18 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:13.682 BaseBdev3_malloc 00:21:13.682 16:00:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:13.983 true 00:21:13.983 16:00:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:14.241 [2024-06-10 16:00:19.554525] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:14.241 [2024-06-10 16:00:19.554564] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:14.241 [2024-06-10 16:00:19.554581] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1282780 00:21:14.241 [2024-06-10 16:00:19.554591] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:14.241 [2024-06-10 16:00:19.556137] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:14.241 [2024-06-10 16:00:19.556165] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:14.241 BaseBdev3 00:21:14.241 16:00:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:14.241 16:00:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b 
BaseBdev4_malloc 00:21:14.499 BaseBdev4_malloc 00:21:14.499 16:00:19 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:14.758 true 00:21:14.758 16:00:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:15.015 [2024-06-10 16:00:20.321127] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:15.015 [2024-06-10 16:00:20.321168] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:15.015 [2024-06-10 16:00:20.321188] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x127bee0 00:21:15.015 [2024-06-10 16:00:20.321197] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:15.015 [2024-06-10 16:00:20.322816] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:15.015 [2024-06-10 16:00:20.322843] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:15.015 BaseBdev4 00:21:15.015 16:00:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:21:15.274 [2024-06-10 16:00:20.573820] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:15.274 [2024-06-10 16:00:20.575178] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:15.274 [2024-06-10 16:00:20.575247] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:15.274 [2024-06-10 16:00:20.575311] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev4 is claimed 00:21:15.274 [2024-06-10 16:00:20.575553] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x12855f0 00:21:15.274 [2024-06-10 16:00:20.575564] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:15.274 [2024-06-10 16:00:20.575756] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10cc510 00:21:15.274 [2024-06-10 16:00:20.575924] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x12855f0 00:21:15.274 [2024-06-10 16:00:20.575933] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x12855f0 00:21:15.274 [2024-06-10 16:00:20.576048] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:15.274 16:00:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:15.274 16:00:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:15.274 16:00:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:15.274 16:00:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:15.274 16:00:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:15.274 16:00:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:15.274 16:00:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:15.274 16:00:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:15.274 16:00:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:15.274 16:00:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:15.274 16:00:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:15.274 16:00:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:15.532 16:00:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:15.532 "name": "raid_bdev1", 00:21:15.532 "uuid": "a2868437-899c-4367-8b24-47be80cffdd8", 00:21:15.532 "strip_size_kb": 0, 00:21:15.532 "state": "online", 00:21:15.532 "raid_level": "raid1", 00:21:15.532 "superblock": true, 00:21:15.532 "num_base_bdevs": 4, 00:21:15.532 "num_base_bdevs_discovered": 4, 00:21:15.532 "num_base_bdevs_operational": 4, 00:21:15.532 "base_bdevs_list": [ 00:21:15.532 { 00:21:15.532 "name": "BaseBdev1", 00:21:15.532 "uuid": "60a592d7-97d7-51ed-8227-1b747fe683b7", 00:21:15.532 "is_configured": true, 00:21:15.532 "data_offset": 2048, 00:21:15.532 "data_size": 63488 00:21:15.532 }, 00:21:15.532 { 00:21:15.532 "name": "BaseBdev2", 00:21:15.532 "uuid": "733d37eb-1c65-56dc-aea7-23cca07ded92", 00:21:15.532 "is_configured": true, 00:21:15.532 "data_offset": 2048, 00:21:15.532 "data_size": 63488 00:21:15.532 }, 00:21:15.532 { 00:21:15.532 "name": "BaseBdev3", 00:21:15.532 "uuid": "f6e40172-f76b-5aa4-a301-b1e601a97bae", 00:21:15.532 "is_configured": true, 00:21:15.532 "data_offset": 2048, 00:21:15.532 "data_size": 63488 00:21:15.532 }, 00:21:15.532 { 00:21:15.532 "name": "BaseBdev4", 00:21:15.532 "uuid": "03ccf7cd-82c2-5c3a-9676-2ad78c4adc79", 00:21:15.532 "is_configured": true, 00:21:15.532 "data_offset": 2048, 00:21:15.532 "data_size": 63488 00:21:15.532 } 00:21:15.532 ] 00:21:15.532 }' 00:21:15.532 16:00:20 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:15.532 16:00:20 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:16.099 16:00:21 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:21:16.099 16:00:21 
bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:16.099 [2024-06-10 16:00:21.540650] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10cc450 00:21:17.045 16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc read failure 00:21:17.309 16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:21:17.309 16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = \r\a\i\d\1 ]] 00:21:17.309 16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@830 -- # [[ read = \w\r\i\t\e ]] 00:21:17.309 16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@833 -- # expected_num_base_bdevs=4 00:21:17.309 16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:17.309 16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:17.309 16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:17.309 16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:17.309 16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:17.309 16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:17.309 16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:17.309 16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:17.309 16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:17.309 
16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:17.309 16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:17.309 16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:17.568 16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:17.568 "name": "raid_bdev1", 00:21:17.568 "uuid": "a2868437-899c-4367-8b24-47be80cffdd8", 00:21:17.568 "strip_size_kb": 0, 00:21:17.568 "state": "online", 00:21:17.568 "raid_level": "raid1", 00:21:17.568 "superblock": true, 00:21:17.568 "num_base_bdevs": 4, 00:21:17.568 "num_base_bdevs_discovered": 4, 00:21:17.568 "num_base_bdevs_operational": 4, 00:21:17.568 "base_bdevs_list": [ 00:21:17.568 { 00:21:17.568 "name": "BaseBdev1", 00:21:17.568 "uuid": "60a592d7-97d7-51ed-8227-1b747fe683b7", 00:21:17.568 "is_configured": true, 00:21:17.568 "data_offset": 2048, 00:21:17.568 "data_size": 63488 00:21:17.568 }, 00:21:17.568 { 00:21:17.568 "name": "BaseBdev2", 00:21:17.568 "uuid": "733d37eb-1c65-56dc-aea7-23cca07ded92", 00:21:17.568 "is_configured": true, 00:21:17.568 "data_offset": 2048, 00:21:17.568 "data_size": 63488 00:21:17.568 }, 00:21:17.568 { 00:21:17.568 "name": "BaseBdev3", 00:21:17.568 "uuid": "f6e40172-f76b-5aa4-a301-b1e601a97bae", 00:21:17.568 "is_configured": true, 00:21:17.568 "data_offset": 2048, 00:21:17.568 "data_size": 63488 00:21:17.568 }, 00:21:17.568 { 00:21:17.568 "name": "BaseBdev4", 00:21:17.568 "uuid": "03ccf7cd-82c2-5c3a-9676-2ad78c4adc79", 00:21:17.568 "is_configured": true, 00:21:17.568 "data_offset": 2048, 00:21:17.568 "data_size": 63488 00:21:17.568 } 00:21:17.568 ] 00:21:17.568 }' 00:21:17.568 16:00:22 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:17.568 16:00:22 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@10 -- # set +x 00:21:18.135 16:00:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:18.393 [2024-06-10 16:00:23.816484] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:18.393 [2024-06-10 16:00:23.816518] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:18.393 [2024-06-10 16:00:23.819913] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:18.393 [2024-06-10 16:00:23.819952] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:18.393 [2024-06-10 16:00:23.820087] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:18.393 [2024-06-10 16:00:23.820097] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x12855f0 name raid_bdev1, state offline 00:21:18.393 0 00:21:18.393 16:00:23 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2760204 00:21:18.393 16:00:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@949 -- # '[' -z 2760204 ']' 00:21:18.393 16:00:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@953 -- # kill -0 2760204 00:21:18.393 16:00:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # uname 00:21:18.393 16:00:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:21:18.393 16:00:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2760204 00:21:18.393 16:00:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:21:18.393 16:00:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:21:18.393 16:00:23 bdev_raid.raid_read_error_test -- 
common/autotest_common.sh@967 -- # echo 'killing process with pid 2760204' 00:21:18.393 killing process with pid 2760204 00:21:18.394 16:00:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@968 -- # kill 2760204 00:21:18.394 [2024-06-10 16:00:23.891870] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:18.394 16:00:23 bdev_raid.raid_read_error_test -- common/autotest_common.sh@973 -- # wait 2760204 00:21:18.653 [2024-06-10 16:00:23.921273] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:18.653 16:00:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.9lE0FNsmyf 00:21:18.653 16:00:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # grep raid_bdev1 00:21:18.653 16:00:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:21:18.653 16:00:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:21:18.653 16:00:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:21:18.653 16:00:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:18.653 16:00:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:21:18.653 16:00:24 bdev_raid.raid_read_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:21:18.653 00:21:18.653 real 0m7.886s 00:21:18.653 user 0m12.979s 00:21:18.653 sys 0m1.118s 00:21:18.653 16:00:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:18.653 16:00:24 bdev_raid.raid_read_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:18.653 ************************************ 00:21:18.653 END TEST raid_read_error_test 00:21:18.653 ************************************ 00:21:18.912 16:00:24 bdev_raid -- bdev/bdev_raid.sh@871 -- # run_test raid_write_error_test raid_io_error_test raid1 4 write 00:21:18.912 16:00:24 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 
1 ']' 00:21:18.912 16:00:24 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:21:18.912 16:00:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:18.912 ************************************ 00:21:18.912 START TEST raid_write_error_test 00:21:18.912 ************************************ 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1124 -- # raid_io_error_test raid1 4 write 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@788 -- # local raid_level=raid1 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@789 -- # local num_base_bdevs=4 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@790 -- # local error_io_type=write 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i = 1 )) 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev1 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev2 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev3 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # echo BaseBdev4 00:21:18.912 16:00:24 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i++ )) 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # (( i <= num_base_bdevs )) 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@791 -- # local base_bdevs 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@792 -- # local raid_bdev_name=raid_bdev1 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@793 -- # local strip_size 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@794 -- # local create_arg 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@795 -- # local bdevperf_log 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@796 -- # local fail_per_s 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@798 -- # '[' raid1 '!=' raid1 ']' 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@802 -- # strip_size=0 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # mktemp -p /raidtest 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@805 -- # bdevperf_log=/raidtest/tmp.Bt2MfLI1tk 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@808 -- # raid_pid=2761533 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@809 -- # waitforlisten 2761533 /var/tmp/spdk-raid.sock 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@807 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 128k -q 1 -z -f -L bdev_raid 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@830 -- # '[' -z 2761533 ']' 00:21:18.912 16:00:24 
bdev_raid.raid_write_error_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:18.912 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:21:18.912 16:00:24 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:18.912 [2024-06-10 16:00:24.275161] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:21:18.912 [2024-06-10 16:00:24.275218] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2761533 ] 00:21:18.912 [2024-06-10 16:00:24.374911] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:19.170 [2024-06-10 16:00:24.470035] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:21:19.170 [2024-06-10 16:00:24.536717] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:19.170 [2024-06-10 16:00:24.536748] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:19.744 16:00:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:21:19.744 16:00:25 bdev_raid.raid_write_error_test -- common/autotest_common.sh@863 -- # return 0 00:21:19.744 16:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:19.744 16:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:20.004 BaseBdev1_malloc 00:21:20.004 16:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev1_malloc 00:21:20.004 true 00:21:20.004 16:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev1_malloc -p BaseBdev1 00:21:20.263 [2024-06-10 16:00:25.633919] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev1_malloc 00:21:20.263 [2024-06-10 16:00:25.633968] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:20.263 [2024-06-10 16:00:25.633985] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xccb150 00:21:20.263 [2024-06-10 16:00:25.633995] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:20.263 [2024-06-10 16:00:25.635722] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:20.263 [2024-06-10 16:00:25.635751] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:20.263 BaseBdev1 00:21:20.263 16:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:20.263 16:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:20.522 BaseBdev2_malloc 00:21:20.522 16:00:25 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev2_malloc 00:21:20.522 true 00:21:20.522 16:00:26 
bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev2_malloc -p BaseBdev2 00:21:20.779 [2024-06-10 16:00:26.240078] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev2_malloc 00:21:20.779 [2024-06-10 16:00:26.240123] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:20.779 [2024-06-10 16:00:26.240141] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xccfb50 00:21:20.779 [2024-06-10 16:00:26.240150] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:20.779 [2024-06-10 16:00:26.241666] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:20.779 [2024-06-10 16:00:26.241692] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:20.779 BaseBdev2 00:21:20.779 16:00:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:20.779 16:00:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:21:21.037 BaseBdev3_malloc 00:21:21.037 16:00:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev3_malloc 00:21:21.296 true 00:21:21.296 16:00:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev3_malloc -p BaseBdev3 00:21:21.555 [2024-06-10 16:00:26.834236] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev3_malloc 00:21:21.555 [2024-06-10 16:00:26.834277] vbdev_passthru.c: 
635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:21.555 [2024-06-10 16:00:26.834293] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcd0780 00:21:21.555 [2024-06-10 16:00:26.834303] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:21.555 [2024-06-10 16:00:26.835757] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:21.555 [2024-06-10 16:00:26.835782] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:21:21.555 BaseBdev3 00:21:21.555 16:00:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@812 -- # for bdev in "${base_bdevs[@]}" 00:21:21.555 16:00:26 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@813 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:21:21.555 BaseBdev4_malloc 00:21:21.555 16:00:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@814 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_create BaseBdev4_malloc 00:21:21.814 true 00:21:21.814 16:00:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@815 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b EE_BaseBdev4_malloc -p BaseBdev4 00:21:22.074 [2024-06-10 16:00:27.352020] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on EE_BaseBdev4_malloc 00:21:22.074 [2024-06-10 16:00:27.352056] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:22.074 [2024-06-10 16:00:27.352074] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xcc9ee0 00:21:22.074 [2024-06-10 16:00:27.352083] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:22.074 [2024-06-10 16:00:27.353548] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 
00:21:22.074 [2024-06-10 16:00:27.353573] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:21:22.074 BaseBdev4 00:21:22.074 16:00:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@819 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 -s 00:21:22.332 [2024-06-10 16:00:27.608726] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:22.332 [2024-06-10 16:00:27.609980] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:22.333 [2024-06-10 16:00:27.610048] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:21:22.333 [2024-06-10 16:00:27.610112] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:21:22.333 [2024-06-10 16:00:27.610345] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xcd35f0 00:21:22.333 [2024-06-10 16:00:27.610356] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:22.333 [2024-06-10 16:00:27.610530] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb1a510 00:21:22.333 [2024-06-10 16:00:27.610691] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xcd35f0 00:21:22.333 [2024-06-10 16:00:27.610700] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xcd35f0 00:21:22.333 [2024-06-10 16:00:27.610802] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:22.333 16:00:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@820 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:21:22.333 16:00:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:22.333 16:00:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- 
# local expected_state=online 00:21:22.333 16:00:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:22.333 16:00:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:22.333 16:00:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:21:22.333 16:00:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:22.333 16:00:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:22.333 16:00:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:22.333 16:00:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:22.333 16:00:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:22.333 16:00:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:22.591 16:00:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:22.591 "name": "raid_bdev1", 00:21:22.591 "uuid": "58f973da-35c5-4a9b-9e5a-4e725a0dee44", 00:21:22.591 "strip_size_kb": 0, 00:21:22.591 "state": "online", 00:21:22.591 "raid_level": "raid1", 00:21:22.591 "superblock": true, 00:21:22.591 "num_base_bdevs": 4, 00:21:22.591 "num_base_bdevs_discovered": 4, 00:21:22.591 "num_base_bdevs_operational": 4, 00:21:22.591 "base_bdevs_list": [ 00:21:22.591 { 00:21:22.591 "name": "BaseBdev1", 00:21:22.591 "uuid": "819409a4-698c-5490-bddb-680020138ae1", 00:21:22.591 "is_configured": true, 00:21:22.591 "data_offset": 2048, 00:21:22.591 "data_size": 63488 00:21:22.591 }, 00:21:22.591 { 00:21:22.591 "name": "BaseBdev2", 00:21:22.591 "uuid": "5899382b-d81c-5099-925b-e2bb45702db8", 00:21:22.591 "is_configured": true, 00:21:22.591 "data_offset": 
2048, 00:21:22.591 "data_size": 63488 00:21:22.591 }, 00:21:22.591 { 00:21:22.591 "name": "BaseBdev3", 00:21:22.591 "uuid": "4480180a-4421-56eb-92a1-dedd103d2221", 00:21:22.591 "is_configured": true, 00:21:22.591 "data_offset": 2048, 00:21:22.591 "data_size": 63488 00:21:22.591 }, 00:21:22.591 { 00:21:22.591 "name": "BaseBdev4", 00:21:22.591 "uuid": "6ffdf234-7d42-56e7-865a-1c9c80b42f19", 00:21:22.591 "is_configured": true, 00:21:22.591 "data_offset": 2048, 00:21:22.591 "data_size": 63488 00:21:22.591 } 00:21:22.591 ] 00:21:22.591 }' 00:21:22.591 16:00:27 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:22.591 16:00:27 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:23.158 16:00:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@824 -- # sleep 1 00:21:23.158 16:00:28 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@823 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:21:23.158 [2024-06-10 16:00:28.635739] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xb1a450 00:21:24.093 16:00:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@827 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_error_inject_error EE_BaseBdev1_malloc write failure 00:21:24.352 [2024-06-10 16:00:29.759200] bdev_raid.c:2221:_raid_bdev_fail_base_bdev: *NOTICE*: Failing base bdev in slot 0 ('BaseBdev1') of raid bdev 'raid_bdev1' 00:21:24.352 [2024-06-10 16:00:29.759260] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:24.352 [2024-06-10 16:00:29.759480] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xb1a450 00:21:24.352 16:00:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@829 -- # local expected_num_base_bdevs 00:21:24.352 16:00:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ raid1 = 
\r\a\i\d\1 ]] 00:21:24.352 16:00:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@830 -- # [[ write = \w\r\i\t\e ]] 00:21:24.352 16:00:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@831 -- # expected_num_base_bdevs=3 00:21:24.352 16:00:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@835 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:21:24.352 16:00:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:24.352 16:00:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:24.352 16:00:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:24.352 16:00:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:24.352 16:00:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:21:24.352 16:00:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:24.352 16:00:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:24.352 16:00:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:24.352 16:00:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:24.352 16:00:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:24.352 16:00:29 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:24.611 16:00:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:24.611 "name": "raid_bdev1", 00:21:24.611 "uuid": "58f973da-35c5-4a9b-9e5a-4e725a0dee44", 00:21:24.611 "strip_size_kb": 0, 00:21:24.611 "state": "online", 00:21:24.611 "raid_level": "raid1", 00:21:24.611 "superblock": 
true, 00:21:24.611 "num_base_bdevs": 4, 00:21:24.611 "num_base_bdevs_discovered": 3, 00:21:24.611 "num_base_bdevs_operational": 3, 00:21:24.611 "base_bdevs_list": [ 00:21:24.611 { 00:21:24.611 "name": null, 00:21:24.611 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:24.611 "is_configured": false, 00:21:24.611 "data_offset": 2048, 00:21:24.611 "data_size": 63488 00:21:24.611 }, 00:21:24.611 { 00:21:24.611 "name": "BaseBdev2", 00:21:24.611 "uuid": "5899382b-d81c-5099-925b-e2bb45702db8", 00:21:24.611 "is_configured": true, 00:21:24.611 "data_offset": 2048, 00:21:24.611 "data_size": 63488 00:21:24.611 }, 00:21:24.611 { 00:21:24.611 "name": "BaseBdev3", 00:21:24.611 "uuid": "4480180a-4421-56eb-92a1-dedd103d2221", 00:21:24.611 "is_configured": true, 00:21:24.611 "data_offset": 2048, 00:21:24.611 "data_size": 63488 00:21:24.611 }, 00:21:24.611 { 00:21:24.611 "name": "BaseBdev4", 00:21:24.611 "uuid": "6ffdf234-7d42-56e7-865a-1c9c80b42f19", 00:21:24.611 "is_configured": true, 00:21:24.611 "data_offset": 2048, 00:21:24.611 "data_size": 63488 00:21:24.611 } 00:21:24.611 ] 00:21:24.611 }' 00:21:24.611 16:00:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:24.611 16:00:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:25.179 16:00:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@837 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:25.443 [2024-06-10 16:00:30.914987] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:25.443 [2024-06-10 16:00:30.915027] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:25.443 [2024-06-10 16:00:30.918430] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:25.443 [2024-06-10 16:00:30.918466] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:25.443 [2024-06-10 
16:00:30.918570] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:25.443 [2024-06-10 16:00:30.918579] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xcd35f0 name raid_bdev1, state offline 00:21:25.443 0 00:21:25.443 16:00:30 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@839 -- # killprocess 2761533 00:21:25.444 16:00:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@949 -- # '[' -z 2761533 ']' 00:21:25.444 16:00:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@953 -- # kill -0 2761533 00:21:25.444 16:00:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # uname 00:21:25.444 16:00:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:21:25.444 16:00:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2761533 00:21:25.707 16:00:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:21:25.707 16:00:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:21:25.707 16:00:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2761533' 00:21:25.707 killing process with pid 2761533 00:21:25.707 16:00:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@968 -- # kill 2761533 00:21:25.707 [2024-06-10 16:00:30.977756] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:25.707 16:00:30 bdev_raid.raid_write_error_test -- common/autotest_common.sh@973 -- # wait 2761533 00:21:25.707 [2024-06-10 16:00:31.006148] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:21:25.707 16:00:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep -v Job /raidtest/tmp.Bt2MfLI1tk 00:21:25.707 16:00:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # grep 
raid_bdev1 00:21:25.707 16:00:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # awk '{print $6}' 00:21:25.966 16:00:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@843 -- # fail_per_s=0.00 00:21:25.966 16:00:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@844 -- # has_redundancy raid1 00:21:25.966 16:00:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@213 -- # case $1 in 00:21:25.966 16:00:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@214 -- # return 0 00:21:25.966 16:00:31 bdev_raid.raid_write_error_test -- bdev/bdev_raid.sh@845 -- # [[ 0.00 = \0\.\0\0 ]] 00:21:25.966 00:21:25.966 real 0m7.018s 00:21:25.966 user 0m11.383s 00:21:25.966 sys 0m0.982s 00:21:25.966 16:00:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:25.966 16:00:31 bdev_raid.raid_write_error_test -- common/autotest_common.sh@10 -- # set +x 00:21:25.966 ************************************ 00:21:25.966 END TEST raid_write_error_test 00:21:25.966 ************************************ 00:21:25.966 16:00:31 bdev_raid -- bdev/bdev_raid.sh@875 -- # '[' true = true ']' 00:21:25.966 16:00:31 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:21:25.966 16:00:31 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 2 false false true 00:21:25.966 16:00:31 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:21:25.966 16:00:31 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:21:25.966 16:00:31 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:25.966 ************************************ 00:21:25.966 START TEST raid_rebuild_test 00:21:25.966 ************************************ 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 false false true 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:21:25.966 16:00:31 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@576 -- # local create_arg 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 
00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=2762765 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 2762765 /var/tmp/spdk-raid.sock 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@830 -- # '[' -z 2762765 ']' 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:25.966 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:21:25.966 16:00:31 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:25.966 [2024-06-10 16:00:31.355404] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:21:25.966 [2024-06-10 16:00:31.355468] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2762765 ] 00:21:25.966 I/O size of 3145728 is greater than zero copy threshold (65536). 
00:21:25.966 Zero copy mechanism will not be used. 00:21:25.966 [2024-06-10 16:00:31.443621] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:26.225 [2024-06-10 16:00:31.538700] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:21:26.225 [2024-06-10 16:00:31.600846] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:26.225 [2024-06-10 16:00:31.600879] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:27.158 16:00:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:21:27.158 16:00:32 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@863 -- # return 0 00:21:27.158 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:27.158 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:27.158 BaseBdev1_malloc 00:21:27.158 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:27.418 [2024-06-10 16:00:32.801880] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:27.418 [2024-06-10 16:00:32.801932] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:27.418 [2024-06-10 16:00:32.801950] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cabe90 00:21:27.418 [2024-06-10 16:00:32.801966] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:27.418 [2024-06-10 16:00:32.803627] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:27.418 [2024-06-10 16:00:32.803654] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: 
BaseBdev1 00:21:27.418 BaseBdev1 00:21:27.418 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:27.418 16:00:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:27.676 BaseBdev2_malloc 00:21:27.677 16:00:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:27.934 [2024-06-10 16:00:33.319797] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:27.934 [2024-06-10 16:00:33.319836] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:27.934 [2024-06-10 16:00:33.319855] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1cac9e0 00:21:27.934 [2024-06-10 16:00:33.319869] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:27.934 [2024-06-10 16:00:33.321507] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:27.934 [2024-06-10 16:00:33.321534] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:27.934 BaseBdev2 00:21:27.934 16:00:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:21:28.193 spare_malloc 00:21:28.193 16:00:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:28.452 spare_delay 00:21:28.452 16:00:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:28.710 [2024-06-10 16:00:34.098109] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:28.710 [2024-06-10 16:00:34.098145] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:28.710 [2024-06-10 16:00:34.098160] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e5a890 00:21:28.710 [2024-06-10 16:00:34.098170] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:28.710 [2024-06-10 16:00:34.099622] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:28.710 [2024-06-10 16:00:34.099647] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:28.710 spare 00:21:28.710 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:21:28.969 [2024-06-10 16:00:34.350797] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:28.969 [2024-06-10 16:00:34.352033] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:28.969 [2024-06-10 16:00:34.352107] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1e59da0 00:21:28.969 [2024-06-10 16:00:34.352117] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:21:28.969 [2024-06-10 16:00:34.352312] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e56780 00:21:28.969 [2024-06-10 16:00:34.352450] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1e59da0 00:21:28.969 [2024-06-10 16:00:34.352459] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 
0x1e59da0 00:21:28.969 [2024-06-10 16:00:34.352567] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:28.969 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:28.969 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:28.969 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:28.969 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:28.969 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:28.969 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:28.969 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:28.969 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:28.969 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:28.969 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:28.969 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:28.969 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:29.234 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:29.234 "name": "raid_bdev1", 00:21:29.234 "uuid": "480213ce-495e-496b-93c6-dfa97b8720a2", 00:21:29.234 "strip_size_kb": 0, 00:21:29.234 "state": "online", 00:21:29.234 "raid_level": "raid1", 00:21:29.234 "superblock": false, 00:21:29.234 "num_base_bdevs": 2, 00:21:29.234 "num_base_bdevs_discovered": 2, 00:21:29.234 "num_base_bdevs_operational": 2, 00:21:29.234 "base_bdevs_list": [ 
00:21:29.234 { 00:21:29.234 "name": "BaseBdev1", 00:21:29.234 "uuid": "fe857a99-c08d-5855-9609-53144e41f365", 00:21:29.234 "is_configured": true, 00:21:29.234 "data_offset": 0, 00:21:29.234 "data_size": 65536 00:21:29.234 }, 00:21:29.234 { 00:21:29.234 "name": "BaseBdev2", 00:21:29.234 "uuid": "3382604f-4fbf-5631-92da-026eb4365e0c", 00:21:29.234 "is_configured": true, 00:21:29.234 "data_offset": 0, 00:21:29.234 "data_size": 65536 00:21:29.234 } 00:21:29.234 ] 00:21:29.234 }' 00:21:29.234 16:00:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:29.234 16:00:34 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:29.800 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:29.800 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:21:30.059 [2024-06-10 16:00:35.474036] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:30.059 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:21:30.059 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:30.059 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:30.318 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:21:30.318 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:21:30.318 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:21:30.318 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:21:30.318 16:00:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # 
nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:21:30.318 16:00:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:30.318 16:00:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:21:30.318 16:00:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:30.318 16:00:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:30.318 16:00:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:30.318 16:00:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:21:30.318 16:00:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:30.318 16:00:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:30.318 16:00:35 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:21:30.577 [2024-06-10 16:00:35.991220] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e5d8c0 00:21:30.577 /dev/nbd0 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@872 -- # break 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:30.577 1+0 records in 00:21:30.577 1+0 records out 00:21:30.577 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229351 s, 17.9 MB/s 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:21:30.577 16:00:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:21:35.861 65536+0 records in 00:21:35.861 65536+0 records out 00:21:35.861 33554432 bytes (34 MB, 32 MiB) copied, 5.11181 s, 6.6 MB/s 00:21:35.861 16:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 
00:21:35.861 16:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:35.861 16:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:35.861 16:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:35.861 16:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:21:35.861 16:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:35.861 16:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:35.861 16:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:35.861 [2024-06-10 16:00:41.355652] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:35.861 16:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:35.861 16:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:35.861 16:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:35.861 16:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:35.861 16:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:35.861 16:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:21:35.861 16:00:41 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:21:35.861 16:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:21:36.120 [2024-06-10 16:00:41.596363] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:36.120 16:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # 
verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:36.120 16:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:36.120 16:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:36.120 16:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:36.120 16:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:36.120 16:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:36.120 16:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:36.120 16:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:36.120 16:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:36.120 16:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:36.120 16:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:36.120 16:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:36.378 16:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:36.378 "name": "raid_bdev1", 00:21:36.378 "uuid": "480213ce-495e-496b-93c6-dfa97b8720a2", 00:21:36.378 "strip_size_kb": 0, 00:21:36.378 "state": "online", 00:21:36.378 "raid_level": "raid1", 00:21:36.378 "superblock": false, 00:21:36.378 "num_base_bdevs": 2, 00:21:36.378 "num_base_bdevs_discovered": 1, 00:21:36.378 "num_base_bdevs_operational": 1, 00:21:36.378 "base_bdevs_list": [ 00:21:36.378 { 00:21:36.378 "name": null, 00:21:36.378 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:36.378 "is_configured": false, 00:21:36.378 "data_offset": 0, 00:21:36.378 "data_size": 
65536 00:21:36.378 }, 00:21:36.378 { 00:21:36.378 "name": "BaseBdev2", 00:21:36.378 "uuid": "3382604f-4fbf-5631-92da-026eb4365e0c", 00:21:36.378 "is_configured": true, 00:21:36.378 "data_offset": 0, 00:21:36.378 "data_size": 65536 00:21:36.378 } 00:21:36.378 ] 00:21:36.378 }' 00:21:36.378 16:00:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:36.378 16:00:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:37.314 16:00:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:37.314 [2024-06-10 16:00:42.731426] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:37.314 [2024-06-10 16:00:42.736219] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e5ca90 00:21:37.314 [2024-06-10 16:00:42.738267] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:37.314 16:00:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:21:38.251 16:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:38.251 16:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:38.251 16:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:38.251 16:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:38.251 16:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:38.510 16:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:38.510 16:00:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:21:38.769 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:38.769 "name": "raid_bdev1", 00:21:38.769 "uuid": "480213ce-495e-496b-93c6-dfa97b8720a2", 00:21:38.769 "strip_size_kb": 0, 00:21:38.769 "state": "online", 00:21:38.769 "raid_level": "raid1", 00:21:38.769 "superblock": false, 00:21:38.769 "num_base_bdevs": 2, 00:21:38.769 "num_base_bdevs_discovered": 2, 00:21:38.769 "num_base_bdevs_operational": 2, 00:21:38.769 "process": { 00:21:38.769 "type": "rebuild", 00:21:38.769 "target": "spare", 00:21:38.769 "progress": { 00:21:38.769 "blocks": 24576, 00:21:38.769 "percent": 37 00:21:38.769 } 00:21:38.769 }, 00:21:38.769 "base_bdevs_list": [ 00:21:38.769 { 00:21:38.769 "name": "spare", 00:21:38.769 "uuid": "dbc02a06-264f-51c1-9d41-55247f396a1b", 00:21:38.769 "is_configured": true, 00:21:38.769 "data_offset": 0, 00:21:38.769 "data_size": 65536 00:21:38.769 }, 00:21:38.769 { 00:21:38.769 "name": "BaseBdev2", 00:21:38.769 "uuid": "3382604f-4fbf-5631-92da-026eb4365e0c", 00:21:38.769 "is_configured": true, 00:21:38.769 "data_offset": 0, 00:21:38.769 "data_size": 65536 00:21:38.769 } 00:21:38.769 ] 00:21:38.769 }' 00:21:38.769 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:38.769 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:38.769 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:38.769 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:38.769 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:21:39.028 [2024-06-10 16:00:44.353165] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:39.028 [2024-06-10 16:00:44.451276] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:21:39.028 [2024-06-10 16:00:44.451326] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:39.028 [2024-06-10 16:00:44.451340] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:21:39.028 [2024-06-10 16:00:44.451347] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:21:39.028 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:39.028 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:39.028 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:39.028 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:39.028 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:39.028 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:39.028 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:39.028 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:39.028 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:39.028 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:39.028 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:39.028 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.288 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:39.288 "name": 
"raid_bdev1", 00:21:39.288 "uuid": "480213ce-495e-496b-93c6-dfa97b8720a2", 00:21:39.288 "strip_size_kb": 0, 00:21:39.288 "state": "online", 00:21:39.288 "raid_level": "raid1", 00:21:39.288 "superblock": false, 00:21:39.288 "num_base_bdevs": 2, 00:21:39.288 "num_base_bdevs_discovered": 1, 00:21:39.288 "num_base_bdevs_operational": 1, 00:21:39.288 "base_bdevs_list": [ 00:21:39.288 { 00:21:39.288 "name": null, 00:21:39.288 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:39.288 "is_configured": false, 00:21:39.288 "data_offset": 0, 00:21:39.288 "data_size": 65536 00:21:39.288 }, 00:21:39.288 { 00:21:39.288 "name": "BaseBdev2", 00:21:39.288 "uuid": "3382604f-4fbf-5631-92da-026eb4365e0c", 00:21:39.288 "is_configured": true, 00:21:39.288 "data_offset": 0, 00:21:39.288 "data_size": 65536 00:21:39.288 } 00:21:39.288 ] 00:21:39.288 }' 00:21:39.288 16:00:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:39.288 16:00:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:39.852 16:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:39.852 16:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:39.852 16:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:39.852 16:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:39.852 16:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:39.852 16:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:39.852 16:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:40.109 16:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:21:40.109 "name": "raid_bdev1", 00:21:40.109 "uuid": "480213ce-495e-496b-93c6-dfa97b8720a2", 00:21:40.109 "strip_size_kb": 0, 00:21:40.109 "state": "online", 00:21:40.109 "raid_level": "raid1", 00:21:40.109 "superblock": false, 00:21:40.109 "num_base_bdevs": 2, 00:21:40.109 "num_base_bdevs_discovered": 1, 00:21:40.109 "num_base_bdevs_operational": 1, 00:21:40.109 "base_bdevs_list": [ 00:21:40.109 { 00:21:40.109 "name": null, 00:21:40.109 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:40.109 "is_configured": false, 00:21:40.109 "data_offset": 0, 00:21:40.109 "data_size": 65536 00:21:40.109 }, 00:21:40.109 { 00:21:40.109 "name": "BaseBdev2", 00:21:40.109 "uuid": "3382604f-4fbf-5631-92da-026eb4365e0c", 00:21:40.109 "is_configured": true, 00:21:40.109 "data_offset": 0, 00:21:40.109 "data_size": 65536 00:21:40.109 } 00:21:40.109 ] 00:21:40.109 }' 00:21:40.109 16:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:40.367 16:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:40.367 16:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:40.367 16:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:40.367 16:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:40.627 [2024-06-10 16:00:45.931634] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:40.627 [2024-06-10 16:00:45.936492] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1de31e0 00:21:40.627 [2024-06-10 16:00:45.938014] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:40.627 16:00:45 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:21:41.561 16:00:46 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:41.561 16:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:41.561 16:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:41.561 16:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:41.561 16:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:41.561 16:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.561 16:00:46 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:41.819 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:41.819 "name": "raid_bdev1", 00:21:41.819 "uuid": "480213ce-495e-496b-93c6-dfa97b8720a2", 00:21:41.819 "strip_size_kb": 0, 00:21:41.819 "state": "online", 00:21:41.819 "raid_level": "raid1", 00:21:41.819 "superblock": false, 00:21:41.819 "num_base_bdevs": 2, 00:21:41.819 "num_base_bdevs_discovered": 2, 00:21:41.819 "num_base_bdevs_operational": 2, 00:21:41.819 "process": { 00:21:41.819 "type": "rebuild", 00:21:41.819 "target": "spare", 00:21:41.819 "progress": { 00:21:41.819 "blocks": 24576, 00:21:41.819 "percent": 37 00:21:41.819 } 00:21:41.819 }, 00:21:41.819 "base_bdevs_list": [ 00:21:41.819 { 00:21:41.819 "name": "spare", 00:21:41.819 "uuid": "dbc02a06-264f-51c1-9d41-55247f396a1b", 00:21:41.819 "is_configured": true, 00:21:41.819 "data_offset": 0, 00:21:41.819 "data_size": 65536 00:21:41.819 }, 00:21:41.819 { 00:21:41.819 "name": "BaseBdev2", 00:21:41.819 "uuid": "3382604f-4fbf-5631-92da-026eb4365e0c", 00:21:41.819 "is_configured": true, 00:21:41.819 "data_offset": 0, 00:21:41.819 "data_size": 65536 00:21:41.819 } 00:21:41.819 ] 
00:21:41.819 }' 00:21:41.819 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:41.819 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:41.819 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:41.819 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:41.819 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:21:41.819 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:21:41.819 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:21:41.819 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:21:41.819 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=769 00:21:41.819 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:41.819 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:41.819 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:41.819 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:41.819 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:41.819 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:41.819 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:41.819 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:42.078 16:00:47 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:42.078 "name": "raid_bdev1", 00:21:42.078 "uuid": "480213ce-495e-496b-93c6-dfa97b8720a2", 00:21:42.078 "strip_size_kb": 0, 00:21:42.078 "state": "online", 00:21:42.078 "raid_level": "raid1", 00:21:42.078 "superblock": false, 00:21:42.078 "num_base_bdevs": 2, 00:21:42.078 "num_base_bdevs_discovered": 2, 00:21:42.078 "num_base_bdevs_operational": 2, 00:21:42.078 "process": { 00:21:42.078 "type": "rebuild", 00:21:42.078 "target": "spare", 00:21:42.078 "progress": { 00:21:42.078 "blocks": 30720, 00:21:42.078 "percent": 46 00:21:42.078 } 00:21:42.078 }, 00:21:42.078 "base_bdevs_list": [ 00:21:42.078 { 00:21:42.078 "name": "spare", 00:21:42.078 "uuid": "dbc02a06-264f-51c1-9d41-55247f396a1b", 00:21:42.078 "is_configured": true, 00:21:42.078 "data_offset": 0, 00:21:42.078 "data_size": 65536 00:21:42.078 }, 00:21:42.078 { 00:21:42.078 "name": "BaseBdev2", 00:21:42.078 "uuid": "3382604f-4fbf-5631-92da-026eb4365e0c", 00:21:42.078 "is_configured": true, 00:21:42.078 "data_offset": 0, 00:21:42.078 "data_size": 65536 00:21:42.078 } 00:21:42.078 ] 00:21:42.078 }' 00:21:42.078 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:42.336 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:21:42.336 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:42.336 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:42.336 16:00:47 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:43.270 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:43.270 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:43.270 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 
00:21:43.270 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:43.270 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:43.270 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:43.270 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:43.271 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:43.529 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:43.529 "name": "raid_bdev1", 00:21:43.529 "uuid": "480213ce-495e-496b-93c6-dfa97b8720a2", 00:21:43.529 "strip_size_kb": 0, 00:21:43.529 "state": "online", 00:21:43.529 "raid_level": "raid1", 00:21:43.529 "superblock": false, 00:21:43.529 "num_base_bdevs": 2, 00:21:43.529 "num_base_bdevs_discovered": 2, 00:21:43.529 "num_base_bdevs_operational": 2, 00:21:43.529 "process": { 00:21:43.529 "type": "rebuild", 00:21:43.529 "target": "spare", 00:21:43.529 "progress": { 00:21:43.529 "blocks": 59392, 00:21:43.529 "percent": 90 00:21:43.529 } 00:21:43.529 }, 00:21:43.529 "base_bdevs_list": [ 00:21:43.529 { 00:21:43.529 "name": "spare", 00:21:43.529 "uuid": "dbc02a06-264f-51c1-9d41-55247f396a1b", 00:21:43.529 "is_configured": true, 00:21:43.529 "data_offset": 0, 00:21:43.529 "data_size": 65536 00:21:43.529 }, 00:21:43.529 { 00:21:43.529 "name": "BaseBdev2", 00:21:43.529 "uuid": "3382604f-4fbf-5631-92da-026eb4365e0c", 00:21:43.529 "is_configured": true, 00:21:43.529 "data_offset": 0, 00:21:43.529 "data_size": 65536 00:21:43.529 } 00:21:43.529 ] 00:21:43.529 }' 00:21:43.529 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:43.529 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == 
\r\e\b\u\i\l\d ]] 00:21:43.529 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:43.529 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:21:43.529 16:00:48 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:21:43.787 [2024-06-10 16:00:49.162276] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:21:43.787 [2024-06-10 16:00:49.162333] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:21:43.787 [2024-06-10 16:00:49.162368] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:44.722 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:21:44.722 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:21:44.722 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:44.722 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:21:44.722 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:21:44.722 16:00:49 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:44.722 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.722 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:44.980 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:44.980 "name": "raid_bdev1", 00:21:44.980 "uuid": "480213ce-495e-496b-93c6-dfa97b8720a2", 00:21:44.980 "strip_size_kb": 0, 00:21:44.980 "state": "online", 00:21:44.980 "raid_level": "raid1", 00:21:44.980 
"superblock": false, 00:21:44.980 "num_base_bdevs": 2, 00:21:44.980 "num_base_bdevs_discovered": 2, 00:21:44.980 "num_base_bdevs_operational": 2, 00:21:44.980 "base_bdevs_list": [ 00:21:44.980 { 00:21:44.980 "name": "spare", 00:21:44.980 "uuid": "dbc02a06-264f-51c1-9d41-55247f396a1b", 00:21:44.980 "is_configured": true, 00:21:44.980 "data_offset": 0, 00:21:44.980 "data_size": 65536 00:21:44.980 }, 00:21:44.980 { 00:21:44.980 "name": "BaseBdev2", 00:21:44.980 "uuid": "3382604f-4fbf-5631-92da-026eb4365e0c", 00:21:44.980 "is_configured": true, 00:21:44.980 "data_offset": 0, 00:21:44.980 "data_size": 65536 00:21:44.980 } 00:21:44.980 ] 00:21:44.980 }' 00:21:44.980 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:44.980 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:21:44.980 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:44.980 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:21:44.980 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:21:44.980 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:21:44.980 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:21:44.980 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:21:44.980 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:21:44.980 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:21:44.980 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:44.980 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq 
-r '.[] | select(.name == "raid_bdev1")' 00:21:45.238 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:21:45.238 "name": "raid_bdev1", 00:21:45.238 "uuid": "480213ce-495e-496b-93c6-dfa97b8720a2", 00:21:45.238 "strip_size_kb": 0, 00:21:45.238 "state": "online", 00:21:45.238 "raid_level": "raid1", 00:21:45.238 "superblock": false, 00:21:45.238 "num_base_bdevs": 2, 00:21:45.238 "num_base_bdevs_discovered": 2, 00:21:45.238 "num_base_bdevs_operational": 2, 00:21:45.238 "base_bdevs_list": [ 00:21:45.238 { 00:21:45.238 "name": "spare", 00:21:45.238 "uuid": "dbc02a06-264f-51c1-9d41-55247f396a1b", 00:21:45.238 "is_configured": true, 00:21:45.238 "data_offset": 0, 00:21:45.238 "data_size": 65536 00:21:45.238 }, 00:21:45.238 { 00:21:45.238 "name": "BaseBdev2", 00:21:45.238 "uuid": "3382604f-4fbf-5631-92da-026eb4365e0c", 00:21:45.238 "is_configured": true, 00:21:45.238 "data_offset": 0, 00:21:45.238 "data_size": 65536 00:21:45.238 } 00:21:45.238 ] 00:21:45.238 }' 00:21:45.238 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:21:45.238 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:21:45.238 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:21:45.238 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:21:45.238 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:45.238 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:45.238 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:45.238 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:45.238 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:21:45.238 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:45.238 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:45.238 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:45.238 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:45.238 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:45.238 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:45.238 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:45.496 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:45.496 "name": "raid_bdev1", 00:21:45.496 "uuid": "480213ce-495e-496b-93c6-dfa97b8720a2", 00:21:45.496 "strip_size_kb": 0, 00:21:45.496 "state": "online", 00:21:45.496 "raid_level": "raid1", 00:21:45.496 "superblock": false, 00:21:45.496 "num_base_bdevs": 2, 00:21:45.496 "num_base_bdevs_discovered": 2, 00:21:45.496 "num_base_bdevs_operational": 2, 00:21:45.496 "base_bdevs_list": [ 00:21:45.496 { 00:21:45.496 "name": "spare", 00:21:45.496 "uuid": "dbc02a06-264f-51c1-9d41-55247f396a1b", 00:21:45.496 "is_configured": true, 00:21:45.496 "data_offset": 0, 00:21:45.496 "data_size": 65536 00:21:45.496 }, 00:21:45.496 { 00:21:45.496 "name": "BaseBdev2", 00:21:45.496 "uuid": "3382604f-4fbf-5631-92da-026eb4365e0c", 00:21:45.496 "is_configured": true, 00:21:45.496 "data_offset": 0, 00:21:45.496 "data_size": 65536 00:21:45.496 } 00:21:45.496 ] 00:21:45.496 }' 00:21:45.496 16:00:50 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:45.496 16:00:50 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 
00:21:46.431 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:21:46.431 [2024-06-10 16:00:51.825861] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:21:46.431 [2024-06-10 16:00:51.825886] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:21:46.431 [2024-06-10 16:00:51.825944] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:21:46.431 [2024-06-10 16:00:51.826005] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:21:46.431 [2024-06-10 16:00:51.826015] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1e59da0 name raid_bdev1, state offline 00:21:46.431 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:46.431 16:00:51 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:21:46.690 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:21:46.690 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:21:46.690 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:21:46.690 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:21:46.690 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:46.690 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:21:46.690 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:46.690 16:00:52 bdev_raid.raid_rebuild_test -- 
bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:46.690 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:46.690 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:21:46.691 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:46.691 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:46.691 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:21:46.950 /dev/nbd0 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:46.950 1+0 records in 00:21:46.950 1+0 records out 00:21:46.950 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000220506 s, 
18.6 MB/s 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:46.950 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:21:47.210 /dev/nbd1 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:21:47.210 16:00:52 
bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:47.210 1+0 records in 00:21:47.210 1+0 records out 00:21:47.210 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000269978 s, 15.2 MB/s 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 
00:21:47.210 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:47.469 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:47.469 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:47.469 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:47.469 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:47.469 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:47.469 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:47.469 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:21:47.469 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:21:47.469 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:47.469 16:00:52 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:21:47.729 16:00:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:21:47.729 16:00:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:21:47.729 16:00:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:21:47.729 16:00:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:47.729 16:00:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:47.729 16:00:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:21:47.729 16:00:53 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:21:47.729 16:00:53 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:21:47.729 16:00:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:21:47.730 16:00:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 2762765 00:21:47.730 16:00:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@949 -- # '[' -z 2762765 ']' 00:21:47.730 16:00:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # kill -0 2762765 00:21:47.730 16:00:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # uname 00:21:47.730 16:00:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:21:47.730 16:00:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2762765 00:21:47.730 16:00:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:21:47.730 16:00:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:21:47.730 16:00:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2762765' 00:21:47.730 killing process with pid 2762765 00:21:47.730 16:00:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # kill 2762765 00:21:47.730 Received shutdown signal, test time was about 60.000000 seconds 00:21:47.730 00:21:47.730 Latency(us) 00:21:47.730 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:21:47.730 =================================================================================================================== 00:21:47.730 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:21:47.730 [2024-06-10 16:00:53.091877] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:21:47.730 16:00:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@973 -- # wait 2762765 00:21:47.730 [2024-06-10 16:00:53.116023] bdev_raid.c:1375:raid_bdev_exit: 
*DEBUG*: raid_bdev_exit 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:21:47.989 00:21:47.989 real 0m22.021s 00:21:47.989 user 0m30.871s 00:21:47.989 sys 0m3.809s 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:21:47.989 ************************************ 00:21:47.989 END TEST raid_rebuild_test 00:21:47.989 ************************************ 00:21:47.989 16:00:53 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 2 true false true 00:21:47.989 16:00:53 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:21:47.989 16:00:53 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:21:47.989 16:00:53 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:21:47.989 ************************************ 00:21:47.989 START TEST raid_rebuild_test_sb 00:21:47.989 ************************************ 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false true 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=2766629 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 2766629 /var/tmp/spdk-raid.sock 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 
-- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@830 -- # '[' -z 2766629 ']' 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:21:47.989 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:21:47.989 16:00:53 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:47.989 [2024-06-10 16:00:53.443735] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:21:47.989 [2024-06-10 16:00:53.443790] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2766629 ] 00:21:47.989 I/O size of 3145728 is greater than zero copy threshold (65536). 00:21:47.989 Zero copy mechanism will not be used. 
00:21:48.249 [2024-06-10 16:00:53.543528] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:21:48.249 [2024-06-10 16:00:53.637948] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:21:48.249 [2024-06-10 16:00:53.699520] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:48.249 [2024-06-10 16:00:53.699557] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:21:49.191 16:00:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:21:49.192 16:00:54 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@863 -- # return 0 00:21:49.192 16:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:49.192 16:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:21:49.192 BaseBdev1_malloc 00:21:49.192 16:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:21:49.450 [2024-06-10 16:00:54.893197] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:21:49.450 [2024-06-10 16:00:54.893242] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:49.450 [2024-06-10 16:00:54.893262] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x217de90 00:21:49.450 [2024-06-10 16:00:54.893271] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:49.450 [2024-06-10 16:00:54.894993] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:49.450 [2024-06-10 16:00:54.895020] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:21:49.450 BaseBdev1 
00:21:49.450 16:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:21:49.450 16:00:54 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:21:49.709 BaseBdev2_malloc 00:21:49.709 16:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:21:49.968 [2024-06-10 16:00:55.387242] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:21:49.968 [2024-06-10 16:00:55.387281] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:49.968 [2024-06-10 16:00:55.387299] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x217e9e0 00:21:49.968 [2024-06-10 16:00:55.387309] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:49.968 [2024-06-10 16:00:55.388816] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:49.968 [2024-06-10 16:00:55.388845] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:21:49.968 BaseBdev2 00:21:49.968 16:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:21:50.227 spare_malloc 00:21:50.227 16:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:21:50.486 spare_delay 00:21:50.486 16:00:55 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:21:50.745 [2024-06-10 16:00:56.141730] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:21:50.745 [2024-06-10 16:00:56.141770] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:21:50.745 [2024-06-10 16:00:56.141786] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x232c890 00:21:50.745 [2024-06-10 16:00:56.141795] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:21:50.745 [2024-06-10 16:00:56.143340] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:21:50.745 [2024-06-10 16:00:56.143365] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:21:50.745 spare 00:21:50.745 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:21:51.004 [2024-06-10 16:00:56.394426] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:21:51.004 [2024-06-10 16:00:56.395769] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:21:51.004 [2024-06-10 16:00:56.395927] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x232bda0 00:21:51.004 [2024-06-10 16:00:56.395939] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:21:51.004 [2024-06-10 16:00:56.396143] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x217c970 00:21:51.004 [2024-06-10 16:00:56.396287] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x232bda0 00:21:51.004 [2024-06-10 16:00:56.396296] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x232bda0 00:21:51.004 [2024-06-10 16:00:56.396393] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:51.004 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:21:51.004 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:51.004 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:51.004 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:51.004 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:51.004 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:21:51.004 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:51.004 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:51.004 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:51.004 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:51.004 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:51.004 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:51.300 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:51.300 "name": "raid_bdev1", 00:21:51.300 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:21:51.300 "strip_size_kb": 0, 00:21:51.300 "state": "online", 00:21:51.300 "raid_level": "raid1", 00:21:51.300 "superblock": true, 00:21:51.300 "num_base_bdevs": 2, 00:21:51.300 "num_base_bdevs_discovered": 2, 00:21:51.300 
"num_base_bdevs_operational": 2, 00:21:51.300 "base_bdevs_list": [ 00:21:51.300 { 00:21:51.300 "name": "BaseBdev1", 00:21:51.300 "uuid": "3a2a5d09-c0e3-5069-b07c-936fb804daed", 00:21:51.300 "is_configured": true, 00:21:51.300 "data_offset": 2048, 00:21:51.300 "data_size": 63488 00:21:51.300 }, 00:21:51.300 { 00:21:51.300 "name": "BaseBdev2", 00:21:51.300 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:21:51.300 "is_configured": true, 00:21:51.300 "data_offset": 2048, 00:21:51.300 "data_size": 63488 00:21:51.300 } 00:21:51.300 ] 00:21:51.300 }' 00:21:51.300 16:00:56 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:51.300 16:00:56 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:51.866 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:21:51.866 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:21:52.125 [2024-06-10 16:00:57.497601] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:21:52.125 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:21:52.125 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:52.125 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:21:52.383 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:21:52.383 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:21:52.383 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:21:52.383 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local 
write_unit_size 00:21:52.383 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:21:52.383 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:52.383 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:21:52.384 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:21:52.384 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:21:52.384 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:21:52.384 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:21:52.384 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:21:52.384 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:52.384 16:00:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:21:52.642 [2024-06-10 16:00:58.002771] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x232e930 00:21:52.642 /dev/nbd0 00:21:52.642 16:00:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:21:52.642 16:00:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:21:52.642 16:00:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:21:52.642 16:00:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:21:52.642 16:00:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:21:52.642 16:00:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:21:52.642 16:00:58 bdev_raid.raid_rebuild_test_sb 
-- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:21:52.642 16:00:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:21:52.642 16:00:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:21:52.642 16:00:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:21:52.642 16:00:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:21:52.642 1+0 records in 00:21:52.642 1+0 records out 00:21:52.642 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000225183 s, 18.2 MB/s 00:21:52.642 16:00:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:52.642 16:00:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:21:52.642 16:00:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:21:52.643 16:00:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:21:52.643 16:00:58 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:21:52.643 16:00:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:21:52.643 16:00:58 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:21:52.643 16:00:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:21:52.643 16:00:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:21:52.643 16:00:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:21:57.924 63488+0 records in 00:21:57.924 63488+0 records out 00:21:57.924 32505856 bytes (33 MB, 
31 MiB) copied, 4.99126 s, 6.5 MB/s 00:21:57.924 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:21:57.924 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:21:57.924 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:21:57.924 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:21:57.924 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:21:57.924 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:21:57.924 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:21:57.924 [2024-06-10 16:01:03.237819] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:21:57.924 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:21:57.924 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:21:57.924 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:21:57.924 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:21:57.924 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:21:57.924 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:21:57.924 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:21:57.924 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:21:57.924 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev BaseBdev1 00:21:58.184 [2024-06-10 16:01:03.478507] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:21:58.184 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:21:58.184 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:21:58.184 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:21:58.184 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:21:58.184 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:21:58.184 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:21:58.184 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:21:58.184 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:21:58.184 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:21:58.184 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:21:58.184 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:21:58.184 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:21:58.442 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:21:58.442 "name": "raid_bdev1", 00:21:58.442 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:21:58.442 "strip_size_kb": 0, 00:21:58.442 "state": "online", 00:21:58.442 "raid_level": "raid1", 00:21:58.442 "superblock": true, 00:21:58.442 "num_base_bdevs": 2, 00:21:58.442 "num_base_bdevs_discovered": 1, 00:21:58.442 
"num_base_bdevs_operational": 1, 00:21:58.442 "base_bdevs_list": [ 00:21:58.442 { 00:21:58.442 "name": null, 00:21:58.442 "uuid": "00000000-0000-0000-0000-000000000000", 00:21:58.442 "is_configured": false, 00:21:58.442 "data_offset": 2048, 00:21:58.442 "data_size": 63488 00:21:58.443 }, 00:21:58.443 { 00:21:58.443 "name": "BaseBdev2", 00:21:58.443 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:21:58.443 "is_configured": true, 00:21:58.443 "data_offset": 2048, 00:21:58.443 "data_size": 63488 00:21:58.443 } 00:21:58.443 ] 00:21:58.443 }' 00:21:58.443 16:01:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:21:58.443 16:01:03 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:21:59.010 16:01:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:21:59.268 [2024-06-10 16:01:04.557411] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:21:59.268 [2024-06-10 16:01:04.562211] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x217d570 00:21:59.268 [2024-06-10 16:01:04.564270] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:21:59.268 16:01:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:00.206 16:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:00.206 16:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:00.206 16:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:00.206 16:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:00.206 16:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 
00:22:00.206 16:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.206 16:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:00.464 16:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:00.464 "name": "raid_bdev1", 00:22:00.464 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:00.464 "strip_size_kb": 0, 00:22:00.464 "state": "online", 00:22:00.464 "raid_level": "raid1", 00:22:00.464 "superblock": true, 00:22:00.464 "num_base_bdevs": 2, 00:22:00.464 "num_base_bdevs_discovered": 2, 00:22:00.464 "num_base_bdevs_operational": 2, 00:22:00.464 "process": { 00:22:00.464 "type": "rebuild", 00:22:00.464 "target": "spare", 00:22:00.464 "progress": { 00:22:00.464 "blocks": 24576, 00:22:00.464 "percent": 38 00:22:00.465 } 00:22:00.465 }, 00:22:00.465 "base_bdevs_list": [ 00:22:00.465 { 00:22:00.465 "name": "spare", 00:22:00.465 "uuid": "93758833-785c-52c7-a815-a784a1eda059", 00:22:00.465 "is_configured": true, 00:22:00.465 "data_offset": 2048, 00:22:00.465 "data_size": 63488 00:22:00.465 }, 00:22:00.465 { 00:22:00.465 "name": "BaseBdev2", 00:22:00.465 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:00.465 "is_configured": true, 00:22:00.465 "data_offset": 2048, 00:22:00.465 "data_size": 63488 00:22:00.465 } 00:22:00.465 ] 00:22:00.465 }' 00:22:00.465 16:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:00.465 16:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:00.465 16:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:00.465 16:01:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:00.465 16:01:05 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:00.723 [2024-06-10 16:01:06.170686] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:00.723 [2024-06-10 16:01:06.176509] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:00.723 [2024-06-10 16:01:06.176554] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:00.723 [2024-06-10 16:01:06.176568] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:00.723 [2024-06-10 16:01:06.176574] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:00.723 16:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:00.723 16:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:00.723 16:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:00.723 16:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:00.723 16:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:00.723 16:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:00.723 16:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:00.723 16:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:00.723 16:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:00.723 16:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:00.723 16:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:00.723 16:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:00.981 16:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:00.981 "name": "raid_bdev1", 00:22:00.981 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:00.981 "strip_size_kb": 0, 00:22:00.981 "state": "online", 00:22:00.981 "raid_level": "raid1", 00:22:00.981 "superblock": true, 00:22:00.981 "num_base_bdevs": 2, 00:22:00.981 "num_base_bdevs_discovered": 1, 00:22:00.981 "num_base_bdevs_operational": 1, 00:22:00.981 "base_bdevs_list": [ 00:22:00.981 { 00:22:00.981 "name": null, 00:22:00.981 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:00.981 "is_configured": false, 00:22:00.981 "data_offset": 2048, 00:22:00.981 "data_size": 63488 00:22:00.981 }, 00:22:00.981 { 00:22:00.981 "name": "BaseBdev2", 00:22:00.981 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:00.981 "is_configured": true, 00:22:00.981 "data_offset": 2048, 00:22:00.981 "data_size": 63488 00:22:00.981 } 00:22:00.981 ] 00:22:00.981 }' 00:22:00.981 16:01:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:00.981 16:01:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:01.916 16:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:01.916 16:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:01.916 16:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:01.916 16:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:01.916 16:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:01.916 16:01:07 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:01.916 16:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:01.916 16:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:01.916 "name": "raid_bdev1", 00:22:01.916 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:01.916 "strip_size_kb": 0, 00:22:01.916 "state": "online", 00:22:01.916 "raid_level": "raid1", 00:22:01.916 "superblock": true, 00:22:01.916 "num_base_bdevs": 2, 00:22:01.916 "num_base_bdevs_discovered": 1, 00:22:01.916 "num_base_bdevs_operational": 1, 00:22:01.916 "base_bdevs_list": [ 00:22:01.916 { 00:22:01.916 "name": null, 00:22:01.916 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:01.916 "is_configured": false, 00:22:01.916 "data_offset": 2048, 00:22:01.916 "data_size": 63488 00:22:01.916 }, 00:22:01.916 { 00:22:01.916 "name": "BaseBdev2", 00:22:01.916 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:01.916 "is_configured": true, 00:22:01.916 "data_offset": 2048, 00:22:01.916 "data_size": 63488 00:22:01.916 } 00:22:01.916 ] 00:22:01.916 }' 00:22:01.916 16:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:01.916 16:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:01.916 16:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:02.175 16:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:02.175 16:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:02.175 [2024-06-10 16:01:07.660951] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:02.175 [2024-06-10 16:01:07.665737] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x232ceb0 00:22:02.175 [2024-06-10 16:01:07.667246] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:02.175 16:01:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:03.550 16:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:03.550 16:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:03.550 16:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:03.550 16:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:03.550 16:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:03.550 16:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:03.550 16:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:03.550 16:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:03.550 "name": "raid_bdev1", 00:22:03.550 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:03.550 "strip_size_kb": 0, 00:22:03.550 "state": "online", 00:22:03.550 "raid_level": "raid1", 00:22:03.550 "superblock": true, 00:22:03.550 "num_base_bdevs": 2, 00:22:03.550 "num_base_bdevs_discovered": 2, 00:22:03.550 "num_base_bdevs_operational": 2, 00:22:03.550 "process": { 00:22:03.550 "type": "rebuild", 00:22:03.550 "target": "spare", 00:22:03.550 "progress": { 00:22:03.550 "blocks": 24576, 00:22:03.550 "percent": 38 00:22:03.550 } 00:22:03.550 }, 00:22:03.550 
"base_bdevs_list": [ 00:22:03.550 { 00:22:03.550 "name": "spare", 00:22:03.550 "uuid": "93758833-785c-52c7-a815-a784a1eda059", 00:22:03.550 "is_configured": true, 00:22:03.550 "data_offset": 2048, 00:22:03.550 "data_size": 63488 00:22:03.550 }, 00:22:03.550 { 00:22:03.551 "name": "BaseBdev2", 00:22:03.551 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:03.551 "is_configured": true, 00:22:03.551 "data_offset": 2048, 00:22:03.551 "data_size": 63488 00:22:03.551 } 00:22:03.551 ] 00:22:03.551 }' 00:22:03.551 16:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:03.551 16:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:03.551 16:01:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:03.551 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:03.551 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:22:03.551 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:22:03.551 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:22:03.551 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:03.551 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:03.551 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:22:03.551 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=791 00:22:03.551 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:03.551 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:03.551 16:01:09 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:03.551 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:03.551 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:03.551 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:03.551 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:03.551 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:03.809 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:03.809 "name": "raid_bdev1", 00:22:03.809 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:03.809 "strip_size_kb": 0, 00:22:03.809 "state": "online", 00:22:03.809 "raid_level": "raid1", 00:22:03.809 "superblock": true, 00:22:03.809 "num_base_bdevs": 2, 00:22:03.809 "num_base_bdevs_discovered": 2, 00:22:03.809 "num_base_bdevs_operational": 2, 00:22:03.809 "process": { 00:22:03.809 "type": "rebuild", 00:22:03.809 "target": "spare", 00:22:03.809 "progress": { 00:22:03.809 "blocks": 30720, 00:22:03.809 "percent": 48 00:22:03.809 } 00:22:03.809 }, 00:22:03.809 "base_bdevs_list": [ 00:22:03.809 { 00:22:03.809 "name": "spare", 00:22:03.809 "uuid": "93758833-785c-52c7-a815-a784a1eda059", 00:22:03.809 "is_configured": true, 00:22:03.809 "data_offset": 2048, 00:22:03.809 "data_size": 63488 00:22:03.809 }, 00:22:03.809 { 00:22:03.809 "name": "BaseBdev2", 00:22:03.809 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:03.809 "is_configured": true, 00:22:03.809 "data_offset": 2048, 00:22:03.809 "data_size": 63488 00:22:03.809 } 00:22:03.809 ] 00:22:03.809 }' 00:22:03.809 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r 
'.process.type // "none"' 00:22:03.809 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:03.809 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:04.067 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:04.067 16:01:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:05.003 16:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:05.003 16:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:05.003 16:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:05.003 16:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:05.003 16:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:05.003 16:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:05.003 16:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:05.003 16:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:05.262 16:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:05.262 "name": "raid_bdev1", 00:22:05.262 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:05.262 "strip_size_kb": 0, 00:22:05.262 "state": "online", 00:22:05.262 "raid_level": "raid1", 00:22:05.262 "superblock": true, 00:22:05.262 "num_base_bdevs": 2, 00:22:05.262 "num_base_bdevs_discovered": 2, 00:22:05.262 "num_base_bdevs_operational": 2, 00:22:05.262 "process": { 00:22:05.262 "type": "rebuild", 00:22:05.262 "target": "spare", 
00:22:05.262 "progress": { 00:22:05.262 "blocks": 57344, 00:22:05.262 "percent": 90 00:22:05.262 } 00:22:05.262 }, 00:22:05.262 "base_bdevs_list": [ 00:22:05.262 { 00:22:05.262 "name": "spare", 00:22:05.262 "uuid": "93758833-785c-52c7-a815-a784a1eda059", 00:22:05.262 "is_configured": true, 00:22:05.262 "data_offset": 2048, 00:22:05.262 "data_size": 63488 00:22:05.262 }, 00:22:05.262 { 00:22:05.262 "name": "BaseBdev2", 00:22:05.262 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:05.262 "is_configured": true, 00:22:05.262 "data_offset": 2048, 00:22:05.262 "data_size": 63488 00:22:05.262 } 00:22:05.262 ] 00:22:05.262 }' 00:22:05.262 16:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:05.262 16:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:05.262 16:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:05.262 16:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:05.262 16:01:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:05.521 [2024-06-10 16:01:10.790721] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:05.521 [2024-06-10 16:01:10.790778] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:05.521 [2024-06-10 16:01:10.790855] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:06.457 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:06.457 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:06.457 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:06.457 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # 
local process_type=rebuild 00:22:06.457 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:06.457 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:06.457 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.457 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:06.457 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:06.457 "name": "raid_bdev1", 00:22:06.457 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:06.457 "strip_size_kb": 0, 00:22:06.457 "state": "online", 00:22:06.457 "raid_level": "raid1", 00:22:06.457 "superblock": true, 00:22:06.457 "num_base_bdevs": 2, 00:22:06.457 "num_base_bdevs_discovered": 2, 00:22:06.457 "num_base_bdevs_operational": 2, 00:22:06.457 "base_bdevs_list": [ 00:22:06.457 { 00:22:06.457 "name": "spare", 00:22:06.457 "uuid": "93758833-785c-52c7-a815-a784a1eda059", 00:22:06.457 "is_configured": true, 00:22:06.457 "data_offset": 2048, 00:22:06.457 "data_size": 63488 00:22:06.457 }, 00:22:06.457 { 00:22:06.457 "name": "BaseBdev2", 00:22:06.457 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:06.457 "is_configured": true, 00:22:06.457 "data_offset": 2048, 00:22:06.457 "data_size": 63488 00:22:06.457 } 00:22:06.457 ] 00:22:06.457 }' 00:22:06.457 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:06.716 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:06.716 16:01:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:06.716 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:06.716 
16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:22:06.716 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:06.716 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:06.716 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:06.716 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:06.716 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:06.716 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:06.716 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:06.975 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:06.975 "name": "raid_bdev1", 00:22:06.975 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:06.975 "strip_size_kb": 0, 00:22:06.975 "state": "online", 00:22:06.975 "raid_level": "raid1", 00:22:06.975 "superblock": true, 00:22:06.975 "num_base_bdevs": 2, 00:22:06.975 "num_base_bdevs_discovered": 2, 00:22:06.975 "num_base_bdevs_operational": 2, 00:22:06.975 "base_bdevs_list": [ 00:22:06.975 { 00:22:06.975 "name": "spare", 00:22:06.975 "uuid": "93758833-785c-52c7-a815-a784a1eda059", 00:22:06.975 "is_configured": true, 00:22:06.975 "data_offset": 2048, 00:22:06.975 "data_size": 63488 00:22:06.975 }, 00:22:06.975 { 00:22:06.975 "name": "BaseBdev2", 00:22:06.975 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:06.975 "is_configured": true, 00:22:06.975 "data_offset": 2048, 00:22:06.975 "data_size": 63488 00:22:06.975 } 00:22:06.975 ] 00:22:06.975 }' 00:22:06.975 16:01:12 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:06.975 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:06.975 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:06.975 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:06.975 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:06.975 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:06.975 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:06.975 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:06.975 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:06.975 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:06.975 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:06.975 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:06.975 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:06.975 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:06.975 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:06.975 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:07.233 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:07.233 "name": "raid_bdev1", 00:22:07.233 "uuid": 
"714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:07.233 "strip_size_kb": 0, 00:22:07.233 "state": "online", 00:22:07.233 "raid_level": "raid1", 00:22:07.233 "superblock": true, 00:22:07.233 "num_base_bdevs": 2, 00:22:07.233 "num_base_bdevs_discovered": 2, 00:22:07.233 "num_base_bdevs_operational": 2, 00:22:07.233 "base_bdevs_list": [ 00:22:07.233 { 00:22:07.233 "name": "spare", 00:22:07.233 "uuid": "93758833-785c-52c7-a815-a784a1eda059", 00:22:07.233 "is_configured": true, 00:22:07.233 "data_offset": 2048, 00:22:07.233 "data_size": 63488 00:22:07.233 }, 00:22:07.233 { 00:22:07.233 "name": "BaseBdev2", 00:22:07.233 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:07.233 "is_configured": true, 00:22:07.233 "data_offset": 2048, 00:22:07.233 "data_size": 63488 00:22:07.233 } 00:22:07.233 ] 00:22:07.233 }' 00:22:07.233 16:01:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:07.233 16:01:12 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:07.800 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:08.059 [2024-06-10 16:01:13.430433] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:08.059 [2024-06-10 16:01:13.430456] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:08.059 [2024-06-10 16:01:13.430513] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:08.059 [2024-06-10 16:01:13.430569] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:08.059 [2024-06-10 16:01:13.430579] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x232bda0 name raid_bdev1, state offline 00:22:08.059 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:08.059 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:22:08.317 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:08.317 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:08.317 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:22:08.317 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:22:08.317 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:08.317 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:22:08.317 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:08.317 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:08.317 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:08.317 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:22:08.317 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:08.317 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:08.317 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:22:08.576 /dev/nbd0 00:22:08.576 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:08.576 16:01:13 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:08.576 16:01:13 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:22:08.576 16:01:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:22:08.576 16:01:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:22:08.576 16:01:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:22:08.576 16:01:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:22:08.576 16:01:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:22:08.576 16:01:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:22:08.576 16:01:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:22:08.576 16:01:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:08.576 1+0 records in 00:22:08.576 1+0 records out 00:22:08.576 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239542 s, 17.1 MB/s 00:22:08.576 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:08.576 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:22:08.576 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:08.576 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:22:08.576 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:22:08.576 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:08.576 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:08.576 16:01:14 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:22:08.834 /dev/nbd1 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:08.834 1+0 records in 00:22:08.834 1+0 records out 00:22:08.834 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000232267 s, 17.6 MB/s 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:22:08.834 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:22:09.091 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:22:09.091 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:09.091 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:22:09.092 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:09.092 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:22:09.092 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:09.092 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:09.351 16:01:14 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:22:09.351 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:09.610 16:01:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:09.870 [2024-06-10 16:01:15.231364] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: 
Match on spare_delay 00:22:09.870 [2024-06-10 16:01:15.231412] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:09.870 [2024-06-10 16:01:15.231428] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2329700 00:22:09.870 [2024-06-10 16:01:15.231439] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:09.870 [2024-06-10 16:01:15.233167] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:09.870 [2024-06-10 16:01:15.233193] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:09.870 [2024-06-10 16:01:15.233268] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:09.870 [2024-06-10 16:01:15.233293] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:09.870 [2024-06-10 16:01:15.233397] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:09.870 spare 00:22:09.870 16:01:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:09.870 16:01:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:09.870 16:01:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:09.870 16:01:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:09.870 16:01:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:09.870 16:01:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:09.870 16:01:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:09.870 16:01:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:09.870 16:01:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:22:09.870 16:01:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:09.870 16:01:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:09.870 16:01:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:09.870 [2024-06-10 16:01:15.333710] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x232d500 00:22:09.870 [2024-06-10 16:01:15.333723] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:09.870 [2024-06-10 16:01:15.333918] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2328040 00:22:09.870 [2024-06-10 16:01:15.334078] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x232d500 00:22:09.870 [2024-06-10 16:01:15.334088] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x232d500 00:22:09.870 [2024-06-10 16:01:15.334194] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:10.129 16:01:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:10.129 "name": "raid_bdev1", 00:22:10.129 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:10.129 "strip_size_kb": 0, 00:22:10.129 "state": "online", 00:22:10.129 "raid_level": "raid1", 00:22:10.129 "superblock": true, 00:22:10.129 "num_base_bdevs": 2, 00:22:10.129 "num_base_bdevs_discovered": 2, 00:22:10.129 "num_base_bdevs_operational": 2, 00:22:10.129 "base_bdevs_list": [ 00:22:10.129 { 00:22:10.129 "name": "spare", 00:22:10.129 "uuid": "93758833-785c-52c7-a815-a784a1eda059", 00:22:10.129 "is_configured": true, 00:22:10.129 "data_offset": 2048, 00:22:10.129 "data_size": 63488 00:22:10.129 }, 00:22:10.129 { 00:22:10.129 "name": "BaseBdev2", 00:22:10.129 "uuid": 
"8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:10.129 "is_configured": true, 00:22:10.129 "data_offset": 2048, 00:22:10.129 "data_size": 63488 00:22:10.129 } 00:22:10.129 ] 00:22:10.129 }' 00:22:10.129 16:01:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:10.129 16:01:15 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:10.696 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:10.696 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:10.696 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:10.696 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:10.696 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:10.696 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.696 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:10.954 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:10.955 "name": "raid_bdev1", 00:22:10.955 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:10.955 "strip_size_kb": 0, 00:22:10.955 "state": "online", 00:22:10.955 "raid_level": "raid1", 00:22:10.955 "superblock": true, 00:22:10.955 "num_base_bdevs": 2, 00:22:10.955 "num_base_bdevs_discovered": 2, 00:22:10.955 "num_base_bdevs_operational": 2, 00:22:10.955 "base_bdevs_list": [ 00:22:10.955 { 00:22:10.955 "name": "spare", 00:22:10.955 "uuid": "93758833-785c-52c7-a815-a784a1eda059", 00:22:10.955 "is_configured": true, 00:22:10.955 "data_offset": 2048, 00:22:10.955 "data_size": 63488 00:22:10.955 }, 00:22:10.955 { 
00:22:10.955 "name": "BaseBdev2", 00:22:10.955 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:10.955 "is_configured": true, 00:22:10.955 "data_offset": 2048, 00:22:10.955 "data_size": 63488 00:22:10.955 } 00:22:10.955 ] 00:22:10.955 }' 00:22:10.955 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:10.955 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:10.955 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:10.955 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:10.955 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:10.955 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:22:11.212 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:22:11.212 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:11.472 [2024-06-10 16:01:16.899931] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:11.472 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:11.472 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:11.472 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:11.472 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:11.472 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 
00:22:11.472 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:11.472 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:11.472 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:11.472 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:11.472 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:11.472 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:11.472 16:01:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:11.797 16:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:11.797 "name": "raid_bdev1", 00:22:11.797 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:11.797 "strip_size_kb": 0, 00:22:11.797 "state": "online", 00:22:11.797 "raid_level": "raid1", 00:22:11.797 "superblock": true, 00:22:11.797 "num_base_bdevs": 2, 00:22:11.797 "num_base_bdevs_discovered": 1, 00:22:11.797 "num_base_bdevs_operational": 1, 00:22:11.797 "base_bdevs_list": [ 00:22:11.797 { 00:22:11.797 "name": null, 00:22:11.797 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:11.797 "is_configured": false, 00:22:11.797 "data_offset": 2048, 00:22:11.797 "data_size": 63488 00:22:11.797 }, 00:22:11.797 { 00:22:11.797 "name": "BaseBdev2", 00:22:11.797 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:11.797 "is_configured": true, 00:22:11.797 "data_offset": 2048, 00:22:11.797 "data_size": 63488 00:22:11.797 } 00:22:11.797 ] 00:22:11.797 }' 00:22:11.797 16:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:11.797 16:01:17 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@10 -- # set +x 00:22:12.364 16:01:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:12.622 [2024-06-10 16:01:18.035147] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:12.622 [2024-06-10 16:01:18.035293] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:12.622 [2024-06-10 16:01:18.035307] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:12.622 [2024-06-10 16:01:18.035331] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:12.622 [2024-06-10 16:01:18.040019] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2174b50 00:22:12.622 [2024-06-10 16:01:18.042175] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:12.622 16:01:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:22:13.557 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:13.557 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:13.557 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:13.557 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:13.557 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:13.815 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:13.815 16:01:19 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:14.073 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:14.073 "name": "raid_bdev1", 00:22:14.073 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:14.073 "strip_size_kb": 0, 00:22:14.073 "state": "online", 00:22:14.073 "raid_level": "raid1", 00:22:14.073 "superblock": true, 00:22:14.073 "num_base_bdevs": 2, 00:22:14.073 "num_base_bdevs_discovered": 2, 00:22:14.073 "num_base_bdevs_operational": 2, 00:22:14.073 "process": { 00:22:14.073 "type": "rebuild", 00:22:14.073 "target": "spare", 00:22:14.073 "progress": { 00:22:14.073 "blocks": 24576, 00:22:14.073 "percent": 38 00:22:14.073 } 00:22:14.073 }, 00:22:14.073 "base_bdevs_list": [ 00:22:14.073 { 00:22:14.073 "name": "spare", 00:22:14.073 "uuid": "93758833-785c-52c7-a815-a784a1eda059", 00:22:14.073 "is_configured": true, 00:22:14.073 "data_offset": 2048, 00:22:14.073 "data_size": 63488 00:22:14.073 }, 00:22:14.073 { 00:22:14.073 "name": "BaseBdev2", 00:22:14.073 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:14.073 "is_configured": true, 00:22:14.073 "data_offset": 2048, 00:22:14.073 "data_size": 63488 00:22:14.073 } 00:22:14.073 ] 00:22:14.073 }' 00:22:14.073 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:14.073 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:14.073 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:14.073 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:14.073 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:14.332 [2024-06-10 16:01:19.661063] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: 
*DEBUG*: spare 00:22:14.332 [2024-06-10 16:01:19.755097] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:14.332 [2024-06-10 16:01:19.755146] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:14.332 [2024-06-10 16:01:19.755161] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:14.332 [2024-06-10 16:01:19.755172] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:14.332 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:14.332 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:14.332 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:14.332 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:14.332 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:14.332 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:14.332 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:14.332 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:14.332 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:14.332 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:14.332 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:14.333 16:01:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:14.591 16:01:20 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:14.591 "name": "raid_bdev1", 00:22:14.591 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:14.591 "strip_size_kb": 0, 00:22:14.591 "state": "online", 00:22:14.591 "raid_level": "raid1", 00:22:14.591 "superblock": true, 00:22:14.591 "num_base_bdevs": 2, 00:22:14.591 "num_base_bdevs_discovered": 1, 00:22:14.591 "num_base_bdevs_operational": 1, 00:22:14.591 "base_bdevs_list": [ 00:22:14.591 { 00:22:14.591 "name": null, 00:22:14.591 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:14.591 "is_configured": false, 00:22:14.591 "data_offset": 2048, 00:22:14.591 "data_size": 63488 00:22:14.591 }, 00:22:14.591 { 00:22:14.591 "name": "BaseBdev2", 00:22:14.591 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:14.591 "is_configured": true, 00:22:14.591 "data_offset": 2048, 00:22:14.591 "data_size": 63488 00:22:14.591 } 00:22:14.591 ] 00:22:14.591 }' 00:22:14.591 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:14.591 16:01:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:15.158 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:15.416 [2024-06-10 16:01:20.894543] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:15.416 [2024-06-10 16:01:20.894593] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:15.416 [2024-06-10 16:01:20.894613] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x232bb20 00:22:15.416 [2024-06-10 16:01:20.894622] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:15.416 [2024-06-10 16:01:20.895010] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:15.416 [2024-06-10 
16:01:20.895027] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:15.416 [2024-06-10 16:01:20.895103] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:22:15.416 [2024-06-10 16:01:20.895114] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:22:15.416 [2024-06-10 16:01:20.895122] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:22:15.416 [2024-06-10 16:01:20.895138] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:15.416 [2024-06-10 16:01:20.899823] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2328560 00:22:15.416 spare 00:22:15.416 [2024-06-10 16:01:20.901408] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:15.416 16:01:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:22:16.790 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:16.790 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:16.790 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:16.790 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:16.790 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:16.790 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:16.790 16:01:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:16.790 16:01:22 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:16.790 "name": "raid_bdev1", 00:22:16.790 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:16.790 "strip_size_kb": 0, 00:22:16.790 "state": "online", 00:22:16.790 "raid_level": "raid1", 00:22:16.790 "superblock": true, 00:22:16.790 "num_base_bdevs": 2, 00:22:16.790 "num_base_bdevs_discovered": 2, 00:22:16.790 "num_base_bdevs_operational": 2, 00:22:16.790 "process": { 00:22:16.790 "type": "rebuild", 00:22:16.790 "target": "spare", 00:22:16.790 "progress": { 00:22:16.790 "blocks": 24576, 00:22:16.790 "percent": 38 00:22:16.790 } 00:22:16.790 }, 00:22:16.790 "base_bdevs_list": [ 00:22:16.790 { 00:22:16.790 "name": "spare", 00:22:16.790 "uuid": "93758833-785c-52c7-a815-a784a1eda059", 00:22:16.790 "is_configured": true, 00:22:16.790 "data_offset": 2048, 00:22:16.790 "data_size": 63488 00:22:16.790 }, 00:22:16.790 { 00:22:16.790 "name": "BaseBdev2", 00:22:16.790 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:16.790 "is_configured": true, 00:22:16.790 "data_offset": 2048, 00:22:16.790 "data_size": 63488 00:22:16.790 } 00:22:16.790 ] 00:22:16.790 }' 00:22:16.790 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:16.790 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:16.790 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:16.790 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:16.790 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:22:17.048 [2024-06-10 16:01:22.508340] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:17.048 [2024-06-10 16:01:22.513626] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild 
on raid bdev raid_bdev1: No such device 00:22:17.048 [2024-06-10 16:01:22.513669] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:17.048 [2024-06-10 16:01:22.513684] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:17.048 [2024-06-10 16:01:22.513690] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:17.048 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:17.048 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:17.048 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:17.048 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:17.049 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:17.049 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:17.049 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:17.049 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:17.049 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:17.049 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:17.049 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:17.049 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:17.307 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:17.307 "name": "raid_bdev1", 00:22:17.307 "uuid": 
"714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:17.307 "strip_size_kb": 0, 00:22:17.307 "state": "online", 00:22:17.307 "raid_level": "raid1", 00:22:17.307 "superblock": true, 00:22:17.307 "num_base_bdevs": 2, 00:22:17.307 "num_base_bdevs_discovered": 1, 00:22:17.307 "num_base_bdevs_operational": 1, 00:22:17.307 "base_bdevs_list": [ 00:22:17.307 { 00:22:17.307 "name": null, 00:22:17.307 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:17.307 "is_configured": false, 00:22:17.307 "data_offset": 2048, 00:22:17.307 "data_size": 63488 00:22:17.307 }, 00:22:17.307 { 00:22:17.307 "name": "BaseBdev2", 00:22:17.307 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:17.307 "is_configured": true, 00:22:17.307 "data_offset": 2048, 00:22:17.307 "data_size": 63488 00:22:17.307 } 00:22:17.307 ] 00:22:17.307 }' 00:22:17.307 16:01:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:17.307 16:01:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:18.244 16:01:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:18.244 16:01:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:18.245 16:01:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:18.245 16:01:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:18.245 16:01:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:18.245 16:01:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:18.245 16:01:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:18.245 16:01:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 
00:22:18.245 "name": "raid_bdev1", 00:22:18.245 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:18.245 "strip_size_kb": 0, 00:22:18.245 "state": "online", 00:22:18.245 "raid_level": "raid1", 00:22:18.245 "superblock": true, 00:22:18.245 "num_base_bdevs": 2, 00:22:18.245 "num_base_bdevs_discovered": 1, 00:22:18.245 "num_base_bdevs_operational": 1, 00:22:18.245 "base_bdevs_list": [ 00:22:18.245 { 00:22:18.245 "name": null, 00:22:18.245 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:18.245 "is_configured": false, 00:22:18.245 "data_offset": 2048, 00:22:18.245 "data_size": 63488 00:22:18.245 }, 00:22:18.245 { 00:22:18.245 "name": "BaseBdev2", 00:22:18.245 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:18.245 "is_configured": true, 00:22:18.245 "data_offset": 2048, 00:22:18.245 "data_size": 63488 00:22:18.245 } 00:22:18.245 ] 00:22:18.245 }' 00:22:18.245 16:01:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:18.245 16:01:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:18.245 16:01:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:18.504 16:01:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:18.504 16:01:23 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:22:18.763 16:01:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:18.763 [2024-06-10 16:01:24.250715] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:18.763 [2024-06-10 16:01:24.250756] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:18.763 
[2024-06-10 16:01:24.250773] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2174710 00:22:18.763 [2024-06-10 16:01:24.250782] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:18.763 [2024-06-10 16:01:24.251140] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:18.763 [2024-06-10 16:01:24.251156] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:18.763 [2024-06-10 16:01:24.251217] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:22:18.763 [2024-06-10 16:01:24.251227] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:22:18.763 [2024-06-10 16:01:24.251234] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:18.763 BaseBdev1 00:22:18.763 16:01:24 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:22:20.140 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:20.140 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:20.140 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:20.140 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:20.140 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:20.140 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:20.140 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:20.140 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:20.140 16:01:25 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:20.140 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:20.140 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.140 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:20.140 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:20.140 "name": "raid_bdev1", 00:22:20.140 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:20.140 "strip_size_kb": 0, 00:22:20.140 "state": "online", 00:22:20.140 "raid_level": "raid1", 00:22:20.140 "superblock": true, 00:22:20.140 "num_base_bdevs": 2, 00:22:20.140 "num_base_bdevs_discovered": 1, 00:22:20.140 "num_base_bdevs_operational": 1, 00:22:20.140 "base_bdevs_list": [ 00:22:20.140 { 00:22:20.140 "name": null, 00:22:20.140 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.140 "is_configured": false, 00:22:20.140 "data_offset": 2048, 00:22:20.140 "data_size": 63488 00:22:20.140 }, 00:22:20.140 { 00:22:20.140 "name": "BaseBdev2", 00:22:20.140 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:20.140 "is_configured": true, 00:22:20.140 "data_offset": 2048, 00:22:20.140 "data_size": 63488 00:22:20.140 } 00:22:20.140 ] 00:22:20.140 }' 00:22:20.140 16:01:25 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:20.141 16:01:25 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:20.706 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:20.706 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:20.706 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 
00:22:20.706 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:20.706 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:20.706 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:20.706 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:20.965 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:20.965 "name": "raid_bdev1", 00:22:20.965 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:20.965 "strip_size_kb": 0, 00:22:20.965 "state": "online", 00:22:20.965 "raid_level": "raid1", 00:22:20.965 "superblock": true, 00:22:20.965 "num_base_bdevs": 2, 00:22:20.965 "num_base_bdevs_discovered": 1, 00:22:20.965 "num_base_bdevs_operational": 1, 00:22:20.965 "base_bdevs_list": [ 00:22:20.965 { 00:22:20.965 "name": null, 00:22:20.965 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:20.965 "is_configured": false, 00:22:20.965 "data_offset": 2048, 00:22:20.965 "data_size": 63488 00:22:20.965 }, 00:22:20.965 { 00:22:20.965 "name": "BaseBdev2", 00:22:20.965 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:20.965 "is_configured": true, 00:22:20.965 "data_offset": 2048, 00:22:20.965 "data_size": 63488 00:22:20.965 } 00:22:20.965 ] 00:22:20.965 }' 00:22:20.965 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:20.965 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:20.965 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:21.224 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:21.224 16:01:26 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:21.224 16:01:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@649 -- # local es=0 00:22:21.224 16:01:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:21.224 16:01:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:21.224 16:01:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:22:21.224 16:01:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:21.224 16:01:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:22:21.224 16:01:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:21.224 16:01:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:22:21.224 16:01:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:22:21.224 16:01:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:22:21.224 16:01:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:22:21.483 [2024-06-10 16:01:26.737386] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: 
bdev BaseBdev1 is claimed 00:22:21.483 [2024-06-10 16:01:26.737505] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:22:21.483 [2024-06-10 16:01:26.737517] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:22:21.483 request: 00:22:21.483 { 00:22:21.483 "raid_bdev": "raid_bdev1", 00:22:21.483 "base_bdev": "BaseBdev1", 00:22:21.483 "method": "bdev_raid_add_base_bdev", 00:22:21.483 "req_id": 1 00:22:21.483 } 00:22:21.483 Got JSON-RPC error response 00:22:21.483 response: 00:22:21.483 { 00:22:21.483 "code": -22, 00:22:21.483 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:22:21.483 } 00:22:21.483 16:01:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # es=1 00:22:21.483 16:01:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:22:21.483 16:01:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:22:21.483 16:01:26 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:22:21.483 16:01:26 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:22:22.418 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:22.418 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:22.418 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:22.418 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:22.418 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:22.418 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:22.418 16:01:27 bdev_raid.raid_rebuild_test_sb 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:22.418 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:22.418 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:22.418 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:22.418 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:22.418 16:01:27 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:22.677 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:22.677 "name": "raid_bdev1", 00:22:22.677 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:22.677 "strip_size_kb": 0, 00:22:22.677 "state": "online", 00:22:22.677 "raid_level": "raid1", 00:22:22.677 "superblock": true, 00:22:22.677 "num_base_bdevs": 2, 00:22:22.677 "num_base_bdevs_discovered": 1, 00:22:22.677 "num_base_bdevs_operational": 1, 00:22:22.677 "base_bdevs_list": [ 00:22:22.677 { 00:22:22.677 "name": null, 00:22:22.677 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:22.677 "is_configured": false, 00:22:22.677 "data_offset": 2048, 00:22:22.677 "data_size": 63488 00:22:22.677 }, 00:22:22.677 { 00:22:22.677 "name": "BaseBdev2", 00:22:22.677 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:22.677 "is_configured": true, 00:22:22.677 "data_offset": 2048, 00:22:22.677 "data_size": 63488 00:22:22.677 } 00:22:22.677 ] 00:22:22.677 }' 00:22:22.677 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:22.677 16:01:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:23.245 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:23.245 
16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:23.245 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:23.245 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:23.245 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:23.245 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:23.245 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:23.503 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:23.503 "name": "raid_bdev1", 00:22:23.503 "uuid": "714b7318-2990-4eed-a79e-dd4ebbaab62b", 00:22:23.503 "strip_size_kb": 0, 00:22:23.503 "state": "online", 00:22:23.503 "raid_level": "raid1", 00:22:23.503 "superblock": true, 00:22:23.503 "num_base_bdevs": 2, 00:22:23.503 "num_base_bdevs_discovered": 1, 00:22:23.503 "num_base_bdevs_operational": 1, 00:22:23.503 "base_bdevs_list": [ 00:22:23.503 { 00:22:23.503 "name": null, 00:22:23.503 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:23.503 "is_configured": false, 00:22:23.503 "data_offset": 2048, 00:22:23.503 "data_size": 63488 00:22:23.503 }, 00:22:23.503 { 00:22:23.503 "name": "BaseBdev2", 00:22:23.503 "uuid": "8a062f5c-cebc-5533-92de-436d40ac9642", 00:22:23.503 "is_configured": true, 00:22:23.503 "data_offset": 2048, 00:22:23.503 "data_size": 63488 00:22:23.503 } 00:22:23.503 ] 00:22:23.503 }' 00:22:23.503 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:23.503 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:23.503 16:01:28 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:23.503 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:23.504 16:01:28 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 2766629 00:22:23.504 16:01:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@949 -- # '[' -z 2766629 ']' 00:22:23.504 16:01:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # kill -0 2766629 00:22:23.504 16:01:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # uname 00:22:23.504 16:01:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:22:23.504 16:01:28 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2766629 00:22:23.762 16:01:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:22:23.762 16:01:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:22:23.762 16:01:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2766629' 00:22:23.762 killing process with pid 2766629 00:22:23.762 16:01:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # kill 2766629 00:22:23.762 Received shutdown signal, test time was about 60.000000 seconds 00:22:23.762 00:22:23.762 Latency(us) 00:22:23.763 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:23.763 =================================================================================================================== 00:22:23.763 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:22:23.763 [2024-06-10 16:01:29.035591] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:23.763 [2024-06-10 16:01:29.035680] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:23.763 [2024-06-10 16:01:29.035723] 
bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:23.763 [2024-06-10 16:01:29.035733] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x232d500 name raid_bdev1, state offline 00:22:23.763 16:01:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@973 -- # wait 2766629 00:22:23.763 [2024-06-10 16:01:29.060552] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:23.763 16:01:29 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:22:23.763 00:22:23.763 real 0m35.878s 00:22:23.763 user 0m53.829s 00:22:23.763 sys 0m5.064s 00:22:23.763 16:01:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:23.763 16:01:29 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:22:23.763 ************************************ 00:22:23.763 END TEST raid_rebuild_test_sb 00:22:23.763 ************************************ 00:22:24.022 16:01:29 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 2 false true true 00:22:24.022 16:01:29 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:22:24.022 16:01:29 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:24.022 16:01:29 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:24.022 ************************************ 00:22:24.022 START TEST raid_rebuild_test_io 00:22:24.022 ************************************ 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 false true true 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:22:24.022 
16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2772969 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2772969 /var/tmp/spdk-raid.sock 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@830 -- # '[' -z 2772969 ']' 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:24.022 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:22:24.022 16:01:29 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:24.022 [2024-06-10 16:01:29.396295] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:22:24.022 [2024-06-10 16:01:29.396349] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2772969 ] 00:22:24.022 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:24.022 Zero copy mechanism will not be used. 
00:22:24.022 [2024-06-10 16:01:29.492768] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:24.282 [2024-06-10 16:01:29.588140] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:22:24.282 [2024-06-10 16:01:29.646121] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:24.282 [2024-06-10 16:01:29.646160] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:24.849 16:01:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:22:24.849 16:01:30 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@863 -- # return 0 00:22:24.849 16:01:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:24.849 16:01:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:25.108 BaseBdev1_malloc 00:22:25.108 16:01:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:25.366 [2024-06-10 16:01:30.848451] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:25.366 [2024-06-10 16:01:30.848495] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:25.366 [2024-06-10 16:01:30.848514] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc0ee90 00:22:25.366 [2024-06-10 16:01:30.848524] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:25.366 [2024-06-10 16:01:30.850234] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:25.366 [2024-06-10 16:01:30.850261] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:25.366 BaseBdev1 
00:22:25.366 16:01:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:25.366 16:01:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:25.626 BaseBdev2_malloc 00:22:25.626 16:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:25.934 [2024-06-10 16:01:31.362470] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:25.934 [2024-06-10 16:01:31.362511] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:25.934 [2024-06-10 16:01:31.362531] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc0f9e0 00:22:25.934 [2024-06-10 16:01:31.362540] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:25.934 [2024-06-10 16:01:31.364086] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:25.934 [2024-06-10 16:01:31.364113] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:25.934 BaseBdev2 00:22:25.934 16:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:26.226 spare_malloc 00:22:26.226 16:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:26.485 spare_delay 00:22:26.485 16:01:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:26.751 [2024-06-10 16:01:32.133077] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:26.751 [2024-06-10 16:01:32.133118] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:26.751 [2024-06-10 16:01:32.133134] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xdbd890 00:22:26.751 [2024-06-10 16:01:32.133143] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:26.751 [2024-06-10 16:01:32.134741] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:26.751 [2024-06-10 16:01:32.134769] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:26.751 spare 00:22:26.751 16:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:27.009 [2024-06-10 16:01:32.385758] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:27.009 [2024-06-10 16:01:32.387093] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:27.010 [2024-06-10 16:01:32.387171] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xdbcda0 00:22:27.010 [2024-06-10 16:01:32.387181] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:22:27.010 [2024-06-10 16:01:32.387386] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdb9780 00:22:27.010 [2024-06-10 16:01:32.387528] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xdbcda0 00:22:27.010 [2024-06-10 16:01:32.387537] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 
0xdbcda0 00:22:27.010 [2024-06-10 16:01:32.387651] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:27.010 16:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:27.010 16:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:27.010 16:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:27.010 16:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:27.010 16:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:27.010 16:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:27.010 16:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:27.010 16:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:27.010 16:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:27.010 16:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:27.010 16:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:27.010 16:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:27.268 16:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:27.268 "name": "raid_bdev1", 00:22:27.268 "uuid": "c5db4361-c494-4231-97e7-04ce38752e0b", 00:22:27.268 "strip_size_kb": 0, 00:22:27.268 "state": "online", 00:22:27.268 "raid_level": "raid1", 00:22:27.268 "superblock": false, 00:22:27.268 "num_base_bdevs": 2, 00:22:27.268 "num_base_bdevs_discovered": 2, 00:22:27.268 "num_base_bdevs_operational": 
2, 00:22:27.268 "base_bdevs_list": [ 00:22:27.268 { 00:22:27.268 "name": "BaseBdev1", 00:22:27.268 "uuid": "a7a98fe2-2fed-5f20-99a3-40f53c71a033", 00:22:27.268 "is_configured": true, 00:22:27.268 "data_offset": 0, 00:22:27.268 "data_size": 65536 00:22:27.268 }, 00:22:27.268 { 00:22:27.268 "name": "BaseBdev2", 00:22:27.268 "uuid": "5ede6db1-6d6b-5452-bb5a-9e81964745e5", 00:22:27.268 "is_configured": true, 00:22:27.268 "data_offset": 0, 00:22:27.268 "data_size": 65536 00:22:27.268 } 00:22:27.268 ] 00:22:27.268 }' 00:22:27.268 16:01:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:27.268 16:01:32 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:27.835 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:27.835 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:28.094 [2024-06-10 16:01:33.509005] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:28.094 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:22:28.094 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:28.094 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:28.353 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:22:28.353 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:22:28.353 16:01:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:28.353 16:01:33 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:28.612 [2024-06-10 16:01:33.903803] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdbf5b0 00:22:28.612 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:28.612 Zero copy mechanism will not be used. 00:22:28.612 Running I/O for 60 seconds... 00:22:28.612 [2024-06-10 16:01:34.026578] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:28.612 [2024-06-10 16:01:34.035476] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xdbf5b0 00:22:28.612 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:28.612 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:28.612 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:28.612 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:28.612 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:28.612 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:28.612 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:28.612 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:28.612 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:28.612 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:28.612 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:22:28.612 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:28.872 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:28.872 "name": "raid_bdev1", 00:22:28.872 "uuid": "c5db4361-c494-4231-97e7-04ce38752e0b", 00:22:28.872 "strip_size_kb": 0, 00:22:28.872 "state": "online", 00:22:28.872 "raid_level": "raid1", 00:22:28.872 "superblock": false, 00:22:28.872 "num_base_bdevs": 2, 00:22:28.872 "num_base_bdevs_discovered": 1, 00:22:28.872 "num_base_bdevs_operational": 1, 00:22:28.872 "base_bdevs_list": [ 00:22:28.872 { 00:22:28.872 "name": null, 00:22:28.872 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:28.872 "is_configured": false, 00:22:28.872 "data_offset": 0, 00:22:28.872 "data_size": 65536 00:22:28.872 }, 00:22:28.872 { 00:22:28.872 "name": "BaseBdev2", 00:22:28.872 "uuid": "5ede6db1-6d6b-5452-bb5a-9e81964745e5", 00:22:28.872 "is_configured": true, 00:22:28.872 "data_offset": 0, 00:22:28.872 "data_size": 65536 00:22:28.872 } 00:22:28.872 ] 00:22:28.872 }' 00:22:28.872 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:28.872 16:01:34 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:29.810 16:01:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:29.810 [2024-06-10 16:01:35.216331] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:29.810 16:01:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:29.810 [2024-06-10 16:01:35.261254] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdbfac0 00:22:29.810 [2024-06-10 16:01:35.263578] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 
00:22:30.069 [2024-06-10 16:01:35.386410] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:30.329 [2024-06-10 16:01:35.607170] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:30.329 [2024-06-10 16:01:35.607330] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:30.589 [2024-06-10 16:01:35.980954] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:30.589 [2024-06-10 16:01:35.981184] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:30.848 [2024-06-10 16:01:36.201510] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:30.848 [2024-06-10 16:01:36.201677] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:30.848 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:30.848 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:30.848 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:30.848 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:30.848 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:30.848 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:30.849 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == 
"raid_bdev1")' 00:22:31.108 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:31.108 "name": "raid_bdev1", 00:22:31.108 "uuid": "c5db4361-c494-4231-97e7-04ce38752e0b", 00:22:31.108 "strip_size_kb": 0, 00:22:31.108 "state": "online", 00:22:31.108 "raid_level": "raid1", 00:22:31.108 "superblock": false, 00:22:31.108 "num_base_bdevs": 2, 00:22:31.108 "num_base_bdevs_discovered": 2, 00:22:31.108 "num_base_bdevs_operational": 2, 00:22:31.108 "process": { 00:22:31.108 "type": "rebuild", 00:22:31.108 "target": "spare", 00:22:31.108 "progress": { 00:22:31.108 "blocks": 14336, 00:22:31.108 "percent": 21 00:22:31.108 } 00:22:31.108 }, 00:22:31.108 "base_bdevs_list": [ 00:22:31.108 { 00:22:31.108 "name": "spare", 00:22:31.108 "uuid": "44668580-09b7-5a9c-9e50-9586ce167ddb", 00:22:31.108 "is_configured": true, 00:22:31.108 "data_offset": 0, 00:22:31.108 "data_size": 65536 00:22:31.108 }, 00:22:31.108 { 00:22:31.108 "name": "BaseBdev2", 00:22:31.108 "uuid": "5ede6db1-6d6b-5452-bb5a-9e81964745e5", 00:22:31.108 "is_configured": true, 00:22:31.108 "data_offset": 0, 00:22:31.108 "data_size": 65536 00:22:31.108 } 00:22:31.108 ] 00:22:31.108 }' 00:22:31.108 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:31.108 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:31.108 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:31.367 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:31.367 16:01:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:31.368 [2024-06-10 16:01:36.829526] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 
offset_end: 24576
00:22:31.368 [2024-06-10 16:01:36.857636] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:22:31.627 [2024-06-10 16:01:37.058254] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device
00:22:31.627 [2024-06-10 16:01:37.077359] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:22:31.627 [2024-06-10 16:01:37.077385] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare
00:22:31.627 [2024-06-10 16:01:37.077393] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device
00:22:31.627 [2024-06-10 16:01:37.109509] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0xdbf5b0
00:22:31.886 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1
00:22:31.886 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1
00:22:31.886 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online
00:22:31.886 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1
00:22:31.886 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0
00:22:31.887 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1
00:22:31.887 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info
00:22:31.887 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs
00:22:31.887 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered
00:22:31.887 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp
00:22:31.887 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- #
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:31.887 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:32.146 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:32.146 "name": "raid_bdev1", 00:22:32.146 "uuid": "c5db4361-c494-4231-97e7-04ce38752e0b", 00:22:32.146 "strip_size_kb": 0, 00:22:32.146 "state": "online", 00:22:32.146 "raid_level": "raid1", 00:22:32.146 "superblock": false, 00:22:32.146 "num_base_bdevs": 2, 00:22:32.146 "num_base_bdevs_discovered": 1, 00:22:32.146 "num_base_bdevs_operational": 1, 00:22:32.146 "base_bdevs_list": [ 00:22:32.146 { 00:22:32.146 "name": null, 00:22:32.146 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.146 "is_configured": false, 00:22:32.146 "data_offset": 0, 00:22:32.146 "data_size": 65536 00:22:32.146 }, 00:22:32.146 { 00:22:32.146 "name": "BaseBdev2", 00:22:32.146 "uuid": "5ede6db1-6d6b-5452-bb5a-9e81964745e5", 00:22:32.146 "is_configured": true, 00:22:32.146 "data_offset": 0, 00:22:32.146 "data_size": 65536 00:22:32.146 } 00:22:32.146 ] 00:22:32.146 }' 00:22:32.146 16:01:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:32.146 16:01:37 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:32.715 16:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:32.715 16:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:32.715 16:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:32.715 16:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:32.715 16:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:32.715 16:01:38 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:32.715 16:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:32.975 16:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:32.975 "name": "raid_bdev1", 00:22:32.975 "uuid": "c5db4361-c494-4231-97e7-04ce38752e0b", 00:22:32.975 "strip_size_kb": 0, 00:22:32.975 "state": "online", 00:22:32.975 "raid_level": "raid1", 00:22:32.975 "superblock": false, 00:22:32.975 "num_base_bdevs": 2, 00:22:32.975 "num_base_bdevs_discovered": 1, 00:22:32.975 "num_base_bdevs_operational": 1, 00:22:32.975 "base_bdevs_list": [ 00:22:32.975 { 00:22:32.975 "name": null, 00:22:32.975 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:32.975 "is_configured": false, 00:22:32.975 "data_offset": 0, 00:22:32.975 "data_size": 65536 00:22:32.975 }, 00:22:32.975 { 00:22:32.975 "name": "BaseBdev2", 00:22:32.975 "uuid": "5ede6db1-6d6b-5452-bb5a-9e81964745e5", 00:22:32.975 "is_configured": true, 00:22:32.975 "data_offset": 0, 00:22:32.975 "data_size": 65536 00:22:32.975 } 00:22:32.975 ] 00:22:32.975 }' 00:22:32.975 16:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:32.975 16:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:32.975 16:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:32.975 16:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:32.975 16:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:33.235 [2024-06-10 16:01:38.675845] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:33.235 16:01:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:33.494 [2024-06-10 16:01:38.747757] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xdbfac0 00:22:33.494 [2024-06-10 16:01:38.749272] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:33.494 [2024-06-10 16:01:38.900093] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:33.754 [2024-06-10 16:01:39.191877] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:34.323 [2024-06-10 16:01:39.533817] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:34.323 [2024-06-10 16:01:39.534098] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:34.323 16:01:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:34.323 16:01:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:34.323 16:01:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:34.323 16:01:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:34.323 16:01:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:34.323 16:01:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.323 16:01:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:34.323 [2024-06-10 16:01:39.745933] 
bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:34.582 16:01:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:34.582 "name": "raid_bdev1", 00:22:34.582 "uuid": "c5db4361-c494-4231-97e7-04ce38752e0b", 00:22:34.582 "strip_size_kb": 0, 00:22:34.582 "state": "online", 00:22:34.582 "raid_level": "raid1", 00:22:34.582 "superblock": false, 00:22:34.582 "num_base_bdevs": 2, 00:22:34.582 "num_base_bdevs_discovered": 2, 00:22:34.582 "num_base_bdevs_operational": 2, 00:22:34.582 "process": { 00:22:34.582 "type": "rebuild", 00:22:34.582 "target": "spare", 00:22:34.582 "progress": { 00:22:34.582 "blocks": 12288, 00:22:34.582 "percent": 18 00:22:34.582 } 00:22:34.582 }, 00:22:34.582 "base_bdevs_list": [ 00:22:34.582 { 00:22:34.582 "name": "spare", 00:22:34.582 "uuid": "44668580-09b7-5a9c-9e50-9586ce167ddb", 00:22:34.582 "is_configured": true, 00:22:34.582 "data_offset": 0, 00:22:34.582 "data_size": 65536 00:22:34.582 }, 00:22:34.582 { 00:22:34.582 "name": "BaseBdev2", 00:22:34.582 "uuid": "5ede6db1-6d6b-5452-bb5a-9e81964745e5", 00:22:34.582 "is_configured": true, 00:22:34.582 "data_offset": 0, 00:22:34.582 "data_size": 65536 00:22:34.582 } 00:22:34.582 ] 00:22:34.582 }' 00:22:34.582 16:01:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:34.582 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:34.582 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:34.582 [2024-06-10 16:01:40.062075] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:22:34.582 [2024-06-10 16:01:40.062379] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:22:34.582 16:01:40 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:34.582 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:22:34.582 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:34.582 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:34.582 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:22:34.582 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=822 00:22:34.582 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:34.582 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:34.582 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:34.582 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:34.582 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:34.582 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:34.582 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:34.582 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:35.152 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:35.152 "name": "raid_bdev1", 00:22:35.152 "uuid": "c5db4361-c494-4231-97e7-04ce38752e0b", 00:22:35.152 "strip_size_kb": 0, 00:22:35.152 "state": "online", 00:22:35.152 "raid_level": "raid1", 00:22:35.152 "superblock": false, 00:22:35.152 "num_base_bdevs": 2, 00:22:35.152 
"num_base_bdevs_discovered": 2, 00:22:35.152 "num_base_bdevs_operational": 2, 00:22:35.152 "process": { 00:22:35.152 "type": "rebuild", 00:22:35.152 "target": "spare", 00:22:35.152 "progress": { 00:22:35.152 "blocks": 16384, 00:22:35.152 "percent": 25 00:22:35.152 } 00:22:35.152 }, 00:22:35.152 "base_bdevs_list": [ 00:22:35.152 { 00:22:35.152 "name": "spare", 00:22:35.152 "uuid": "44668580-09b7-5a9c-9e50-9586ce167ddb", 00:22:35.152 "is_configured": true, 00:22:35.152 "data_offset": 0, 00:22:35.152 "data_size": 65536 00:22:35.152 }, 00:22:35.152 { 00:22:35.152 "name": "BaseBdev2", 00:22:35.152 "uuid": "5ede6db1-6d6b-5452-bb5a-9e81964745e5", 00:22:35.152 "is_configured": true, 00:22:35.152 "data_offset": 0, 00:22:35.152 "data_size": 65536 00:22:35.152 } 00:22:35.152 ] 00:22:35.152 }' 00:22:35.152 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:35.152 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:35.152 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:35.152 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:35.152 16:01:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:35.152 [2024-06-10 16:01:40.633532] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:22:35.411 [2024-06-10 16:01:40.873846] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:22:35.411 [2024-06-10 16:01:40.874142] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:22:35.979 [2024-06-10 16:01:41.336793] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 
00:22:35.979 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:35.979 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:35.979 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:35.979 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:35.979 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:35.979 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:35.979 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:35.979 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:36.238 [2024-06-10 16:01:41.679979] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:22:36.239 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:36.239 "name": "raid_bdev1", 00:22:36.239 "uuid": "c5db4361-c494-4231-97e7-04ce38752e0b", 00:22:36.239 "strip_size_kb": 0, 00:22:36.239 "state": "online", 00:22:36.239 "raid_level": "raid1", 00:22:36.239 "superblock": false, 00:22:36.239 "num_base_bdevs": 2, 00:22:36.239 "num_base_bdevs_discovered": 2, 00:22:36.239 "num_base_bdevs_operational": 2, 00:22:36.239 "process": { 00:22:36.239 "type": "rebuild", 00:22:36.239 "target": "spare", 00:22:36.239 "progress": { 00:22:36.239 "blocks": 40960, 00:22:36.239 "percent": 62 00:22:36.239 } 00:22:36.239 }, 00:22:36.239 "base_bdevs_list": [ 00:22:36.239 { 00:22:36.239 "name": "spare", 00:22:36.239 "uuid": "44668580-09b7-5a9c-9e50-9586ce167ddb", 00:22:36.239 "is_configured": true, 
00:22:36.239 "data_offset": 0, 00:22:36.239 "data_size": 65536 00:22:36.239 }, 00:22:36.239 { 00:22:36.239 "name": "BaseBdev2", 00:22:36.239 "uuid": "5ede6db1-6d6b-5452-bb5a-9e81964745e5", 00:22:36.239 "is_configured": true, 00:22:36.239 "data_offset": 0, 00:22:36.239 "data_size": 65536 00:22:36.239 } 00:22:36.239 ] 00:22:36.239 }' 00:22:36.239 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:36.498 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:36.498 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:36.498 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:36.498 16:01:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:37.066 [2024-06-10 16:01:42.361252] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 53248 offset_begin: 49152 offset_end: 55296 00:22:37.326 16:01:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:37.326 16:01:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:37.326 16:01:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:37.326 16:01:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:37.326 16:01:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:37.326 16:01:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:37.326 16:01:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:37.326 16:01:42 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq 
-r '.[] | select(.name == "raid_bdev1")' 00:22:37.585 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:37.585 "name": "raid_bdev1", 00:22:37.585 "uuid": "c5db4361-c494-4231-97e7-04ce38752e0b", 00:22:37.585 "strip_size_kb": 0, 00:22:37.585 "state": "online", 00:22:37.585 "raid_level": "raid1", 00:22:37.585 "superblock": false, 00:22:37.585 "num_base_bdevs": 2, 00:22:37.585 "num_base_bdevs_discovered": 2, 00:22:37.585 "num_base_bdevs_operational": 2, 00:22:37.585 "process": { 00:22:37.585 "type": "rebuild", 00:22:37.585 "target": "spare", 00:22:37.585 "progress": { 00:22:37.585 "blocks": 63488, 00:22:37.585 "percent": 96 00:22:37.585 } 00:22:37.585 }, 00:22:37.585 "base_bdevs_list": [ 00:22:37.585 { 00:22:37.585 "name": "spare", 00:22:37.585 "uuid": "44668580-09b7-5a9c-9e50-9586ce167ddb", 00:22:37.585 "is_configured": true, 00:22:37.585 "data_offset": 0, 00:22:37.585 "data_size": 65536 00:22:37.585 }, 00:22:37.585 { 00:22:37.585 "name": "BaseBdev2", 00:22:37.585 "uuid": "5ede6db1-6d6b-5452-bb5a-9e81964745e5", 00:22:37.585 "is_configured": true, 00:22:37.585 "data_offset": 0, 00:22:37.585 "data_size": 65536 00:22:37.585 } 00:22:37.585 ] 00:22:37.585 }' 00:22:37.585 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:37.844 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:37.844 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:37.844 [2024-06-10 16:01:43.137311] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:37.844 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:37.844 16:01:43 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:37.844 [2024-06-10 16:01:43.237565] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished 
rebuild on raid bdev raid_bdev1 00:22:37.844 [2024-06-10 16:01:43.248272] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:38.781 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:38.781 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:38.781 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:38.781 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:38.781 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:38.781 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:38.781 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:38.781 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:39.040 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:39.040 "name": "raid_bdev1", 00:22:39.040 "uuid": "c5db4361-c494-4231-97e7-04ce38752e0b", 00:22:39.040 "strip_size_kb": 0, 00:22:39.040 "state": "online", 00:22:39.040 "raid_level": "raid1", 00:22:39.040 "superblock": false, 00:22:39.040 "num_base_bdevs": 2, 00:22:39.040 "num_base_bdevs_discovered": 2, 00:22:39.040 "num_base_bdevs_operational": 2, 00:22:39.040 "base_bdevs_list": [ 00:22:39.040 { 00:22:39.040 "name": "spare", 00:22:39.040 "uuid": "44668580-09b7-5a9c-9e50-9586ce167ddb", 00:22:39.040 "is_configured": true, 00:22:39.040 "data_offset": 0, 00:22:39.040 "data_size": 65536 00:22:39.040 }, 00:22:39.040 { 00:22:39.040 "name": "BaseBdev2", 00:22:39.040 "uuid": "5ede6db1-6d6b-5452-bb5a-9e81964745e5", 00:22:39.040 "is_configured": true, 
00:22:39.040 "data_offset": 0, 00:22:39.040 "data_size": 65536 00:22:39.040 } 00:22:39.040 ] 00:22:39.040 }' 00:22:39.040 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:39.040 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:39.040 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:39.040 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:39.040 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:22:39.040 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:39.040 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:39.040 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:39.040 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:39.040 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:39.040 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:39.040 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.299 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:39.299 "name": "raid_bdev1", 00:22:39.299 "uuid": "c5db4361-c494-4231-97e7-04ce38752e0b", 00:22:39.299 "strip_size_kb": 0, 00:22:39.299 "state": "online", 00:22:39.299 "raid_level": "raid1", 00:22:39.299 "superblock": false, 00:22:39.299 "num_base_bdevs": 2, 00:22:39.299 "num_base_bdevs_discovered": 2, 00:22:39.299 "num_base_bdevs_operational": 2, 00:22:39.299 
"base_bdevs_list": [ 00:22:39.299 { 00:22:39.299 "name": "spare", 00:22:39.299 "uuid": "44668580-09b7-5a9c-9e50-9586ce167ddb", 00:22:39.299 "is_configured": true, 00:22:39.299 "data_offset": 0, 00:22:39.299 "data_size": 65536 00:22:39.299 }, 00:22:39.299 { 00:22:39.299 "name": "BaseBdev2", 00:22:39.299 "uuid": "5ede6db1-6d6b-5452-bb5a-9e81964745e5", 00:22:39.299 "is_configured": true, 00:22:39.299 "data_offset": 0, 00:22:39.299 "data_size": 65536 00:22:39.299 } 00:22:39.299 ] 00:22:39.299 }' 00:22:39.299 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:39.557 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:39.557 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:39.557 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:39.557 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:39.557 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:39.557 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:39.557 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:39.557 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:39.557 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:39.557 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:39.557 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:39.557 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:39.557 16:01:44 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:39.557 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:39.557 16:01:44 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:39.814 16:01:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:39.814 "name": "raid_bdev1", 00:22:39.814 "uuid": "c5db4361-c494-4231-97e7-04ce38752e0b", 00:22:39.814 "strip_size_kb": 0, 00:22:39.814 "state": "online", 00:22:39.814 "raid_level": "raid1", 00:22:39.814 "superblock": false, 00:22:39.814 "num_base_bdevs": 2, 00:22:39.814 "num_base_bdevs_discovered": 2, 00:22:39.814 "num_base_bdevs_operational": 2, 00:22:39.814 "base_bdevs_list": [ 00:22:39.814 { 00:22:39.814 "name": "spare", 00:22:39.814 "uuid": "44668580-09b7-5a9c-9e50-9586ce167ddb", 00:22:39.814 "is_configured": true, 00:22:39.814 "data_offset": 0, 00:22:39.814 "data_size": 65536 00:22:39.814 }, 00:22:39.814 { 00:22:39.814 "name": "BaseBdev2", 00:22:39.814 "uuid": "5ede6db1-6d6b-5452-bb5a-9e81964745e5", 00:22:39.814 "is_configured": true, 00:22:39.814 "data_offset": 0, 00:22:39.814 "data_size": 65536 00:22:39.814 } 00:22:39.814 ] 00:22:39.814 }' 00:22:39.814 16:01:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:39.814 16:01:45 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:40.382 16:01:45 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:40.642 [2024-06-10 16:01:46.010283] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:40.642 [2024-06-10 16:01:46.010315] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from 
online to offline
00:22:40.642
00:22:40.642 Latency(us)
00:22:40.642 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:22:40.642 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728)
00:22:40.642 raid_bdev1 : 12.17 99.39 298.16 0.00 0.00 13791.89 302.32 119337.94
00:22:40.642 ===================================================================================================================
00:22:40.642 Total : 99.39 298.16 0.00 0.00 13791.89 302.32 119337.94
00:22:40.642 [2024-06-10 16:01:46.114722] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb
00:22:40.642 [2024-06-10 16:01:46.114748] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct
00:22:40.642 [2024-06-10 16:01:46.114823] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct
00:22:40.642 [2024-06-10 16:01:46.114832] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xdbcda0 name raid_bdev1, state offline
00:22:40.642 0
00:22:40.642 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all
00:22:40.926 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length
00:22:40.926 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]]
00:22:40.926 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']'
00:22:40.926 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']'
00:22:40.926 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0
00:22:40.926 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock
00:22:40.926 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10
-- # bdev_list=('spare') 00:22:40.926 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:40.926 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:40.926 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:40.926 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:22:40.926 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:40.926 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:40.926 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:22:41.207 /dev/nbd0 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:41.207 1+0 records in 00:22:41.207 1+0 records out 00:22:41.207 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000159643 s, 25.7 MB/s 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:41.207 16:01:46 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:41.207 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:22:41.466 /dev/nbd1 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:41.466 1+0 records in 00:22:41.466 1+0 records out 00:22:41.466 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000183839 s, 22.3 MB/s 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:41.466 16:01:46 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:22:41.725 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:22:41.725 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:41.725 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:22:41.725 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:41.725 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:22:41.725 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:41.725 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:22:41.985 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:22:41.985 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:22:41.985 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local 
nbd_name=nbd1 00:22:41.985 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:41.985 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:41.985 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:22:41.985 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:22:41.985 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:22:41.985 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:22:41.985 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:41.985 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:22:41.985 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:22:41.985 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:22:41.985 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:22:41.985 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:22:42.245 16:01:47 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 2772969 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@949 -- # '[' -z 2772969 ']' 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # kill -0 2772969 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # uname 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2772969 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2772969' 00:22:42.245 killing process with pid 2772969 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # kill 2772969 00:22:42.245 Received shutdown signal, test time was about 13.672003 seconds 00:22:42.245 00:22:42.245 Latency(us) 00:22:42.245 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:42.245 =================================================================================================================== 00:22:42.245 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:22:42.245 [2024-06-10 16:01:47.611355] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:22:42.245 16:01:47 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@973 -- # wait 2772969 00:22:42.245 [2024-06-10 16:01:47.630891] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # return 0 00:22:42.506 00:22:42.506 real 0m18.501s 00:22:42.506 user 0m28.896s 00:22:42.506 sys 0m2.237s 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:22:42.506 ************************************ 00:22:42.506 END TEST raid_rebuild_test_io 00:22:42.506 ************************************ 00:22:42.506 16:01:47 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 2 true true true 00:22:42.506 16:01:47 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:22:42.506 16:01:47 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:22:42.506 16:01:47 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:22:42.506 ************************************ 00:22:42.506 START TEST raid_rebuild_test_sb_io 00:22:42.506 ************************************ 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true true true 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 
00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # 
raid_pid=2776154 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2776154 /var/tmp/spdk-raid.sock 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@830 -- # '[' -z 2776154 ']' 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:22:42.506 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:22:42.506 16:01:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:42.506 [2024-06-10 16:01:47.965133] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:22:42.506 [2024-06-10 16:01:47.965187] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2776154 ] 00:22:42.506 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:42.506 Zero copy mechanism will not be used. 
00:22:42.765 [2024-06-10 16:01:48.063831] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:22:42.765 [2024-06-10 16:01:48.152203] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:22:42.765 [2024-06-10 16:01:48.212320] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:42.765 [2024-06-10 16:01:48.212353] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:22:43.703 16:01:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:22:43.703 16:01:48 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@863 -- # return 0 00:22:43.703 16:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:43.703 16:01:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:22:43.703 BaseBdev1_malloc 00:22:43.703 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:22:43.963 [2024-06-10 16:01:49.402439] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:22:43.963 [2024-06-10 16:01:49.402487] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:43.963 [2024-06-10 16:01:49.402505] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x173ce90 00:22:43.963 [2024-06-10 16:01:49.402514] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:43.963 [2024-06-10 16:01:49.404133] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:43.963 [2024-06-10 16:01:49.404160] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:22:43.963 
BaseBdev1 00:22:43.963 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:22:43.963 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:22:44.222 BaseBdev2_malloc 00:22:44.222 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:22:44.481 [2024-06-10 16:01:49.904266] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:22:44.481 [2024-06-10 16:01:49.904305] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:44.481 [2024-06-10 16:01:49.904329] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x173d9e0 00:22:44.481 [2024-06-10 16:01:49.904339] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:44.481 [2024-06-10 16:01:49.905783] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:44.481 [2024-06-10 16:01:49.905807] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:22:44.481 BaseBdev2 00:22:44.481 16:01:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:22:44.740 spare_malloc 00:22:44.740 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:22:44.999 spare_delay 00:22:44.999 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:22:45.258 [2024-06-10 16:01:50.670823] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:22:45.258 [2024-06-10 16:01:50.670860] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:22:45.258 [2024-06-10 16:01:50.670876] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18eb890 00:22:45.258 [2024-06-10 16:01:50.670885] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:22:45.258 [2024-06-10 16:01:50.672364] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:22:45.258 [2024-06-10 16:01:50.672390] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:22:45.258 spare 00:22:45.258 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:22:45.517 [2024-06-10 16:01:50.919501] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:22:45.517 [2024-06-10 16:01:50.920731] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:22:45.517 [2024-06-10 16:01:50.920888] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18eada0 00:22:45.517 [2024-06-10 16:01:50.920900] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:22:45.517 [2024-06-10 16:01:50.921093] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x173b970 00:22:45.517 [2024-06-10 16:01:50.921231] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18eada0 00:22:45.517 [2024-06-10 16:01:50.921240] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x18eada0 00:22:45.517 [2024-06-10 16:01:50.921332] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:45.517 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:45.517 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:45.517 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:45.517 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:45.517 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:45.517 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:45.517 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:45.517 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:45.517 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:45.517 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:45.517 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:45.517 16:01:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:45.776 16:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:45.776 "name": "raid_bdev1", 00:22:45.776 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:22:45.776 "strip_size_kb": 0, 00:22:45.776 "state": "online", 00:22:45.776 "raid_level": "raid1", 00:22:45.776 "superblock": true, 00:22:45.776 "num_base_bdevs": 2, 00:22:45.776 
"num_base_bdevs_discovered": 2, 00:22:45.776 "num_base_bdevs_operational": 2, 00:22:45.776 "base_bdevs_list": [ 00:22:45.776 { 00:22:45.776 "name": "BaseBdev1", 00:22:45.776 "uuid": "338f8c2a-ace0-521b-a896-e663e78d43bf", 00:22:45.776 "is_configured": true, 00:22:45.776 "data_offset": 2048, 00:22:45.776 "data_size": 63488 00:22:45.776 }, 00:22:45.776 { 00:22:45.776 "name": "BaseBdev2", 00:22:45.776 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:22:45.776 "is_configured": true, 00:22:45.776 "data_offset": 2048, 00:22:45.776 "data_size": 63488 00:22:45.776 } 00:22:45.776 ] 00:22:45.776 }' 00:22:45.776 16:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:45.776 16:01:51 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:46.342 16:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:22:46.343 16:01:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:22:46.601 [2024-06-10 16:01:52.078842] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:22:46.601 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:22:46.601 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:46.601 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:22:46.860 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:22:46.860 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:22:46.860 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:22:46.860 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:22:47.119 [2024-06-10 16:01:52.457611] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18ee680 00:22:47.119 I/O size of 3145728 is greater than zero copy threshold (65536). 00:22:47.119 Zero copy mechanism will not be used. 00:22:47.119 Running I/O for 60 seconds... 00:22:47.119 [2024-06-10 16:01:52.520550] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:22:47.119 [2024-06-10 16:01:52.529458] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x18ee680 00:22:47.119 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:47.119 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:47.119 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:47.119 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:47.119 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:47.119 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:47.119 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:47.119 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:47.119 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:47.119 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 
00:22:47.119 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:47.119 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:47.377 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:47.377 "name": "raid_bdev1", 00:22:47.377 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:22:47.377 "strip_size_kb": 0, 00:22:47.377 "state": "online", 00:22:47.377 "raid_level": "raid1", 00:22:47.377 "superblock": true, 00:22:47.377 "num_base_bdevs": 2, 00:22:47.377 "num_base_bdevs_discovered": 1, 00:22:47.378 "num_base_bdevs_operational": 1, 00:22:47.378 "base_bdevs_list": [ 00:22:47.378 { 00:22:47.378 "name": null, 00:22:47.378 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:47.378 "is_configured": false, 00:22:47.378 "data_offset": 2048, 00:22:47.378 "data_size": 63488 00:22:47.378 }, 00:22:47.378 { 00:22:47.378 "name": "BaseBdev2", 00:22:47.378 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:22:47.378 "is_configured": true, 00:22:47.378 "data_offset": 2048, 00:22:47.378 "data_size": 63488 00:22:47.378 } 00:22:47.378 ] 00:22:47.378 }' 00:22:47.378 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:47.378 16:01:52 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:48.314 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:48.314 [2024-06-10 16:01:53.735392] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:48.314 [2024-06-10 16:01:53.781329] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1870830 00:22:48.314 [2024-06-10 
16:01:53.783493] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:48.314 16:01:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:22:48.572 [2024-06-10 16:01:53.914950] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:48.831 [2024-06-10 16:01:54.153503] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:48.831 [2024-06-10 16:01:54.153632] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:49.091 [2024-06-10 16:01:54.513038] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:49.091 [2024-06-10 16:01:54.513332] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:22:49.350 [2024-06-10 16:01:54.716652] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:49.350 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:49.350 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:49.350 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:49.350 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:49.350 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:49.350 16:01:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:49.350 16:01:54 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:49.608 [2024-06-10 16:01:54.920924] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:22:49.608 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:49.608 "name": "raid_bdev1", 00:22:49.608 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:22:49.608 "strip_size_kb": 0, 00:22:49.608 "state": "online", 00:22:49.608 "raid_level": "raid1", 00:22:49.608 "superblock": true, 00:22:49.608 "num_base_bdevs": 2, 00:22:49.608 "num_base_bdevs_discovered": 2, 00:22:49.608 "num_base_bdevs_operational": 2, 00:22:49.608 "process": { 00:22:49.608 "type": "rebuild", 00:22:49.608 "target": "spare", 00:22:49.609 "progress": { 00:22:49.609 "blocks": 16384, 00:22:49.609 "percent": 25 00:22:49.609 } 00:22:49.609 }, 00:22:49.609 "base_bdevs_list": [ 00:22:49.609 { 00:22:49.609 "name": "spare", 00:22:49.609 "uuid": "4ad7e5ff-6169-5b4e-8bb1-c6365a0c0912", 00:22:49.609 "is_configured": true, 00:22:49.609 "data_offset": 2048, 00:22:49.609 "data_size": 63488 00:22:49.609 }, 00:22:49.609 { 00:22:49.609 "name": "BaseBdev2", 00:22:49.609 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:22:49.609 "is_configured": true, 00:22:49.609 "data_offset": 2048, 00:22:49.609 "data_size": 63488 00:22:49.609 } 00:22:49.609 ] 00:22:49.609 }' 00:22:49.609 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:49.609 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:49.609 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:49.867 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:49.867 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:22:49.867 [2024-06-10 16:01:55.248346] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:22:50.126 [2024-06-10 16:01:55.379108] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:50.126 [2024-06-10 16:01:55.379176] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:22:50.126 [2024-06-10 16:01:55.488091] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:22:50.126 [2024-06-10 16:01:55.489674] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:50.126 [2024-06-10 16:01:55.489700] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:22:50.126 [2024-06-10 16:01:55.489709] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:22:50.126 [2024-06-10 16:01:55.530568] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x18ee680 00:22:50.126 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:22:50.126 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:50.126 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:50.126 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:50.126 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:50.126 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:22:50.126 16:01:55 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:50.126 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:50.126 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:50.126 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:50.126 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:50.126 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:50.384 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:50.384 "name": "raid_bdev1", 00:22:50.384 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:22:50.384 "strip_size_kb": 0, 00:22:50.384 "state": "online", 00:22:50.384 "raid_level": "raid1", 00:22:50.384 "superblock": true, 00:22:50.384 "num_base_bdevs": 2, 00:22:50.384 "num_base_bdevs_discovered": 1, 00:22:50.384 "num_base_bdevs_operational": 1, 00:22:50.384 "base_bdevs_list": [ 00:22:50.384 { 00:22:50.384 "name": null, 00:22:50.384 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:50.384 "is_configured": false, 00:22:50.384 "data_offset": 2048, 00:22:50.384 "data_size": 63488 00:22:50.384 }, 00:22:50.384 { 00:22:50.384 "name": "BaseBdev2", 00:22:50.384 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:22:50.384 "is_configured": true, 00:22:50.384 "data_offset": 2048, 00:22:50.384 "data_size": 63488 00:22:50.384 } 00:22:50.384 ] 00:22:50.384 }' 00:22:50.384 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:50.384 16:01:55 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:51.319 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 
none none 00:22:51.319 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:51.319 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:51.319 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:51.319 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:51.319 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:51.319 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:51.319 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:51.319 "name": "raid_bdev1", 00:22:51.319 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:22:51.319 "strip_size_kb": 0, 00:22:51.319 "state": "online", 00:22:51.319 "raid_level": "raid1", 00:22:51.319 "superblock": true, 00:22:51.319 "num_base_bdevs": 2, 00:22:51.319 "num_base_bdevs_discovered": 1, 00:22:51.319 "num_base_bdevs_operational": 1, 00:22:51.319 "base_bdevs_list": [ 00:22:51.319 { 00:22:51.319 "name": null, 00:22:51.319 "uuid": "00000000-0000-0000-0000-000000000000", 00:22:51.319 "is_configured": false, 00:22:51.319 "data_offset": 2048, 00:22:51.319 "data_size": 63488 00:22:51.319 }, 00:22:51.319 { 00:22:51.319 "name": "BaseBdev2", 00:22:51.319 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:22:51.319 "is_configured": true, 00:22:51.319 "data_offset": 2048, 00:22:51.319 "data_size": 63488 00:22:51.319 } 00:22:51.319 ] 00:22:51.320 }' 00:22:51.320 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:51.320 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:51.320 
16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:51.578 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:51.578 16:01:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:22:51.836 [2024-06-10 16:01:57.095323] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:22:51.836 [2024-06-10 16:01:57.158583] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1874100 00:22:51.836 [2024-06-10 16:01:57.160122] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:22:51.836 16:01:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:22:51.837 [2024-06-10 16:01:57.288537] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:51.837 [2024-06-10 16:01:57.288800] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:22:52.095 [2024-06-10 16:01:57.530236] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:52.095 [2024-06-10 16:01:57.530472] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:22:52.662 [2024-06-10 16:01:58.028323] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:52.662 [2024-06-10 16:01:58.028525] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:22:52.662 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild 
spare 00:22:52.662 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:52.662 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:52.662 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:52.662 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:52.662 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:52.662 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:52.932 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:52.932 "name": "raid_bdev1", 00:22:52.932 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:22:52.932 "strip_size_kb": 0, 00:22:52.932 "state": "online", 00:22:52.932 "raid_level": "raid1", 00:22:52.932 "superblock": true, 00:22:52.932 "num_base_bdevs": 2, 00:22:52.932 "num_base_bdevs_discovered": 2, 00:22:52.932 "num_base_bdevs_operational": 2, 00:22:52.932 "process": { 00:22:52.932 "type": "rebuild", 00:22:52.932 "target": "spare", 00:22:52.932 "progress": { 00:22:52.932 "blocks": 12288, 00:22:52.932 "percent": 19 00:22:52.932 } 00:22:52.932 }, 00:22:52.932 "base_bdevs_list": [ 00:22:52.932 { 00:22:52.932 "name": "spare", 00:22:52.932 "uuid": "4ad7e5ff-6169-5b4e-8bb1-c6365a0c0912", 00:22:52.932 "is_configured": true, 00:22:52.932 "data_offset": 2048, 00:22:52.932 "data_size": 63488 00:22:52.932 }, 00:22:52.932 { 00:22:52.932 "name": "BaseBdev2", 00:22:52.932 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:22:52.932 "is_configured": true, 00:22:52.932 "data_offset": 2048, 00:22:52.932 "data_size": 63488 00:22:52.932 } 00:22:52.932 ] 00:22:52.932 }' 00:22:52.932 16:01:58 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:53.190 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:53.190 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:53.190 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:53.190 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:22:53.190 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:22:53.190 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:22:53.190 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:22:53.190 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:22:53.190 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:22:53.190 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=840 00:22:53.190 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:53.190 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:53.190 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:53.190 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:53.190 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:53.190 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:53.190 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:53.190 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:53.449 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:53.449 "name": "raid_bdev1", 00:22:53.449 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:22:53.449 "strip_size_kb": 0, 00:22:53.449 "state": "online", 00:22:53.449 "raid_level": "raid1", 00:22:53.449 "superblock": true, 00:22:53.449 "num_base_bdevs": 2, 00:22:53.449 "num_base_bdevs_discovered": 2, 00:22:53.449 "num_base_bdevs_operational": 2, 00:22:53.449 "process": { 00:22:53.449 "type": "rebuild", 00:22:53.449 "target": "spare", 00:22:53.449 "progress": { 00:22:53.449 "blocks": 18432, 00:22:53.449 "percent": 29 00:22:53.449 } 00:22:53.449 }, 00:22:53.449 "base_bdevs_list": [ 00:22:53.449 { 00:22:53.449 "name": "spare", 00:22:53.449 "uuid": "4ad7e5ff-6169-5b4e-8bb1-c6365a0c0912", 00:22:53.449 "is_configured": true, 00:22:53.449 "data_offset": 2048, 00:22:53.449 "data_size": 63488 00:22:53.449 }, 00:22:53.449 { 00:22:53.449 "name": "BaseBdev2", 00:22:53.449 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:22:53.449 "is_configured": true, 00:22:53.449 "data_offset": 2048, 00:22:53.449 "data_size": 63488 00:22:53.449 } 00:22:53.449 ] 00:22:53.449 }' 00:22:53.449 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:53.449 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:53.449 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:53.449 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:53.449 16:01:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:53.708 
[2024-06-10 16:01:59.205207] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:22:53.708 [2024-06-10 16:01:59.205435] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:22:53.967 [2024-06-10 16:01:59.443916] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:22:54.535 16:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:54.535 16:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:54.535 [2024-06-10 16:01:59.893740] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:22:54.535 16:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:54.535 16:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:54.535 16:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:54.535 16:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:54.535 16:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:54.535 16:01:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:54.794 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:54.794 "name": "raid_bdev1", 00:22:54.794 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:22:54.794 "strip_size_kb": 0, 00:22:54.794 "state": "online", 00:22:54.794 "raid_level": "raid1", 00:22:54.794 "superblock": true, 
00:22:54.794 "num_base_bdevs": 2, 00:22:54.794 "num_base_bdevs_discovered": 2, 00:22:54.794 "num_base_bdevs_operational": 2, 00:22:54.794 "process": { 00:22:54.794 "type": "rebuild", 00:22:54.794 "target": "spare", 00:22:54.794 "progress": { 00:22:54.794 "blocks": 36864, 00:22:54.794 "percent": 58 00:22:54.794 } 00:22:54.794 }, 00:22:54.794 "base_bdevs_list": [ 00:22:54.794 { 00:22:54.794 "name": "spare", 00:22:54.794 "uuid": "4ad7e5ff-6169-5b4e-8bb1-c6365a0c0912", 00:22:54.794 "is_configured": true, 00:22:54.794 "data_offset": 2048, 00:22:54.794 "data_size": 63488 00:22:54.794 }, 00:22:54.794 { 00:22:54.794 "name": "BaseBdev2", 00:22:54.794 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:22:54.794 "is_configured": true, 00:22:54.794 "data_offset": 2048, 00:22:54.794 "data_size": 63488 00:22:54.794 } 00:22:54.794 ] 00:22:54.794 }' 00:22:54.794 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:54.794 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:54.794 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:54.795 [2024-06-10 16:02:00.226169] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:22:54.795 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:54.795 16:02:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:55.055 [2024-06-10 16:02:00.345305] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:22:55.348 [2024-06-10 16:02:00.686760] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:22:55.916 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < 
timeout )) 00:22:55.916 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:55.916 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:55.916 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:55.916 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:55.916 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:55.916 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:55.916 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:56.175 [2024-06-10 16:02:01.460748] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:22:56.175 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:56.175 "name": "raid_bdev1", 00:22:56.175 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:22:56.175 "strip_size_kb": 0, 00:22:56.175 "state": "online", 00:22:56.175 "raid_level": "raid1", 00:22:56.175 "superblock": true, 00:22:56.175 "num_base_bdevs": 2, 00:22:56.175 "num_base_bdevs_discovered": 2, 00:22:56.175 "num_base_bdevs_operational": 2, 00:22:56.175 "process": { 00:22:56.175 "type": "rebuild", 00:22:56.175 "target": "spare", 00:22:56.175 "progress": { 00:22:56.175 "blocks": 57344, 00:22:56.175 "percent": 90 00:22:56.175 } 00:22:56.175 }, 00:22:56.175 "base_bdevs_list": [ 00:22:56.175 { 00:22:56.175 "name": "spare", 00:22:56.175 "uuid": "4ad7e5ff-6169-5b4e-8bb1-c6365a0c0912", 00:22:56.175 "is_configured": true, 00:22:56.175 "data_offset": 2048, 00:22:56.175 "data_size": 63488 
00:22:56.175 }, 00:22:56.175 { 00:22:56.175 "name": "BaseBdev2", 00:22:56.175 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:22:56.175 "is_configured": true, 00:22:56.175 "data_offset": 2048, 00:22:56.175 "data_size": 63488 00:22:56.175 } 00:22:56.175 ] 00:22:56.175 }' 00:22:56.175 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:56.175 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:22:56.175 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:56.175 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:22:56.175 16:02:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:22:56.434 [2024-06-10 16:02:01.812752] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:22:56.434 [2024-06-10 16:02:01.921754] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:22:56.434 [2024-06-10 16:02:01.924109] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:57.369 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:22:57.370 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:22:57.370 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:22:57.370 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:22:57.370 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:22:57.370 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:57.370 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.370 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:57.370 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:57.370 "name": "raid_bdev1", 00:22:57.370 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:22:57.370 "strip_size_kb": 0, 00:22:57.370 "state": "online", 00:22:57.370 "raid_level": "raid1", 00:22:57.370 "superblock": true, 00:22:57.370 "num_base_bdevs": 2, 00:22:57.370 "num_base_bdevs_discovered": 2, 00:22:57.370 "num_base_bdevs_operational": 2, 00:22:57.370 "base_bdevs_list": [ 00:22:57.370 { 00:22:57.370 "name": "spare", 00:22:57.370 "uuid": "4ad7e5ff-6169-5b4e-8bb1-c6365a0c0912", 00:22:57.370 "is_configured": true, 00:22:57.370 "data_offset": 2048, 00:22:57.370 "data_size": 63488 00:22:57.370 }, 00:22:57.370 { 00:22:57.370 "name": "BaseBdev2", 00:22:57.370 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:22:57.370 "is_configured": true, 00:22:57.370 "data_offset": 2048, 00:22:57.370 "data_size": 63488 00:22:57.370 } 00:22:57.370 ] 00:22:57.370 }' 00:22:57.370 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:57.628 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:22:57.628 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:22:57.628 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:22:57.628 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:22:57.628 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:22:57.628 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:22:57.628 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:22:57.628 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:22:57.628 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:22:57.628 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.628 16:02:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:57.886 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:22:57.886 "name": "raid_bdev1", 00:22:57.886 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:22:57.886 "strip_size_kb": 0, 00:22:57.886 "state": "online", 00:22:57.886 "raid_level": "raid1", 00:22:57.886 "superblock": true, 00:22:57.886 "num_base_bdevs": 2, 00:22:57.886 "num_base_bdevs_discovered": 2, 00:22:57.886 "num_base_bdevs_operational": 2, 00:22:57.886 "base_bdevs_list": [ 00:22:57.886 { 00:22:57.886 "name": "spare", 00:22:57.886 "uuid": "4ad7e5ff-6169-5b4e-8bb1-c6365a0c0912", 00:22:57.886 "is_configured": true, 00:22:57.886 "data_offset": 2048, 00:22:57.886 "data_size": 63488 00:22:57.886 }, 00:22:57.886 { 00:22:57.886 "name": "BaseBdev2", 00:22:57.886 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:22:57.886 "is_configured": true, 00:22:57.886 "data_offset": 2048, 00:22:57.886 "data_size": 63488 00:22:57.886 } 00:22:57.886 ] 00:22:57.886 }' 00:22:57.886 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:22:57.886 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:22:57.886 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
00:22:57.886 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:22:57.886 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:22:57.886 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:22:57.886 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:22:57.886 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:22:57.886 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:22:57.886 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:22:57.886 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:22:57.886 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:22:57.886 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:22:57.886 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:22:57.886 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:57.886 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:22:58.145 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:22:58.145 "name": "raid_bdev1", 00:22:58.145 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:22:58.145 "strip_size_kb": 0, 00:22:58.145 "state": "online", 00:22:58.145 "raid_level": "raid1", 00:22:58.145 "superblock": true, 00:22:58.145 "num_base_bdevs": 2, 00:22:58.145 "num_base_bdevs_discovered": 2, 00:22:58.145 
"num_base_bdevs_operational": 2, 00:22:58.145 "base_bdevs_list": [ 00:22:58.145 { 00:22:58.145 "name": "spare", 00:22:58.145 "uuid": "4ad7e5ff-6169-5b4e-8bb1-c6365a0c0912", 00:22:58.145 "is_configured": true, 00:22:58.145 "data_offset": 2048, 00:22:58.145 "data_size": 63488 00:22:58.145 }, 00:22:58.145 { 00:22:58.145 "name": "BaseBdev2", 00:22:58.145 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:22:58.145 "is_configured": true, 00:22:58.145 "data_offset": 2048, 00:22:58.145 "data_size": 63488 00:22:58.145 } 00:22:58.145 ] 00:22:58.145 }' 00:22:58.145 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:22:58.145 16:02:03 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:22:58.713 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:22:58.973 [2024-06-10 16:02:04.427333] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:22:58.973 [2024-06-10 16:02:04.427365] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:22:58.973 00:22:58.973 Latency(us) 00:22:58.973 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:22:58.973 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:22:58.973 raid_bdev1 : 11.97 90.19 270.58 0.00 0.00 14759.78 294.52 120835.90 00:22:58.973 =================================================================================================================== 00:22:58.973 Total : 90.19 270.58 0.00 0.00 14759.78 294.52 120835.90 00:22:58.973 [2024-06-10 16:02:04.467603] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:22:58.973 [2024-06-10 16:02:04.467629] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:22:58.973 [2024-06-10 16:02:04.467705] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:22:58.973 [2024-06-10 16:02:04.467715] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18eada0 name raid_bdev1, state offline 00:22:58.973 0 00:22:59.232 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:22:59.232 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:22:59.491 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:22:59.491 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:22:59.491 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:22:59.491 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:22:59.491 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:59.491 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:22:59.491 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:59.491 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:22:59.491 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:59.491 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:22:59.491 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:59.491 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:59.491 16:02:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:22:59.491 /dev/nbd0 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:59.751 1+0 records in 00:22:59.751 1+0 records out 00:22:59.751 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000217424 s, 18.8 MB/s 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev2 ']' 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev2 /dev/nbd1 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev2') 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev2 /dev/nbd1 00:22:59.751 /dev/nbd1 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd 
nbd1 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:22:59.751 1+0 records in 00:22:59.751 1+0 records out 00:22:59.751 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00022747 s, 18.0 MB/s 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:22:59.751 16:02:05 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:22:59.751 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:23:00.011 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:23:00.011 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:00.011 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:23:00.011 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:00.011 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:23:00.011 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:00.011 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:00.270 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:00.270 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:00.270 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:00.270 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:00.270 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:00.270 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:00.270 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:23:00.270 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:00.270 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks 
/var/tmp/spdk-raid.sock /dev/nbd0 00:23:00.270 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:00.270 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:00.270 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:00.270 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:23:00.270 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:00.270 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:00.529 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:00.529 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:00.529 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:00.529 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:00.529 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:00.529 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:00.529 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:23:00.529 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:23:00.529 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:23:00.529 16:02:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:00.788 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:00.788 [2024-06-10 16:02:06.279333] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:00.788 [2024-06-10 16:02:06.279380] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:00.788 [2024-06-10 16:02:06.279399] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18e8700 00:23:00.788 [2024-06-10 16:02:06.279408] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:00.788 [2024-06-10 16:02:06.281151] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:00.788 [2024-06-10 16:02:06.281180] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:00.788 [2024-06-10 16:02:06.281261] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:00.788 [2024-06-10 16:02:06.281287] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:00.788 [2024-06-10 16:02:06.281395] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:00.788 spare 00:23:01.047 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:23:01.047 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:01.047 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:01.047 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:01.047 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:01.047 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:23:01.047 16:02:06 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:01.047 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:01.047 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:01.047 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:01.047 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.047 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:01.047 [2024-06-10 16:02:06.381715] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x18ec380 00:23:01.047 [2024-06-10 16:02:06.381728] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:01.047 [2024-06-10 16:02:06.381924] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x173c1a0 00:23:01.047 [2024-06-10 16:02:06.382078] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x18ec380 00:23:01.047 [2024-06-10 16:02:06.382087] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x18ec380 00:23:01.047 [2024-06-10 16:02:06.382198] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:01.306 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:01.306 "name": "raid_bdev1", 00:23:01.306 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:23:01.306 "strip_size_kb": 0, 00:23:01.306 "state": "online", 00:23:01.306 "raid_level": "raid1", 00:23:01.306 "superblock": true, 00:23:01.306 "num_base_bdevs": 2, 00:23:01.306 "num_base_bdevs_discovered": 2, 00:23:01.306 "num_base_bdevs_operational": 2, 00:23:01.306 "base_bdevs_list": [ 00:23:01.306 { 
00:23:01.306 "name": "spare", 00:23:01.306 "uuid": "4ad7e5ff-6169-5b4e-8bb1-c6365a0c0912", 00:23:01.306 "is_configured": true, 00:23:01.306 "data_offset": 2048, 00:23:01.306 "data_size": 63488 00:23:01.306 }, 00:23:01.306 { 00:23:01.306 "name": "BaseBdev2", 00:23:01.306 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:23:01.306 "is_configured": true, 00:23:01.306 "data_offset": 2048, 00:23:01.306 "data_size": 63488 00:23:01.306 } 00:23:01.306 ] 00:23:01.306 }' 00:23:01.306 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:01.306 16:02:06 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:01.873 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:01.873 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:01.873 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:01.873 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:01.873 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:01.873 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:01.873 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:02.132 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:02.132 "name": "raid_bdev1", 00:23:02.132 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:23:02.132 "strip_size_kb": 0, 00:23:02.132 "state": "online", 00:23:02.132 "raid_level": "raid1", 00:23:02.132 "superblock": true, 00:23:02.132 "num_base_bdevs": 2, 00:23:02.132 "num_base_bdevs_discovered": 2, 00:23:02.132 
"num_base_bdevs_operational": 2, 00:23:02.132 "base_bdevs_list": [ 00:23:02.132 { 00:23:02.132 "name": "spare", 00:23:02.132 "uuid": "4ad7e5ff-6169-5b4e-8bb1-c6365a0c0912", 00:23:02.132 "is_configured": true, 00:23:02.132 "data_offset": 2048, 00:23:02.132 "data_size": 63488 00:23:02.132 }, 00:23:02.132 { 00:23:02.132 "name": "BaseBdev2", 00:23:02.132 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:23:02.132 "is_configured": true, 00:23:02.132 "data_offset": 2048, 00:23:02.132 "data_size": 63488 00:23:02.132 } 00:23:02.132 ] 00:23:02.132 }' 00:23:02.132 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:02.132 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:02.132 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:02.132 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:02.132 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.132 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:23:02.391 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:23:02.391 16:02:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:02.650 [2024-06-10 16:02:08.024395] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:02.650 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:02.650 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:23:02.650 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:02.650 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:02.650 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:02.650 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:02.650 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:02.650 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:02.650 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:02.650 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:02.650 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:02.650 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:02.909 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:02.909 "name": "raid_bdev1", 00:23:02.909 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:23:02.909 "strip_size_kb": 0, 00:23:02.909 "state": "online", 00:23:02.909 "raid_level": "raid1", 00:23:02.909 "superblock": true, 00:23:02.909 "num_base_bdevs": 2, 00:23:02.909 "num_base_bdevs_discovered": 1, 00:23:02.909 "num_base_bdevs_operational": 1, 00:23:02.909 "base_bdevs_list": [ 00:23:02.909 { 00:23:02.909 "name": null, 00:23:02.909 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:02.909 "is_configured": false, 00:23:02.909 "data_offset": 2048, 00:23:02.909 "data_size": 63488 00:23:02.909 }, 00:23:02.909 { 00:23:02.909 "name": "BaseBdev2", 
00:23:02.909 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:23:02.909 "is_configured": true, 00:23:02.909 "data_offset": 2048, 00:23:02.909 "data_size": 63488 00:23:02.909 } 00:23:02.909 ] 00:23:02.909 }' 00:23:02.909 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:02.909 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:03.477 16:02:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:03.736 [2024-06-10 16:02:09.151625] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:03.736 [2024-06-10 16:02:09.151781] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:03.736 [2024-06-10 16:02:09.151795] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:23:03.736 [2024-06-10 16:02:09.151821] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:03.736 [2024-06-10 16:02:09.156889] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18ee920 00:23:03.736 [2024-06-10 16:02:09.159038] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:03.736 16:02:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:23:04.672 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:04.672 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:04.672 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:04.930 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:04.930 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:04.930 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:04.931 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.190 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:05.190 "name": "raid_bdev1", 00:23:05.190 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:23:05.190 "strip_size_kb": 0, 00:23:05.190 "state": "online", 00:23:05.190 "raid_level": "raid1", 00:23:05.190 "superblock": true, 00:23:05.190 "num_base_bdevs": 2, 00:23:05.190 "num_base_bdevs_discovered": 2, 00:23:05.190 "num_base_bdevs_operational": 2, 00:23:05.190 "process": { 00:23:05.190 "type": "rebuild", 00:23:05.190 "target": "spare", 00:23:05.190 "progress": { 00:23:05.190 "blocks": 24576, 
00:23:05.190 "percent": 38 00:23:05.190 } 00:23:05.190 }, 00:23:05.190 "base_bdevs_list": [ 00:23:05.190 { 00:23:05.190 "name": "spare", 00:23:05.190 "uuid": "4ad7e5ff-6169-5b4e-8bb1-c6365a0c0912", 00:23:05.190 "is_configured": true, 00:23:05.190 "data_offset": 2048, 00:23:05.190 "data_size": 63488 00:23:05.190 }, 00:23:05.190 { 00:23:05.190 "name": "BaseBdev2", 00:23:05.190 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:23:05.190 "is_configured": true, 00:23:05.190 "data_offset": 2048, 00:23:05.190 "data_size": 63488 00:23:05.190 } 00:23:05.190 ] 00:23:05.190 }' 00:23:05.190 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:05.190 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:05.190 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:05.190 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:05.190 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:05.449 [2024-06-10 16:02:10.770162] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:05.449 [2024-06-10 16:02:10.771430] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:05.449 [2024-06-10 16:02:10.771475] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:05.449 [2024-06-10 16:02:10.771490] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:05.449 [2024-06-10 16:02:10.771497] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:05.449 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 
online raid1 0 1 00:23:05.449 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:05.449 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:05.449 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:05.449 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:05.449 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:05.449 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:05.449 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:05.449 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:05.449 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:05.449 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:05.449 16:02:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:05.708 16:02:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:05.708 "name": "raid_bdev1", 00:23:05.708 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:23:05.708 "strip_size_kb": 0, 00:23:05.708 "state": "online", 00:23:05.708 "raid_level": "raid1", 00:23:05.708 "superblock": true, 00:23:05.708 "num_base_bdevs": 2, 00:23:05.708 "num_base_bdevs_discovered": 1, 00:23:05.708 "num_base_bdevs_operational": 1, 00:23:05.708 "base_bdevs_list": [ 00:23:05.708 { 00:23:05.708 "name": null, 00:23:05.708 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:05.708 "is_configured": false, 00:23:05.708 
"data_offset": 2048, 00:23:05.708 "data_size": 63488 00:23:05.708 }, 00:23:05.708 { 00:23:05.708 "name": "BaseBdev2", 00:23:05.708 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:23:05.708 "is_configured": true, 00:23:05.708 "data_offset": 2048, 00:23:05.708 "data_size": 63488 00:23:05.708 } 00:23:05.708 ] 00:23:05.708 }' 00:23:05.708 16:02:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:05.708 16:02:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:06.276 16:02:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:06.535 [2024-06-10 16:02:11.835005] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:06.535 [2024-06-10 16:02:11.835055] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:06.535 [2024-06-10 16:02:11.835078] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x18ec700 00:23:06.535 [2024-06-10 16:02:11.835088] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:06.535 [2024-06-10 16:02:11.835479] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:06.536 [2024-06-10 16:02:11.835496] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:06.536 [2024-06-10 16:02:11.835578] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:23:06.536 [2024-06-10 16:02:11.835589] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:23:06.536 [2024-06-10 16:02:11.835597] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:23:06.536 [2024-06-10 16:02:11.835618] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:06.536 [2024-06-10 16:02:11.840741] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x18ead60 00:23:06.536 spare 00:23:06.536 [2024-06-10 16:02:11.842261] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:06.536 16:02:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:23:07.487 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:07.487 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:07.487 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:07.487 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:07.487 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:07.487 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:07.487 16:02:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:07.746 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:07.746 "name": "raid_bdev1", 00:23:07.746 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:23:07.746 "strip_size_kb": 0, 00:23:07.746 "state": "online", 00:23:07.746 "raid_level": "raid1", 00:23:07.746 "superblock": true, 00:23:07.746 "num_base_bdevs": 2, 00:23:07.746 "num_base_bdevs_discovered": 2, 00:23:07.746 "num_base_bdevs_operational": 2, 00:23:07.746 "process": { 00:23:07.746 "type": "rebuild", 00:23:07.746 "target": "spare", 00:23:07.746 "progress": { 00:23:07.746 
"blocks": 24576, 00:23:07.746 "percent": 38 00:23:07.746 } 00:23:07.746 }, 00:23:07.746 "base_bdevs_list": [ 00:23:07.746 { 00:23:07.746 "name": "spare", 00:23:07.746 "uuid": "4ad7e5ff-6169-5b4e-8bb1-c6365a0c0912", 00:23:07.746 "is_configured": true, 00:23:07.746 "data_offset": 2048, 00:23:07.746 "data_size": 63488 00:23:07.746 }, 00:23:07.746 { 00:23:07.746 "name": "BaseBdev2", 00:23:07.746 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:23:07.746 "is_configured": true, 00:23:07.746 "data_offset": 2048, 00:23:07.746 "data_size": 63488 00:23:07.746 } 00:23:07.746 ] 00:23:07.746 }' 00:23:07.746 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:07.746 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:07.746 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:07.746 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:07.746 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:23:08.005 [2024-06-10 16:02:13.454235] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:08.005 [2024-06-10 16:02:13.454607] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:08.005 [2024-06-10 16:02:13.454649] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:08.005 [2024-06-10 16:02:13.454663] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:08.005 [2024-06-10 16:02:13.454670] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:08.005 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:23:08.005 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:08.005 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:08.005 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:08.005 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:08.005 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:08.005 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:08.005 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:08.005 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:08.005 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:08.005 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:08.005 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:08.264 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:08.264 "name": "raid_bdev1", 00:23:08.264 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:23:08.264 "strip_size_kb": 0, 00:23:08.264 "state": "online", 00:23:08.264 "raid_level": "raid1", 00:23:08.264 "superblock": true, 00:23:08.264 "num_base_bdevs": 2, 00:23:08.264 "num_base_bdevs_discovered": 1, 00:23:08.264 "num_base_bdevs_operational": 1, 00:23:08.264 "base_bdevs_list": [ 00:23:08.264 { 00:23:08.264 "name": null, 00:23:08.264 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:08.264 "is_configured": false, 00:23:08.264 
"data_offset": 2048, 00:23:08.264 "data_size": 63488 00:23:08.264 }, 00:23:08.264 { 00:23:08.264 "name": "BaseBdev2", 00:23:08.264 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:23:08.264 "is_configured": true, 00:23:08.264 "data_offset": 2048, 00:23:08.264 "data_size": 63488 00:23:08.264 } 00:23:08.264 ] 00:23:08.264 }' 00:23:08.264 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:08.264 16:02:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:08.832 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:08.832 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:08.832 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:08.832 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:08.832 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:08.832 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:08.832 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:09.091 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:09.091 "name": "raid_bdev1", 00:23:09.091 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:23:09.091 "strip_size_kb": 0, 00:23:09.091 "state": "online", 00:23:09.091 "raid_level": "raid1", 00:23:09.091 "superblock": true, 00:23:09.091 "num_base_bdevs": 2, 00:23:09.091 "num_base_bdevs_discovered": 1, 00:23:09.091 "num_base_bdevs_operational": 1, 00:23:09.091 "base_bdevs_list": [ 00:23:09.091 { 00:23:09.091 "name": null, 00:23:09.091 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:23:09.091 "is_configured": false, 00:23:09.091 "data_offset": 2048, 00:23:09.091 "data_size": 63488 00:23:09.091 }, 00:23:09.091 { 00:23:09.091 "name": "BaseBdev2", 00:23:09.091 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:23:09.091 "is_configured": true, 00:23:09.091 "data_offset": 2048, 00:23:09.091 "data_size": 63488 00:23:09.091 } 00:23:09.091 ] 00:23:09.091 }' 00:23:09.091 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:09.350 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:09.350 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:09.350 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:09.350 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:23:09.639 16:02:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:09.898 [2024-06-10 16:02:15.152038] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:09.898 [2024-06-10 16:02:15.152082] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:09.898 [2024-06-10 16:02:15.152102] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x17500c0 00:23:09.898 [2024-06-10 16:02:15.152117] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:09.898 [2024-06-10 16:02:15.152469] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:09.898 [2024-06-10 16:02:15.152486] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:09.898 [2024-06-10 16:02:15.152551] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:23:09.898 [2024-06-10 16:02:15.152561] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:09.898 [2024-06-10 16:02:15.152569] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:09.898 BaseBdev1 00:23:09.898 16:02:15 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:23:10.841 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:10.841 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:10.841 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:10.841 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:10.841 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:10.841 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:10.841 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:10.841 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:10.841 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:10.841 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:10.841 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:10.841 16:02:16 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:11.100 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:11.100 "name": "raid_bdev1", 00:23:11.100 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:23:11.100 "strip_size_kb": 0, 00:23:11.100 "state": "online", 00:23:11.100 "raid_level": "raid1", 00:23:11.100 "superblock": true, 00:23:11.100 "num_base_bdevs": 2, 00:23:11.100 "num_base_bdevs_discovered": 1, 00:23:11.100 "num_base_bdevs_operational": 1, 00:23:11.100 "base_bdevs_list": [ 00:23:11.100 { 00:23:11.100 "name": null, 00:23:11.100 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:11.100 "is_configured": false, 00:23:11.100 "data_offset": 2048, 00:23:11.100 "data_size": 63488 00:23:11.100 }, 00:23:11.100 { 00:23:11.100 "name": "BaseBdev2", 00:23:11.100 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:23:11.100 "is_configured": true, 00:23:11.100 "data_offset": 2048, 00:23:11.100 "data_size": 63488 00:23:11.100 } 00:23:11.100 ] 00:23:11.100 }' 00:23:11.100 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:11.100 16:02:16 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:11.667 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:11.667 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:11.667 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:11.667 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:11.667 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:11.667 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:11.667 
16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:11.926 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:11.926 "name": "raid_bdev1", 00:23:11.926 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:23:11.926 "strip_size_kb": 0, 00:23:11.926 "state": "online", 00:23:11.926 "raid_level": "raid1", 00:23:11.926 "superblock": true, 00:23:11.926 "num_base_bdevs": 2, 00:23:11.926 "num_base_bdevs_discovered": 1, 00:23:11.926 "num_base_bdevs_operational": 1, 00:23:11.926 "base_bdevs_list": [ 00:23:11.926 { 00:23:11.926 "name": null, 00:23:11.926 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:11.926 "is_configured": false, 00:23:11.926 "data_offset": 2048, 00:23:11.926 "data_size": 63488 00:23:11.926 }, 00:23:11.926 { 00:23:11.926 "name": "BaseBdev2", 00:23:11.926 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:23:11.926 "is_configured": true, 00:23:11.926 "data_offset": 2048, 00:23:11.926 "data_size": 63488 00:23:11.926 } 00:23:11.926 ] 00:23:11.926 }' 00:23:11.926 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:11.926 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:11.926 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:11.926 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:11.926 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:11.926 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@649 -- # local es=0 00:23:11.926 16:02:17 
bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:11.926 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:11.926 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:11.926 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:11.926 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:11.926 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:11.926 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:23:11.926 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:23:11.926 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:23:11.926 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:23:12.185 [2024-06-10 16:02:17.538775] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:12.185 [2024-06-10 16:02:17.538906] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:23:12.185 [2024-06-10 16:02:17.538920] 
bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:23:12.185 request: 00:23:12.185 { 00:23:12.185 "raid_bdev": "raid_bdev1", 00:23:12.185 "base_bdev": "BaseBdev1", 00:23:12.185 "method": "bdev_raid_add_base_bdev", 00:23:12.185 "req_id": 1 00:23:12.185 } 00:23:12.185 Got JSON-RPC error response 00:23:12.185 response: 00:23:12.185 { 00:23:12.185 "code": -22, 00:23:12.185 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:23:12.185 } 00:23:12.185 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # es=1 00:23:12.185 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:23:12.185 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:23:12.185 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:23:12.185 16:02:17 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:23:13.121 16:02:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:23:13.121 16:02:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:13.121 16:02:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:13.121 16:02:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:13.121 16:02:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:13.121 16:02:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:23:13.121 16:02:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:13.121 16:02:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:13.121 16:02:18 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:13.121 16:02:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:13.121 16:02:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.121 16:02:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:13.380 16:02:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:13.380 "name": "raid_bdev1", 00:23:13.380 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:23:13.380 "strip_size_kb": 0, 00:23:13.380 "state": "online", 00:23:13.380 "raid_level": "raid1", 00:23:13.380 "superblock": true, 00:23:13.380 "num_base_bdevs": 2, 00:23:13.380 "num_base_bdevs_discovered": 1, 00:23:13.380 "num_base_bdevs_operational": 1, 00:23:13.380 "base_bdevs_list": [ 00:23:13.380 { 00:23:13.380 "name": null, 00:23:13.380 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:13.380 "is_configured": false, 00:23:13.380 "data_offset": 2048, 00:23:13.380 "data_size": 63488 00:23:13.380 }, 00:23:13.380 { 00:23:13.380 "name": "BaseBdev2", 00:23:13.380 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:23:13.380 "is_configured": true, 00:23:13.380 "data_offset": 2048, 00:23:13.380 "data_size": 63488 00:23:13.380 } 00:23:13.380 ] 00:23:13.380 }' 00:23:13.380 16:02:18 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:13.380 16:02:18 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:13.947 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:13.947 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:13.947 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # 
local process_type=none 00:23:13.947 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:13.947 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:13.947 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:13.947 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:14.206 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:14.206 "name": "raid_bdev1", 00:23:14.206 "uuid": "07f2ee10-c99e-437f-bd14-c6b748c14815", 00:23:14.206 "strip_size_kb": 0, 00:23:14.206 "state": "online", 00:23:14.206 "raid_level": "raid1", 00:23:14.206 "superblock": true, 00:23:14.206 "num_base_bdevs": 2, 00:23:14.206 "num_base_bdevs_discovered": 1, 00:23:14.206 "num_base_bdevs_operational": 1, 00:23:14.206 "base_bdevs_list": [ 00:23:14.206 { 00:23:14.206 "name": null, 00:23:14.206 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:14.206 "is_configured": false, 00:23:14.206 "data_offset": 2048, 00:23:14.206 "data_size": 63488 00:23:14.206 }, 00:23:14.206 { 00:23:14.206 "name": "BaseBdev2", 00:23:14.206 "uuid": "3b238e13-cf4b-5007-8675-548a9c3c4e4a", 00:23:14.207 "is_configured": true, 00:23:14.207 "data_offset": 2048, 00:23:14.207 "data_size": 63488 00:23:14.207 } 00:23:14.207 ] 00:23:14.207 }' 00:23:14.207 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:14.207 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:14.207 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:14.207 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 
00:23:14.207 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 2776154 00:23:14.207 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@949 -- # '[' -z 2776154 ']' 00:23:14.207 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # kill -0 2776154 00:23:14.207 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # uname 00:23:14.207 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:23:14.207 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2776154 00:23:14.207 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:23:14.207 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:23:14.207 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2776154' 00:23:14.207 killing process with pid 2776154 00:23:14.207 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # kill 2776154 00:23:14.207 Received shutdown signal, test time was about 27.185310 seconds 00:23:14.207 00:23:14.207 Latency(us) 00:23:14.207 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:14.207 =================================================================================================================== 00:23:14.207 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:23:14.207 [2024-06-10 16:02:19.711518] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:14.207 [2024-06-10 16:02:19.711619] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:14.207 [2024-06-10 16:02:19.711664] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:14.207 [2024-06-10 16:02:19.711674] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x18ec380 name raid_bdev1, state offline 00:23:14.207 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@973 -- # wait 2776154 00:23:14.466 [2024-06-10 16:02:19.731790] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:14.466 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:23:14.466 00:23:14.466 real 0m32.039s 00:23:14.466 user 0m51.284s 00:23:14.466 sys 0m3.681s 00:23:14.466 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:14.466 16:02:19 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:23:14.466 ************************************ 00:23:14.466 END TEST raid_rebuild_test_sb_io 00:23:14.466 ************************************ 00:23:14.726 16:02:19 bdev_raid -- bdev/bdev_raid.sh@876 -- # for n in 2 4 00:23:14.726 16:02:19 bdev_raid -- bdev/bdev_raid.sh@877 -- # run_test raid_rebuild_test raid_rebuild_test raid1 4 false false true 00:23:14.726 16:02:19 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:23:14.726 16:02:19 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:14.726 16:02:19 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:14.726 ************************************ 00:23:14.726 START TEST raid_rebuild_test 00:23:14.726 ************************************ 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 false false true 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@571 -- # local 
background_io=false 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@576 -- # local create_arg 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@596 -- # raid_pid=2781831 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@597 -- # waitforlisten 2781831 /var/tmp/spdk-raid.sock 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@830 -- # '[' -z 2781831 ']' 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@835 -- # local max_retries=100 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:23:14.726 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@839 -- # xtrace_disable 00:23:14.726 16:02:20 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:14.726 [2024-06-10 16:02:20.076784] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:23:14.726 [2024-06-10 16:02:20.076845] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2781831 ] 00:23:14.726 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:14.726 Zero copy mechanism will not be used. 00:23:14.726 [2024-06-10 16:02:20.176161] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:14.985 [2024-06-10 16:02:20.271805] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:23:14.986 [2024-06-10 16:02:20.333161] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:14.986 [2024-06-10 16:02:20.333201] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:15.554 16:02:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:23:15.554 16:02:21 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@863 -- # return 0 00:23:15.555 16:02:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:15.555 16:02:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:15.814 BaseBdev1_malloc 00:23:15.814 16:02:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:16.073 [2024-06-10 16:02:21.531735] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:16.073 [2024-06-10 16:02:21.531781] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:16.073 [2024-06-10 16:02:21.531802] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27a6e90 
00:23:16.073 [2024-06-10 16:02:21.531812] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:16.073 [2024-06-10 16:02:21.533517] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:16.073 [2024-06-10 16:02:21.533545] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:16.073 BaseBdev1 00:23:16.073 16:02:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:16.073 16:02:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:16.332 BaseBdev2_malloc 00:23:16.332 16:02:21 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:16.591 [2024-06-10 16:02:22.054232] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:16.591 [2024-06-10 16:02:22.054279] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:16.591 [2024-06-10 16:02:22.054299] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27a79e0 00:23:16.591 [2024-06-10 16:02:22.054309] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:16.591 [2024-06-10 16:02:22.055803] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:16.591 [2024-06-10 16:02:22.055828] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:16.591 BaseBdev2 00:23:16.591 16:02:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:16.591 16:02:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:16.849 BaseBdev3_malloc 00:23:16.849 16:02:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:23:17.107 [2024-06-10 16:02:22.572046] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:23:17.107 [2024-06-10 16:02:22.572085] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:17.107 [2024-06-10 16:02:22.572102] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2953e70 00:23:17.107 [2024-06-10 16:02:22.572111] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:17.107 [2024-06-10 16:02:22.573572] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:17.107 [2024-06-10 16:02:22.573599] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:17.107 BaseBdev3 00:23:17.107 16:02:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:17.107 16:02:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:17.366 BaseBdev4_malloc 00:23:17.366 16:02:22 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:23:17.626 [2024-06-10 16:02:23.093817] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:23:17.626 [2024-06-10 16:02:23.093856] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:17.626 [2024-06-10 16:02:23.093878] vbdev_passthru.c: 680:vbdev_passthru_register: 
*NOTICE*: io_device created at: 0x0x2952700 00:23:17.626 [2024-06-10 16:02:23.093888] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:17.626 [2024-06-10 16:02:23.095331] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:17.626 [2024-06-10 16:02:23.095356] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:17.626 BaseBdev4 00:23:17.626 16:02:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:17.885 spare_malloc 00:23:17.885 16:02:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:18.144 spare_delay 00:23:18.144 16:02:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:18.404 [2024-06-10 16:02:23.852278] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:18.404 [2024-06-10 16:02:23.852324] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:18.404 [2024-06-10 16:02:23.852343] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2958710 00:23:18.404 [2024-06-10 16:02:23.852353] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:18.404 [2024-06-10 16:02:23.853877] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:18.404 [2024-06-10 16:02:23.853904] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:18.404 spare 00:23:18.404 16:02:23 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@611 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:23:18.663 [2024-06-10 16:02:24.036788] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:18.663 [2024-06-10 16:02:24.038050] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:18.663 [2024-06-10 16:02:24.038106] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:18.663 [2024-06-10 16:02:24.038153] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:18.663 [2024-06-10 16:02:24.038230] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x28d8260 00:23:18.663 [2024-06-10 16:02:24.038238] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:23:18.663 [2024-06-10 16:02:24.038442] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28db610 00:23:18.663 [2024-06-10 16:02:24.038591] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x28d8260 00:23:18.663 [2024-06-10 16:02:24.038599] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x28d8260 00:23:18.663 [2024-06-10 16:02:24.038709] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:18.663 16:02:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:18.663 16:02:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:18.663 16:02:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:18.663 16:02:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:18.663 16:02:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:18.663 
16:02:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:18.663 16:02:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:18.663 16:02:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:18.663 16:02:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:18.663 16:02:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:18.663 16:02:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:18.663 16:02:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:18.923 16:02:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:18.923 "name": "raid_bdev1", 00:23:18.923 "uuid": "becf0ce7-5254-4bd9-a2dc-5249a6442582", 00:23:18.923 "strip_size_kb": 0, 00:23:18.923 "state": "online", 00:23:18.923 "raid_level": "raid1", 00:23:18.923 "superblock": false, 00:23:18.923 "num_base_bdevs": 4, 00:23:18.923 "num_base_bdevs_discovered": 4, 00:23:18.923 "num_base_bdevs_operational": 4, 00:23:18.923 "base_bdevs_list": [ 00:23:18.923 { 00:23:18.923 "name": "BaseBdev1", 00:23:18.923 "uuid": "cc2192d0-b265-54b2-b4b8-b0d041a0281a", 00:23:18.923 "is_configured": true, 00:23:18.923 "data_offset": 0, 00:23:18.923 "data_size": 65536 00:23:18.923 }, 00:23:18.923 { 00:23:18.923 "name": "BaseBdev2", 00:23:18.923 "uuid": "df1b1861-0c5c-545d-be6d-4b2b687f4f89", 00:23:18.923 "is_configured": true, 00:23:18.923 "data_offset": 0, 00:23:18.923 "data_size": 65536 00:23:18.923 }, 00:23:18.923 { 00:23:18.923 "name": "BaseBdev3", 00:23:18.923 "uuid": "1da1360f-8d74-5106-bb6d-54808104039f", 00:23:18.923 "is_configured": true, 00:23:18.923 "data_offset": 0, 00:23:18.923 "data_size": 65536 00:23:18.923 }, 00:23:18.923 
{ 00:23:18.923 "name": "BaseBdev4", 00:23:18.923 "uuid": "a842f74f-5e4d-5b11-bb16-835697b9d139", 00:23:18.923 "is_configured": true, 00:23:18.923 "data_offset": 0, 00:23:18.923 "data_size": 65536 00:23:18.923 } 00:23:18.923 ] 00:23:18.923 }' 00:23:18.923 16:02:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:18.923 16:02:24 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:19.491 16:02:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:19.491 16:02:24 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:19.749 [2024-06-10 16:02:25.103927] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:19.749 16:02:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:23:19.749 16:02:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:19.749 16:02:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:20.007 16:02:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:23:20.007 16:02:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:20.007 16:02:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:23:20.007 16:02:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:23:20.007 16:02:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:20.007 16:02:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:20.007 16:02:25 bdev_raid.raid_rebuild_test -- 
bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:20.007 16:02:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:20.007 16:02:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:20.007 16:02:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:20.007 16:02:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:23:20.007 16:02:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:20.007 16:02:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:20.007 16:02:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:20.266 [2024-06-10 16:02:25.621070] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28d7e00 00:23:20.266 /dev/nbd0 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:23:20.266 
16:02:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:20.266 1+0 records in 00:23:20.266 1+0 records out 00:23:20.266 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000197632 s, 20.7 MB/s 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:23:20.266 16:02:25 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=65536 oflag=direct 00:23:28.381 65536+0 records in 00:23:28.381 65536+0 records out 00:23:28.381 33554432 bytes (34 MB, 32 MiB) copied, 7.05901 s, 4.8 MB/s 00:23:28.381 16:02:32 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:28.381 16:02:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:28.381 16:02:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:28.381 16:02:32 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:28.381 16:02:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:23:28.381 16:02:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:28.381 16:02:32 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:28.381 [2024-06-10 16:02:33.021456] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:28.381 [2024-06-10 16:02:33.262147] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:28.381 16:02:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:28.381 "name": "raid_bdev1", 00:23:28.381 "uuid": "becf0ce7-5254-4bd9-a2dc-5249a6442582", 00:23:28.381 "strip_size_kb": 0, 00:23:28.381 "state": "online", 00:23:28.381 "raid_level": "raid1", 00:23:28.381 "superblock": false, 00:23:28.381 "num_base_bdevs": 4, 00:23:28.381 "num_base_bdevs_discovered": 3, 00:23:28.381 "num_base_bdevs_operational": 3, 00:23:28.381 "base_bdevs_list": [ 00:23:28.381 { 00:23:28.381 "name": null, 00:23:28.381 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:28.381 "is_configured": false, 00:23:28.381 "data_offset": 0, 00:23:28.381 "data_size": 65536 00:23:28.381 }, 00:23:28.381 { 00:23:28.381 "name": "BaseBdev2", 00:23:28.381 "uuid": "df1b1861-0c5c-545d-be6d-4b2b687f4f89", 00:23:28.381 "is_configured": true, 00:23:28.381 "data_offset": 0, 00:23:28.382 "data_size": 65536 00:23:28.382 }, 
00:23:28.382 { 00:23:28.382 "name": "BaseBdev3", 00:23:28.382 "uuid": "1da1360f-8d74-5106-bb6d-54808104039f", 00:23:28.382 "is_configured": true, 00:23:28.382 "data_offset": 0, 00:23:28.382 "data_size": 65536 00:23:28.382 }, 00:23:28.382 { 00:23:28.382 "name": "BaseBdev4", 00:23:28.382 "uuid": "a842f74f-5e4d-5b11-bb16-835697b9d139", 00:23:28.382 "is_configured": true, 00:23:28.382 "data_offset": 0, 00:23:28.382 "data_size": 65536 00:23:28.382 } 00:23:28.382 ] 00:23:28.382 }' 00:23:28.382 16:02:33 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:28.382 16:02:33 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:28.950 16:02:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:28.950 [2024-06-10 16:02:34.393186] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:28.950 [2024-06-10 16:02:34.397181] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x28db5e0 00:23:28.950 [2024-06-10 16:02:34.399346] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:28.950 16:02:34 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:29.955 16:02:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:29.955 16:02:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:29.955 16:02:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:29.955 16:02:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:29.955 16:02:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:29.955 16:02:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:29.955 16:02:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:30.214 16:02:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:30.214 "name": "raid_bdev1", 00:23:30.214 "uuid": "becf0ce7-5254-4bd9-a2dc-5249a6442582", 00:23:30.214 "strip_size_kb": 0, 00:23:30.214 "state": "online", 00:23:30.214 "raid_level": "raid1", 00:23:30.214 "superblock": false, 00:23:30.214 "num_base_bdevs": 4, 00:23:30.214 "num_base_bdevs_discovered": 4, 00:23:30.214 "num_base_bdevs_operational": 4, 00:23:30.214 "process": { 00:23:30.214 "type": "rebuild", 00:23:30.214 "target": "spare", 00:23:30.214 "progress": { 00:23:30.214 "blocks": 24576, 00:23:30.214 "percent": 37 00:23:30.214 } 00:23:30.214 }, 00:23:30.214 "base_bdevs_list": [ 00:23:30.214 { 00:23:30.214 "name": "spare", 00:23:30.214 "uuid": "d87af35a-2334-55c4-8416-99e64fa128b9", 00:23:30.214 "is_configured": true, 00:23:30.214 "data_offset": 0, 00:23:30.214 "data_size": 65536 00:23:30.214 }, 00:23:30.214 { 00:23:30.214 "name": "BaseBdev2", 00:23:30.214 "uuid": "df1b1861-0c5c-545d-be6d-4b2b687f4f89", 00:23:30.214 "is_configured": true, 00:23:30.214 "data_offset": 0, 00:23:30.214 "data_size": 65536 00:23:30.214 }, 00:23:30.214 { 00:23:30.214 "name": "BaseBdev3", 00:23:30.214 "uuid": "1da1360f-8d74-5106-bb6d-54808104039f", 00:23:30.214 "is_configured": true, 00:23:30.214 "data_offset": 0, 00:23:30.214 "data_size": 65536 00:23:30.214 }, 00:23:30.214 { 00:23:30.214 "name": "BaseBdev4", 00:23:30.214 "uuid": "a842f74f-5e4d-5b11-bb16-835697b9d139", 00:23:30.214 "is_configured": true, 00:23:30.214 "data_offset": 0, 00:23:30.214 "data_size": 65536 00:23:30.214 } 00:23:30.214 ] 00:23:30.214 }' 00:23:30.214 16:02:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:30.473 16:02:35 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:30.473 16:02:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:30.473 16:02:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:30.473 16:02:35 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:30.732 [2024-06-10 16:02:36.014214] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:30.732 [2024-06-10 16:02:36.112262] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:30.732 [2024-06-10 16:02:36.112305] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:30.732 [2024-06-10 16:02:36.112322] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:30.732 [2024-06-10 16:02:36.112328] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:23:30.732 16:02:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:30.732 16:02:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:30.732 16:02:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:30.732 16:02:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:30.732 16:02:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:30.732 16:02:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:30.732 16:02:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:30.732 16:02:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:23:30.732 16:02:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:30.732 16:02:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:30.732 16:02:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:30.732 16:02:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:30.991 16:02:36 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:30.991 "name": "raid_bdev1", 00:23:30.991 "uuid": "becf0ce7-5254-4bd9-a2dc-5249a6442582", 00:23:30.991 "strip_size_kb": 0, 00:23:30.991 "state": "online", 00:23:30.991 "raid_level": "raid1", 00:23:30.991 "superblock": false, 00:23:30.991 "num_base_bdevs": 4, 00:23:30.991 "num_base_bdevs_discovered": 3, 00:23:30.991 "num_base_bdevs_operational": 3, 00:23:30.991 "base_bdevs_list": [ 00:23:30.991 { 00:23:30.991 "name": null, 00:23:30.991 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:30.991 "is_configured": false, 00:23:30.991 "data_offset": 0, 00:23:30.991 "data_size": 65536 00:23:30.991 }, 00:23:30.991 { 00:23:30.991 "name": "BaseBdev2", 00:23:30.991 "uuid": "df1b1861-0c5c-545d-be6d-4b2b687f4f89", 00:23:30.991 "is_configured": true, 00:23:30.991 "data_offset": 0, 00:23:30.991 "data_size": 65536 00:23:30.991 }, 00:23:30.991 { 00:23:30.991 "name": "BaseBdev3", 00:23:30.991 "uuid": "1da1360f-8d74-5106-bb6d-54808104039f", 00:23:30.991 "is_configured": true, 00:23:30.991 "data_offset": 0, 00:23:30.991 "data_size": 65536 00:23:30.991 }, 00:23:30.991 { 00:23:30.991 "name": "BaseBdev4", 00:23:30.991 "uuid": "a842f74f-5e4d-5b11-bb16-835697b9d139", 00:23:30.991 "is_configured": true, 00:23:30.991 "data_offset": 0, 00:23:30.991 "data_size": 65536 00:23:30.991 } 00:23:30.991 ] 00:23:30.991 }' 00:23:30.991 16:02:36 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:30.991 16:02:36 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:31.558 16:02:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:31.558 16:02:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:31.559 16:02:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:31.559 16:02:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:31.559 16:02:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:31.559 16:02:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:31.559 16:02:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:31.818 16:02:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:31.818 "name": "raid_bdev1", 00:23:31.818 "uuid": "becf0ce7-5254-4bd9-a2dc-5249a6442582", 00:23:31.818 "strip_size_kb": 0, 00:23:31.818 "state": "online", 00:23:31.818 "raid_level": "raid1", 00:23:31.818 "superblock": false, 00:23:31.818 "num_base_bdevs": 4, 00:23:31.818 "num_base_bdevs_discovered": 3, 00:23:31.818 "num_base_bdevs_operational": 3, 00:23:31.818 "base_bdevs_list": [ 00:23:31.818 { 00:23:31.818 "name": null, 00:23:31.818 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:31.818 "is_configured": false, 00:23:31.818 "data_offset": 0, 00:23:31.818 "data_size": 65536 00:23:31.818 }, 00:23:31.818 { 00:23:31.818 "name": "BaseBdev2", 00:23:31.818 "uuid": "df1b1861-0c5c-545d-be6d-4b2b687f4f89", 00:23:31.818 "is_configured": true, 00:23:31.818 "data_offset": 0, 00:23:31.818 "data_size": 65536 00:23:31.818 }, 00:23:31.818 { 00:23:31.818 "name": "BaseBdev3", 00:23:31.818 "uuid": 
"1da1360f-8d74-5106-bb6d-54808104039f", 00:23:31.818 "is_configured": true, 00:23:31.818 "data_offset": 0, 00:23:31.818 "data_size": 65536 00:23:31.818 }, 00:23:31.818 { 00:23:31.818 "name": "BaseBdev4", 00:23:31.818 "uuid": "a842f74f-5e4d-5b11-bb16-835697b9d139", 00:23:31.818 "is_configured": true, 00:23:31.818 "data_offset": 0, 00:23:31.818 "data_size": 65536 00:23:31.818 } 00:23:31.818 ] 00:23:31.818 }' 00:23:31.818 16:02:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:32.077 16:02:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:32.077 16:02:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:32.077 16:02:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:32.077 16:02:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:32.336 [2024-06-10 16:02:37.620353] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:32.336 [2024-06-10 16:02:37.624267] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2957240 00:23:32.336 [2024-06-10 16:02:37.625810] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:32.336 16:02:37 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:33.272 16:02:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:33.272 16:02:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:33.272 16:02:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:33.272 16:02:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:33.272 16:02:38 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:33.272 16:02:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.272 16:02:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:33.531 16:02:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:33.531 "name": "raid_bdev1", 00:23:33.531 "uuid": "becf0ce7-5254-4bd9-a2dc-5249a6442582", 00:23:33.531 "strip_size_kb": 0, 00:23:33.531 "state": "online", 00:23:33.531 "raid_level": "raid1", 00:23:33.531 "superblock": false, 00:23:33.531 "num_base_bdevs": 4, 00:23:33.531 "num_base_bdevs_discovered": 4, 00:23:33.531 "num_base_bdevs_operational": 4, 00:23:33.531 "process": { 00:23:33.531 "type": "rebuild", 00:23:33.531 "target": "spare", 00:23:33.531 "progress": { 00:23:33.531 "blocks": 24576, 00:23:33.531 "percent": 37 00:23:33.531 } 00:23:33.531 }, 00:23:33.531 "base_bdevs_list": [ 00:23:33.531 { 00:23:33.531 "name": "spare", 00:23:33.531 "uuid": "d87af35a-2334-55c4-8416-99e64fa128b9", 00:23:33.531 "is_configured": true, 00:23:33.531 "data_offset": 0, 00:23:33.531 "data_size": 65536 00:23:33.531 }, 00:23:33.531 { 00:23:33.531 "name": "BaseBdev2", 00:23:33.531 "uuid": "df1b1861-0c5c-545d-be6d-4b2b687f4f89", 00:23:33.531 "is_configured": true, 00:23:33.531 "data_offset": 0, 00:23:33.531 "data_size": 65536 00:23:33.531 }, 00:23:33.531 { 00:23:33.531 "name": "BaseBdev3", 00:23:33.531 "uuid": "1da1360f-8d74-5106-bb6d-54808104039f", 00:23:33.531 "is_configured": true, 00:23:33.531 "data_offset": 0, 00:23:33.531 "data_size": 65536 00:23:33.531 }, 00:23:33.531 { 00:23:33.531 "name": "BaseBdev4", 00:23:33.531 "uuid": "a842f74f-5e4d-5b11-bb16-835697b9d139", 00:23:33.531 "is_configured": true, 00:23:33.531 "data_offset": 0, 00:23:33.531 "data_size": 65536 00:23:33.531 } 00:23:33.531 ] 
00:23:33.531 }' 00:23:33.531 16:02:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:33.531 16:02:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:33.531 16:02:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:33.531 16:02:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:33.531 16:02:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:23:33.531 16:02:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:23:33.531 16:02:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:33.531 16:02:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:23:33.531 16:02:38 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:23:33.790 [2024-06-10 16:02:39.235396] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:33.790 [2024-06-10 16:02:39.238043] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x2957240 00:23:33.790 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:23:33.790 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:23:33.790 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:33.790 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:33.790 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:33.790 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:33.790 16:02:39 
bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:33.790 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:33.790 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:34.050 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:34.050 "name": "raid_bdev1", 00:23:34.050 "uuid": "becf0ce7-5254-4bd9-a2dc-5249a6442582", 00:23:34.050 "strip_size_kb": 0, 00:23:34.050 "state": "online", 00:23:34.050 "raid_level": "raid1", 00:23:34.050 "superblock": false, 00:23:34.050 "num_base_bdevs": 4, 00:23:34.050 "num_base_bdevs_discovered": 3, 00:23:34.050 "num_base_bdevs_operational": 3, 00:23:34.050 "process": { 00:23:34.050 "type": "rebuild", 00:23:34.050 "target": "spare", 00:23:34.050 "progress": { 00:23:34.050 "blocks": 36864, 00:23:34.050 "percent": 56 00:23:34.050 } 00:23:34.050 }, 00:23:34.050 "base_bdevs_list": [ 00:23:34.050 { 00:23:34.050 "name": "spare", 00:23:34.050 "uuid": "d87af35a-2334-55c4-8416-99e64fa128b9", 00:23:34.050 "is_configured": true, 00:23:34.050 "data_offset": 0, 00:23:34.050 "data_size": 65536 00:23:34.050 }, 00:23:34.050 { 00:23:34.050 "name": null, 00:23:34.050 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:34.050 "is_configured": false, 00:23:34.050 "data_offset": 0, 00:23:34.050 "data_size": 65536 00:23:34.050 }, 00:23:34.050 { 00:23:34.051 "name": "BaseBdev3", 00:23:34.051 "uuid": "1da1360f-8d74-5106-bb6d-54808104039f", 00:23:34.051 "is_configured": true, 00:23:34.051 "data_offset": 0, 00:23:34.051 "data_size": 65536 00:23:34.051 }, 00:23:34.051 { 00:23:34.051 "name": "BaseBdev4", 00:23:34.051 "uuid": "a842f74f-5e4d-5b11-bb16-835697b9d139", 00:23:34.051 "is_configured": true, 00:23:34.051 "data_offset": 0, 00:23:34.051 "data_size": 65536 00:23:34.051 } 00:23:34.051 ] 
00:23:34.051 }' 00:23:34.051 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:34.051 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:34.051 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:34.310 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:34.310 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@705 -- # local timeout=881 00:23:34.310 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:34.310 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:34.310 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:34.310 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:34.310 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:34.310 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:34.310 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:34.310 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:34.310 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:34.310 "name": "raid_bdev1", 00:23:34.310 "uuid": "becf0ce7-5254-4bd9-a2dc-5249a6442582", 00:23:34.310 "strip_size_kb": 0, 00:23:34.310 "state": "online", 00:23:34.310 "raid_level": "raid1", 00:23:34.310 "superblock": false, 00:23:34.310 "num_base_bdevs": 4, 00:23:34.310 "num_base_bdevs_discovered": 3, 00:23:34.310 "num_base_bdevs_operational": 3, 00:23:34.310 
"process": { 00:23:34.310 "type": "rebuild", 00:23:34.310 "target": "spare", 00:23:34.310 "progress": { 00:23:34.310 "blocks": 43008, 00:23:34.310 "percent": 65 00:23:34.310 } 00:23:34.310 }, 00:23:34.310 "base_bdevs_list": [ 00:23:34.310 { 00:23:34.310 "name": "spare", 00:23:34.310 "uuid": "d87af35a-2334-55c4-8416-99e64fa128b9", 00:23:34.310 "is_configured": true, 00:23:34.310 "data_offset": 0, 00:23:34.310 "data_size": 65536 00:23:34.310 }, 00:23:34.310 { 00:23:34.310 "name": null, 00:23:34.310 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:34.310 "is_configured": false, 00:23:34.310 "data_offset": 0, 00:23:34.310 "data_size": 65536 00:23:34.310 }, 00:23:34.310 { 00:23:34.310 "name": "BaseBdev3", 00:23:34.310 "uuid": "1da1360f-8d74-5106-bb6d-54808104039f", 00:23:34.310 "is_configured": true, 00:23:34.310 "data_offset": 0, 00:23:34.310 "data_size": 65536 00:23:34.310 }, 00:23:34.310 { 00:23:34.310 "name": "BaseBdev4", 00:23:34.310 "uuid": "a842f74f-5e4d-5b11-bb16-835697b9d139", 00:23:34.310 "is_configured": true, 00:23:34.310 "data_offset": 0, 00:23:34.310 "data_size": 65536 00:23:34.310 } 00:23:34.310 ] 00:23:34.310 }' 00:23:34.310 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:34.569 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:34.569 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:34.569 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:34.569 16:02:39 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:35.506 [2024-06-10 16:02:40.850195] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:35.506 [2024-06-10 16:02:40.850253] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:35.506 [2024-06-10 16:02:40.850289] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:35.506 16:02:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:35.506 16:02:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:35.506 16:02:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:35.506 16:02:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:35.506 16:02:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:35.506 16:02:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:35.506 16:02:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.506 16:02:40 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:35.765 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:35.765 "name": "raid_bdev1", 00:23:35.765 "uuid": "becf0ce7-5254-4bd9-a2dc-5249a6442582", 00:23:35.765 "strip_size_kb": 0, 00:23:35.765 "state": "online", 00:23:35.765 "raid_level": "raid1", 00:23:35.765 "superblock": false, 00:23:35.765 "num_base_bdevs": 4, 00:23:35.765 "num_base_bdevs_discovered": 3, 00:23:35.765 "num_base_bdevs_operational": 3, 00:23:35.765 "base_bdevs_list": [ 00:23:35.765 { 00:23:35.765 "name": "spare", 00:23:35.765 "uuid": "d87af35a-2334-55c4-8416-99e64fa128b9", 00:23:35.765 "is_configured": true, 00:23:35.765 "data_offset": 0, 00:23:35.765 "data_size": 65536 00:23:35.765 }, 00:23:35.765 { 00:23:35.765 "name": null, 00:23:35.765 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:35.765 "is_configured": false, 00:23:35.765 "data_offset": 0, 00:23:35.765 "data_size": 65536 00:23:35.765 }, 00:23:35.765 { 00:23:35.765 
"name": "BaseBdev3", 00:23:35.765 "uuid": "1da1360f-8d74-5106-bb6d-54808104039f", 00:23:35.765 "is_configured": true, 00:23:35.765 "data_offset": 0, 00:23:35.765 "data_size": 65536 00:23:35.765 }, 00:23:35.765 { 00:23:35.765 "name": "BaseBdev4", 00:23:35.765 "uuid": "a842f74f-5e4d-5b11-bb16-835697b9d139", 00:23:35.765 "is_configured": true, 00:23:35.765 "data_offset": 0, 00:23:35.765 "data_size": 65536 00:23:35.765 } 00:23:35.765 ] 00:23:35.765 }' 00:23:35.765 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:35.765 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:23:35.765 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:35.765 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:23:35.765 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@708 -- # break 00:23:35.765 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:35.765 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:35.765 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:35.765 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:35.765 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:35.765 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:35.765 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.025 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:36.025 "name": "raid_bdev1", 00:23:36.025 "uuid": 
"becf0ce7-5254-4bd9-a2dc-5249a6442582", 00:23:36.025 "strip_size_kb": 0, 00:23:36.025 "state": "online", 00:23:36.025 "raid_level": "raid1", 00:23:36.025 "superblock": false, 00:23:36.025 "num_base_bdevs": 4, 00:23:36.025 "num_base_bdevs_discovered": 3, 00:23:36.025 "num_base_bdevs_operational": 3, 00:23:36.025 "base_bdevs_list": [ 00:23:36.025 { 00:23:36.025 "name": "spare", 00:23:36.025 "uuid": "d87af35a-2334-55c4-8416-99e64fa128b9", 00:23:36.025 "is_configured": true, 00:23:36.025 "data_offset": 0, 00:23:36.025 "data_size": 65536 00:23:36.025 }, 00:23:36.025 { 00:23:36.025 "name": null, 00:23:36.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:36.025 "is_configured": false, 00:23:36.025 "data_offset": 0, 00:23:36.025 "data_size": 65536 00:23:36.025 }, 00:23:36.025 { 00:23:36.025 "name": "BaseBdev3", 00:23:36.025 "uuid": "1da1360f-8d74-5106-bb6d-54808104039f", 00:23:36.025 "is_configured": true, 00:23:36.025 "data_offset": 0, 00:23:36.025 "data_size": 65536 00:23:36.025 }, 00:23:36.025 { 00:23:36.025 "name": "BaseBdev4", 00:23:36.025 "uuid": "a842f74f-5e4d-5b11-bb16-835697b9d139", 00:23:36.025 "is_configured": true, 00:23:36.025 "data_offset": 0, 00:23:36.025 "data_size": 65536 00:23:36.025 } 00:23:36.025 ] 00:23:36.025 }' 00:23:36.025 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:36.284 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:36.284 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:36.284 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:36.284 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:36.284 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:36.284 16:02:41 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:36.284 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:36.284 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:36.284 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:36.284 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:36.284 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:36.284 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:36.284 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:36.284 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:36.284 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:36.543 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:36.543 "name": "raid_bdev1", 00:23:36.543 "uuid": "becf0ce7-5254-4bd9-a2dc-5249a6442582", 00:23:36.543 "strip_size_kb": 0, 00:23:36.543 "state": "online", 00:23:36.543 "raid_level": "raid1", 00:23:36.543 "superblock": false, 00:23:36.543 "num_base_bdevs": 4, 00:23:36.543 "num_base_bdevs_discovered": 3, 00:23:36.543 "num_base_bdevs_operational": 3, 00:23:36.543 "base_bdevs_list": [ 00:23:36.543 { 00:23:36.543 "name": "spare", 00:23:36.543 "uuid": "d87af35a-2334-55c4-8416-99e64fa128b9", 00:23:36.543 "is_configured": true, 00:23:36.543 "data_offset": 0, 00:23:36.543 "data_size": 65536 00:23:36.543 }, 00:23:36.543 { 00:23:36.543 "name": null, 00:23:36.543 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:36.543 "is_configured": false, 00:23:36.543 "data_offset": 0, 00:23:36.543 "data_size": 
65536 00:23:36.543 }, 00:23:36.543 { 00:23:36.543 "name": "BaseBdev3", 00:23:36.543 "uuid": "1da1360f-8d74-5106-bb6d-54808104039f", 00:23:36.543 "is_configured": true, 00:23:36.543 "data_offset": 0, 00:23:36.543 "data_size": 65536 00:23:36.543 }, 00:23:36.543 { 00:23:36.543 "name": "BaseBdev4", 00:23:36.543 "uuid": "a842f74f-5e4d-5b11-bb16-835697b9d139", 00:23:36.543 "is_configured": true, 00:23:36.543 "data_offset": 0, 00:23:36.543 "data_size": 65536 00:23:36.543 } 00:23:36.543 ] 00:23:36.543 }' 00:23:36.543 16:02:41 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:36.543 16:02:41 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:37.109 16:02:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:23:37.368 [2024-06-10 16:02:42.714789] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:23:37.368 [2024-06-10 16:02:42.714814] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:23:37.368 [2024-06-10 16:02:42.714871] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:23:37.368 [2024-06-10 16:02:42.714940] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:23:37.368 [2024-06-10 16:02:42.714949] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x28d8260 name raid_bdev1, state offline 00:23:37.368 16:02:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:37.368 16:02:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # jq length 00:23:37.627 16:02:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:23:37.627 16:02:42 bdev_raid.raid_rebuild_test -- 
bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:23:37.627 16:02:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:23:37.627 16:02:42 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:23:37.627 16:02:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:37.627 16:02:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:23:37.627 16:02:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:37.627 16:02:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:37.627 16:02:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:37.627 16:02:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@12 -- # local i 00:23:37.627 16:02:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:37.627 16:02:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:37.627 16:02:42 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:23:37.886 /dev/nbd0 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:37.886 1+0 records in 00:23:37.886 1+0 records out 00:23:37.886 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000245031 s, 16.7 MB/s 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:37.886 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:23:38.145 /dev/nbd1 00:23:38.145 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:23:38.145 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:23:38.145 16:02:43 bdev_raid.raid_rebuild_test -- 
common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:23:38.145 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@868 -- # local i 00:23:38.145 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:23:38.145 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:23:38.145 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:23:38.145 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@872 -- # break 00:23:38.145 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:23:38.145 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:23:38.145 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:38.145 1+0 records in 00:23:38.145 1+0 records out 00:23:38.145 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000242166 s, 16.9 MB/s 00:23:38.145 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:38.146 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@885 -- # size=4096 00:23:38.146 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:38.146 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:23:38.146 16:02:43 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@888 -- # return 0 00:23:38.146 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:38.146 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:23:38.146 16:02:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@737 -- # cmp 
-i 0 /dev/nbd0 /dev/nbd1 00:23:38.146 16:02:43 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:23:38.146 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:38.146 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:23:38.146 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:38.146 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@51 -- # local i 00:23:38.146 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:38.146 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:38.406 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:38.406 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:38.406 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:38.406 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:38.406 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:38.406 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:38.406 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:38.406 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:38.406 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:38.406 16:02:43 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:23:38.975 16:02:44 
bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@41 -- # break 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- bdev/nbd_common.sh@45 -- # return 0 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@782 -- # killprocess 2781831 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@949 -- # '[' -z 2781831 ']' 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@953 -- # kill -0 2781831 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # uname 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2781831 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2781831' 00:23:38.975 killing process with pid 2781831 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@968 -- # kill 2781831 
00:23:38.975 Received shutdown signal, test time was about 60.000000 seconds 00:23:38.975 00:23:38.975 Latency(us) 00:23:38.975 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:23:38.975 =================================================================================================================== 00:23:38.975 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:23:38.975 [2024-06-10 16:02:44.229992] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@973 -- # wait 2781831 00:23:38.975 [2024-06-10 16:02:44.272401] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- bdev/bdev_raid.sh@784 -- # return 0 00:23:38.975 00:23:38.975 real 0m24.458s 00:23:38.975 user 0m34.356s 00:23:38.975 sys 0m4.082s 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@1125 -- # xtrace_disable 00:23:38.975 16:02:44 bdev_raid.raid_rebuild_test -- common/autotest_common.sh@10 -- # set +x 00:23:38.975 ************************************ 00:23:38.975 END TEST raid_rebuild_test 00:23:38.975 ************************************ 00:23:39.234 16:02:44 bdev_raid -- bdev/bdev_raid.sh@878 -- # run_test raid_rebuild_test_sb raid_rebuild_test raid1 4 true false true 00:23:39.234 16:02:44 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:23:39.234 16:02:44 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:23:39.234 16:02:44 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:23:39.234 ************************************ 00:23:39.234 START TEST raid_rebuild_test_sb 00:23:39.234 ************************************ 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 true false true 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@568 -- # local 
raid_level=raid1 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@572 -- # local verify=true 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:23:39.234 
16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@575 -- # local strip_size 00:23:39.234 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@576 -- # local create_arg 00:23:39.235 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:23:39.235 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@578 -- # local data_offset 00:23:39.235 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:23:39.235 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:23:39.235 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:23:39.235 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:23:39.235 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@596 -- # raid_pid=2785932 00:23:39.235 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@597 -- # waitforlisten 2785932 /var/tmp/spdk-raid.sock 00:23:39.235 16:02:44 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:23:39.235 16:02:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@830 -- # '[' -z 2785932 ']' 00:23:39.235 16:02:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:23:39.235 16:02:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@835 -- # local max_retries=100 00:23:39.235 16:02:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain 
socket /var/tmp/spdk-raid.sock...' 00:23:39.235 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:23:39.235 16:02:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@839 -- # xtrace_disable 00:23:39.235 16:02:44 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:39.235 [2024-06-10 16:02:44.606275] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:23:39.235 [2024-06-10 16:02:44.606327] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2785932 ] 00:23:39.235 I/O size of 3145728 is greater than zero copy threshold (65536). 00:23:39.235 Zero copy mechanism will not be used. 00:23:39.235 [2024-06-10 16:02:44.704145] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:23:39.494 [2024-06-10 16:02:44.798595] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:23:39.494 [2024-06-10 16:02:44.853508] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:39.494 [2024-06-10 16:02:44.853535] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:23:40.060 16:02:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:23:40.060 16:02:45 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@863 -- # return 0 00:23:40.060 16:02:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:40.060 16:02:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:23:40.319 BaseBdev1_malloc 00:23:40.319 16:02:45 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:23:40.577 [2024-06-10 16:02:46.054174] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:23:40.577 [2024-06-10 16:02:46.054216] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:40.577 [2024-06-10 16:02:46.054238] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e96e90 00:23:40.577 [2024-06-10 16:02:46.054247] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:40.577 [2024-06-10 16:02:46.055974] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:40.577 [2024-06-10 16:02:46.056003] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:23:40.577 BaseBdev1 00:23:40.577 16:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:40.577 16:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:23:40.834 BaseBdev2_malloc 00:23:40.834 16:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:23:41.096 [2024-06-10 16:02:46.568369] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:23:41.096 [2024-06-10 16:02:46.568407] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:41.096 [2024-06-10 16:02:46.568428] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e979e0 00:23:41.096 [2024-06-10 16:02:46.568438] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:41.096 
[2024-06-10 16:02:46.570068] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:41.096 [2024-06-10 16:02:46.570095] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:23:41.096 BaseBdev2 00:23:41.096 16:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:41.096 16:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:23:41.353 BaseBdev3_malloc 00:23:41.353 16:02:46 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:23:41.611 [2024-06-10 16:02:47.094208] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:23:41.611 [2024-06-10 16:02:47.094249] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:41.611 [2024-06-10 16:02:47.094267] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2043e70 00:23:41.611 [2024-06-10 16:02:47.094277] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:41.611 [2024-06-10 16:02:47.095838] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:41.611 [2024-06-10 16:02:47.095866] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:23:41.611 BaseBdev3 00:23:41.611 16:02:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:23:41.611 16:02:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:23:41.869 BaseBdev4_malloc 00:23:41.869 16:02:47 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:23:42.128 [2024-06-10 16:02:47.604039] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:23:42.128 [2024-06-10 16:02:47.604080] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:42.128 [2024-06-10 16:02:47.604098] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2042700 00:23:42.128 [2024-06-10 16:02:47.604108] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:42.128 [2024-06-10 16:02:47.605685] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:42.128 [2024-06-10 16:02:47.605711] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:23:42.128 BaseBdev4 00:23:42.128 16:02:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:23:42.386 spare_malloc 00:23:42.386 16:02:47 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:23:42.644 spare_delay 00:23:42.644 16:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:23:42.902 [2024-06-10 16:02:48.366502] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:23:42.903 [2024-06-10 16:02:48.366543] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:23:42.903 [2024-06-10 16:02:48.366564] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2048710 00:23:42.903 [2024-06-10 16:02:48.366573] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:23:42.903 [2024-06-10 16:02:48.368146] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:23:42.903 [2024-06-10 16:02:48.368175] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:23:42.903 spare 00:23:42.903 16:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:23:43.162 [2024-06-10 16:02:48.607296] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:23:43.162 [2024-06-10 16:02:48.608605] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:23:43.162 [2024-06-10 16:02:48.608663] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:23:43.162 [2024-06-10 16:02:48.608710] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:23:43.162 [2024-06-10 16:02:48.608899] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1fc8260 00:23:43.162 [2024-06-10 16:02:48.608909] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:23:43.162 [2024-06-10 16:02:48.609117] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fc8200 00:23:43.162 [2024-06-10 16:02:48.609270] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fc8260 00:23:43.162 [2024-06-10 16:02:48.609279] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1fc8260 00:23:43.162 [2024-06-10 16:02:48.609374] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:43.162 16:02:48 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:23:43.162 16:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:43.162 16:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:43.162 16:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:43.162 16:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:43.162 16:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:23:43.162 16:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:43.162 16:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:43.162 16:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:43.162 16:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:43.162 16:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:43.162 16:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:43.421 16:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:43.421 "name": "raid_bdev1", 00:23:43.421 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:23:43.421 "strip_size_kb": 0, 00:23:43.421 "state": "online", 00:23:43.421 "raid_level": "raid1", 00:23:43.421 "superblock": true, 00:23:43.421 "num_base_bdevs": 4, 00:23:43.421 "num_base_bdevs_discovered": 4, 00:23:43.421 "num_base_bdevs_operational": 4, 00:23:43.421 "base_bdevs_list": [ 00:23:43.421 { 00:23:43.421 "name": "BaseBdev1", 00:23:43.421 "uuid": 
"a48fe035-ea72-56c2-8f1a-fb6170417eae", 00:23:43.421 "is_configured": true, 00:23:43.421 "data_offset": 2048, 00:23:43.421 "data_size": 63488 00:23:43.421 }, 00:23:43.422 { 00:23:43.422 "name": "BaseBdev2", 00:23:43.422 "uuid": "c20cb812-cf6f-522f-8562-02ac8302504a", 00:23:43.422 "is_configured": true, 00:23:43.422 "data_offset": 2048, 00:23:43.422 "data_size": 63488 00:23:43.422 }, 00:23:43.422 { 00:23:43.422 "name": "BaseBdev3", 00:23:43.422 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:23:43.422 "is_configured": true, 00:23:43.422 "data_offset": 2048, 00:23:43.422 "data_size": 63488 00:23:43.422 }, 00:23:43.422 { 00:23:43.422 "name": "BaseBdev4", 00:23:43.422 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:23:43.422 "is_configured": true, 00:23:43.422 "data_offset": 2048, 00:23:43.422 "data_size": 63488 00:23:43.422 } 00:23:43.422 ] 00:23:43.422 }' 00:23:43.422 16:02:48 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:43.422 16:02:48 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:44.061 16:02:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:23:44.061 16:02:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:23:44.320 [2024-06-10 16:02:49.742578] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:23:44.320 16:02:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:23:44.320 16:02:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:44.320 16:02:49 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:23:44.578 16:02:50 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@618 -- # data_offset=2048 00:23:44.578 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:23:44.578 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:23:44.579 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:23:44.579 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:23:44.579 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:44.579 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:23:44.579 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:23:44.579 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:23:44.579 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:23:44.579 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:23:44.579 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:23:44.579 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:44.579 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:23:44.838 [2024-06-10 16:02:50.259727] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e965a0 00:23:44.838 /dev/nbd0 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:23:44.838 
16:02:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:23:44.838 1+0 records in 00:23:44.838 1+0 records out 00:23:44.838 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000229116 s, 17.9 MB/s 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 
00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:23:44.838 16:02:50 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=512 count=63488 oflag=direct 00:23:52.956 63488+0 records in 00:23:52.956 63488+0 records out 00:23:52.956 32505856 bytes (33 MB, 31 MiB) copied, 6.99739 s, 4.6 MB/s 00:23:52.956 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:23:52.956 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:23:52.957 [2024-06-10 16:02:57.508103] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:52.957 16:02:57 
bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:23:52.957 [2024-06-10 16:02:57.740745] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:23:52.957 "name": "raid_bdev1", 00:23:52.957 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:23:52.957 "strip_size_kb": 0, 00:23:52.957 "state": "online", 00:23:52.957 "raid_level": "raid1", 00:23:52.957 "superblock": true, 00:23:52.957 "num_base_bdevs": 4, 00:23:52.957 "num_base_bdevs_discovered": 3, 00:23:52.957 "num_base_bdevs_operational": 3, 00:23:52.957 "base_bdevs_list": [ 00:23:52.957 { 00:23:52.957 "name": null, 00:23:52.957 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:52.957 "is_configured": false, 00:23:52.957 "data_offset": 2048, 00:23:52.957 "data_size": 63488 00:23:52.957 }, 00:23:52.957 { 00:23:52.957 "name": "BaseBdev2", 00:23:52.957 "uuid": "c20cb812-cf6f-522f-8562-02ac8302504a", 00:23:52.957 "is_configured": true, 00:23:52.957 "data_offset": 2048, 00:23:52.957 "data_size": 63488 00:23:52.957 }, 00:23:52.957 { 00:23:52.957 "name": "BaseBdev3", 00:23:52.957 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:23:52.957 "is_configured": true, 00:23:52.957 "data_offset": 2048, 00:23:52.957 "data_size": 63488 00:23:52.957 }, 00:23:52.957 { 00:23:52.957 "name": "BaseBdev4", 00:23:52.957 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:23:52.957 "is_configured": true, 00:23:52.957 "data_offset": 2048, 00:23:52.957 "data_size": 63488 00:23:52.957 } 00:23:52.957 ] 00:23:52.957 }' 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:52.957 16:02:57 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:53.216 16:02:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:53.216 [2024-06-10 16:02:58.719372] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:53.216 [2024-06-10 16:02:58.723299] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1e96620 
00:23:53.216 [2024-06-10 16:02:58.725476] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:53.474 16:02:58 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@646 -- # sleep 1 00:23:54.412 16:02:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:54.412 16:02:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:54.412 16:02:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:54.412 16:02:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:54.412 16:02:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:54.412 16:02:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.412 16:02:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:54.670 16:02:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:54.670 "name": "raid_bdev1", 00:23:54.670 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:23:54.670 "strip_size_kb": 0, 00:23:54.670 "state": "online", 00:23:54.670 "raid_level": "raid1", 00:23:54.670 "superblock": true, 00:23:54.670 "num_base_bdevs": 4, 00:23:54.670 "num_base_bdevs_discovered": 4, 00:23:54.670 "num_base_bdevs_operational": 4, 00:23:54.670 "process": { 00:23:54.670 "type": "rebuild", 00:23:54.670 "target": "spare", 00:23:54.670 "progress": { 00:23:54.670 "blocks": 22528, 00:23:54.670 "percent": 35 00:23:54.670 } 00:23:54.670 }, 00:23:54.670 "base_bdevs_list": [ 00:23:54.670 { 00:23:54.670 "name": "spare", 00:23:54.670 "uuid": "a44e35d6-3fd3-5d8a-92c5-dcb3dae34088", 00:23:54.670 "is_configured": true, 00:23:54.670 "data_offset": 2048, 
00:23:54.670 "data_size": 63488 00:23:54.670 }, 00:23:54.670 { 00:23:54.670 "name": "BaseBdev2", 00:23:54.670 "uuid": "c20cb812-cf6f-522f-8562-02ac8302504a", 00:23:54.670 "is_configured": true, 00:23:54.670 "data_offset": 2048, 00:23:54.670 "data_size": 63488 00:23:54.670 }, 00:23:54.670 { 00:23:54.670 "name": "BaseBdev3", 00:23:54.670 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:23:54.670 "is_configured": true, 00:23:54.670 "data_offset": 2048, 00:23:54.670 "data_size": 63488 00:23:54.670 }, 00:23:54.670 { 00:23:54.670 "name": "BaseBdev4", 00:23:54.670 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:23:54.670 "is_configured": true, 00:23:54.670 "data_offset": 2048, 00:23:54.670 "data_size": 63488 00:23:54.670 } 00:23:54.670 ] 00:23:54.670 }' 00:23:54.670 16:02:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:54.670 16:02:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:54.670 16:02:59 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:54.670 16:03:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:54.670 16:03:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:23:54.929 [2024-06-10 16:03:00.258303] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:54.929 [2024-06-10 16:03:00.337664] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:23:54.929 [2024-06-10 16:03:00.337705] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:54.929 [2024-06-10 16:03:00.337721] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:23:54.929 [2024-06-10 16:03:00.337727] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: 
*ERROR*: Failed to remove target bdev: No such device 00:23:54.929 16:03:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:23:54.929 16:03:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:23:54.929 16:03:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:23:54.929 16:03:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:23:54.929 16:03:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:23:54.929 16:03:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:23:54.929 16:03:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:23:54.929 16:03:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:23:54.929 16:03:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:23:54.929 16:03:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:23:54.929 16:03:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:54.929 16:03:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:55.186 16:03:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:23:55.186 "name": "raid_bdev1", 00:23:55.186 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:23:55.186 "strip_size_kb": 0, 00:23:55.186 "state": "online", 00:23:55.186 "raid_level": "raid1", 00:23:55.186 "superblock": true, 00:23:55.186 "num_base_bdevs": 4, 00:23:55.186 "num_base_bdevs_discovered": 3, 00:23:55.186 "num_base_bdevs_operational": 3, 00:23:55.186 "base_bdevs_list": [ 00:23:55.186 { 00:23:55.186 
"name": null, 00:23:55.186 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:55.186 "is_configured": false, 00:23:55.186 "data_offset": 2048, 00:23:55.186 "data_size": 63488 00:23:55.186 }, 00:23:55.186 { 00:23:55.186 "name": "BaseBdev2", 00:23:55.186 "uuid": "c20cb812-cf6f-522f-8562-02ac8302504a", 00:23:55.186 "is_configured": true, 00:23:55.186 "data_offset": 2048, 00:23:55.186 "data_size": 63488 00:23:55.186 }, 00:23:55.186 { 00:23:55.186 "name": "BaseBdev3", 00:23:55.186 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:23:55.186 "is_configured": true, 00:23:55.186 "data_offset": 2048, 00:23:55.186 "data_size": 63488 00:23:55.186 }, 00:23:55.186 { 00:23:55.186 "name": "BaseBdev4", 00:23:55.186 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:23:55.186 "is_configured": true, 00:23:55.186 "data_offset": 2048, 00:23:55.186 "data_size": 63488 00:23:55.186 } 00:23:55.186 ] 00:23:55.186 }' 00:23:55.186 16:03:00 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:23:55.186 16:03:00 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:23:55.752 16:03:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:23:55.752 16:03:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:55.753 16:03:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:23:55.753 16:03:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:23:55.753 16:03:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:55.753 16:03:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:55.753 16:03:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:56.011 
16:03:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:56.011 "name": "raid_bdev1", 00:23:56.011 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:23:56.011 "strip_size_kb": 0, 00:23:56.011 "state": "online", 00:23:56.011 "raid_level": "raid1", 00:23:56.011 "superblock": true, 00:23:56.011 "num_base_bdevs": 4, 00:23:56.011 "num_base_bdevs_discovered": 3, 00:23:56.011 "num_base_bdevs_operational": 3, 00:23:56.011 "base_bdevs_list": [ 00:23:56.011 { 00:23:56.011 "name": null, 00:23:56.011 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:56.011 "is_configured": false, 00:23:56.011 "data_offset": 2048, 00:23:56.011 "data_size": 63488 00:23:56.011 }, 00:23:56.011 { 00:23:56.011 "name": "BaseBdev2", 00:23:56.011 "uuid": "c20cb812-cf6f-522f-8562-02ac8302504a", 00:23:56.011 "is_configured": true, 00:23:56.011 "data_offset": 2048, 00:23:56.011 "data_size": 63488 00:23:56.011 }, 00:23:56.011 { 00:23:56.011 "name": "BaseBdev3", 00:23:56.011 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:23:56.011 "is_configured": true, 00:23:56.011 "data_offset": 2048, 00:23:56.011 "data_size": 63488 00:23:56.012 }, 00:23:56.012 { 00:23:56.012 "name": "BaseBdev4", 00:23:56.012 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:23:56.012 "is_configured": true, 00:23:56.012 "data_offset": 2048, 00:23:56.012 "data_size": 63488 00:23:56.012 } 00:23:56.012 ] 00:23:56.012 }' 00:23:56.012 16:03:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:56.012 16:03:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:23:56.012 16:03:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:56.270 16:03:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:23:56.270 16:03:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@661 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:23:56.270 [2024-06-10 16:03:01.765531] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:23:56.270 [2024-06-10 16:03:01.769538] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x20410e0 00:23:56.270 [2024-06-10 16:03:01.771098] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:23:56.529 16:03:01 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@662 -- # sleep 1 00:23:57.473 16:03:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:57.473 16:03:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:57.473 16:03:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:57.473 16:03:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:57.473 16:03:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:57.473 16:03:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:57.473 16:03:02 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:57.731 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:57.731 "name": "raid_bdev1", 00:23:57.731 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:23:57.731 "strip_size_kb": 0, 00:23:57.731 "state": "online", 00:23:57.732 "raid_level": "raid1", 00:23:57.732 "superblock": true, 00:23:57.732 "num_base_bdevs": 4, 00:23:57.732 "num_base_bdevs_discovered": 4, 00:23:57.732 "num_base_bdevs_operational": 4, 00:23:57.732 "process": { 00:23:57.732 "type": 
"rebuild", 00:23:57.732 "target": "spare", 00:23:57.732 "progress": { 00:23:57.732 "blocks": 24576, 00:23:57.732 "percent": 38 00:23:57.732 } 00:23:57.732 }, 00:23:57.732 "base_bdevs_list": [ 00:23:57.732 { 00:23:57.732 "name": "spare", 00:23:57.732 "uuid": "a44e35d6-3fd3-5d8a-92c5-dcb3dae34088", 00:23:57.732 "is_configured": true, 00:23:57.732 "data_offset": 2048, 00:23:57.732 "data_size": 63488 00:23:57.732 }, 00:23:57.732 { 00:23:57.732 "name": "BaseBdev2", 00:23:57.732 "uuid": "c20cb812-cf6f-522f-8562-02ac8302504a", 00:23:57.732 "is_configured": true, 00:23:57.732 "data_offset": 2048, 00:23:57.732 "data_size": 63488 00:23:57.732 }, 00:23:57.732 { 00:23:57.732 "name": "BaseBdev3", 00:23:57.732 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:23:57.732 "is_configured": true, 00:23:57.732 "data_offset": 2048, 00:23:57.732 "data_size": 63488 00:23:57.732 }, 00:23:57.732 { 00:23:57.732 "name": "BaseBdev4", 00:23:57.732 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:23:57.732 "is_configured": true, 00:23:57.732 "data_offset": 2048, 00:23:57.732 "data_size": 63488 00:23:57.732 } 00:23:57.732 ] 00:23:57.732 }' 00:23:57.732 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:57.732 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:57.732 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:57.732 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:57.732 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:23:57.732 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:23:57.732 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:23:57.732 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@690 -- # 
local num_base_bdevs_operational=4 00:23:57.732 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:23:57.732 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:23:57.732 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:23:57.990 [2024-06-10 16:03:03.391307] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:23:58.248 [2024-06-10 16:03:03.584290] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x20410e0 00:23:58.248 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:23:58.248 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:23:58.248 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:58.248 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:58.249 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:58.249 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:58.249 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:58.249 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:58.249 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:58.507 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:58.507 "name": "raid_bdev1", 00:23:58.507 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 
00:23:58.507 "strip_size_kb": 0, 00:23:58.507 "state": "online", 00:23:58.507 "raid_level": "raid1", 00:23:58.507 "superblock": true, 00:23:58.507 "num_base_bdevs": 4, 00:23:58.507 "num_base_bdevs_discovered": 3, 00:23:58.507 "num_base_bdevs_operational": 3, 00:23:58.507 "process": { 00:23:58.507 "type": "rebuild", 00:23:58.507 "target": "spare", 00:23:58.507 "progress": { 00:23:58.507 "blocks": 38912, 00:23:58.507 "percent": 61 00:23:58.507 } 00:23:58.507 }, 00:23:58.507 "base_bdevs_list": [ 00:23:58.507 { 00:23:58.507 "name": "spare", 00:23:58.507 "uuid": "a44e35d6-3fd3-5d8a-92c5-dcb3dae34088", 00:23:58.507 "is_configured": true, 00:23:58.507 "data_offset": 2048, 00:23:58.507 "data_size": 63488 00:23:58.507 }, 00:23:58.507 { 00:23:58.507 "name": null, 00:23:58.507 "uuid": "00000000-0000-0000-0000-000000000000", 00:23:58.507 "is_configured": false, 00:23:58.507 "data_offset": 2048, 00:23:58.507 "data_size": 63488 00:23:58.507 }, 00:23:58.507 { 00:23:58.507 "name": "BaseBdev3", 00:23:58.507 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:23:58.507 "is_configured": true, 00:23:58.507 "data_offset": 2048, 00:23:58.507 "data_size": 63488 00:23:58.507 }, 00:23:58.507 { 00:23:58.507 "name": "BaseBdev4", 00:23:58.507 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:23:58.507 "is_configured": true, 00:23:58.507 "data_offset": 2048, 00:23:58.507 "data_size": 63488 00:23:58.507 } 00:23:58.507 ] 00:23:58.507 }' 00:23:58.507 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:58.507 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:58.507 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:58.507 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:58.507 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@705 -- # local timeout=905 00:23:58.507 
16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:58.507 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:58.507 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:58.507 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:58.507 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:58.507 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:58.507 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:58.507 16:03:03 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:23:58.766 16:03:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:23:58.766 "name": "raid_bdev1", 00:23:58.766 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:23:58.766 "strip_size_kb": 0, 00:23:58.766 "state": "online", 00:23:58.766 "raid_level": "raid1", 00:23:58.766 "superblock": true, 00:23:58.766 "num_base_bdevs": 4, 00:23:58.766 "num_base_bdevs_discovered": 3, 00:23:58.766 "num_base_bdevs_operational": 3, 00:23:58.766 "process": { 00:23:58.766 "type": "rebuild", 00:23:58.766 "target": "spare", 00:23:58.766 "progress": { 00:23:58.766 "blocks": 47104, 00:23:58.766 "percent": 74 00:23:58.766 } 00:23:58.766 }, 00:23:58.766 "base_bdevs_list": [ 00:23:58.766 { 00:23:58.766 "name": "spare", 00:23:58.766 "uuid": "a44e35d6-3fd3-5d8a-92c5-dcb3dae34088", 00:23:58.766 "is_configured": true, 00:23:58.766 "data_offset": 2048, 00:23:58.766 "data_size": 63488 00:23:58.766 }, 00:23:58.766 { 00:23:58.766 "name": null, 00:23:58.766 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:23:58.766 "is_configured": false, 00:23:58.766 "data_offset": 2048, 00:23:58.766 "data_size": 63488 00:23:58.766 }, 00:23:58.766 { 00:23:58.766 "name": "BaseBdev3", 00:23:58.766 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:23:58.766 "is_configured": true, 00:23:58.766 "data_offset": 2048, 00:23:58.766 "data_size": 63488 00:23:58.766 }, 00:23:58.766 { 00:23:58.766 "name": "BaseBdev4", 00:23:58.766 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:23:58.766 "is_configured": true, 00:23:58.766 "data_offset": 2048, 00:23:58.766 "data_size": 63488 00:23:58.766 } 00:23:58.766 ] 00:23:58.766 }' 00:23:58.766 16:03:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:23:58.766 16:03:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:23:58.766 16:03:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:23:59.024 16:03:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:23:59.024 16:03:04 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@710 -- # sleep 1 00:23:59.591 [2024-06-10 16:03:04.994897] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:23:59.591 [2024-06-10 16:03:04.994961] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:23:59.591 [2024-06-10 16:03:04.995059] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:23:59.849 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:23:59.849 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:23:59.849 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:23:59.849 16:03:05 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:23:59.849 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:23:59.849 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:23:59.849 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:23:59.849 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:00.107 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:00.107 "name": "raid_bdev1", 00:24:00.107 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:24:00.107 "strip_size_kb": 0, 00:24:00.107 "state": "online", 00:24:00.107 "raid_level": "raid1", 00:24:00.107 "superblock": true, 00:24:00.107 "num_base_bdevs": 4, 00:24:00.107 "num_base_bdevs_discovered": 3, 00:24:00.107 "num_base_bdevs_operational": 3, 00:24:00.107 "base_bdevs_list": [ 00:24:00.107 { 00:24:00.107 "name": "spare", 00:24:00.107 "uuid": "a44e35d6-3fd3-5d8a-92c5-dcb3dae34088", 00:24:00.107 "is_configured": true, 00:24:00.107 "data_offset": 2048, 00:24:00.107 "data_size": 63488 00:24:00.107 }, 00:24:00.107 { 00:24:00.107 "name": null, 00:24:00.107 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:00.107 "is_configured": false, 00:24:00.107 "data_offset": 2048, 00:24:00.107 "data_size": 63488 00:24:00.107 }, 00:24:00.107 { 00:24:00.107 "name": "BaseBdev3", 00:24:00.107 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:24:00.107 "is_configured": true, 00:24:00.107 "data_offset": 2048, 00:24:00.107 "data_size": 63488 00:24:00.107 }, 00:24:00.107 { 00:24:00.107 "name": "BaseBdev4", 00:24:00.107 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:24:00.107 "is_configured": true, 00:24:00.107 "data_offset": 2048, 00:24:00.107 "data_size": 63488 00:24:00.107 } 00:24:00.107 ] 
00:24:00.107 }' 00:24:00.107 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:00.366 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:00.366 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:00.366 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:00.366 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@708 -- # break 00:24:00.366 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:00.366 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:00.366 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:00.366 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:00.366 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:00.366 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:00.366 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:00.625 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:00.625 "name": "raid_bdev1", 00:24:00.625 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:24:00.625 "strip_size_kb": 0, 00:24:00.625 "state": "online", 00:24:00.625 "raid_level": "raid1", 00:24:00.625 "superblock": true, 00:24:00.625 "num_base_bdevs": 4, 00:24:00.625 "num_base_bdevs_discovered": 3, 00:24:00.625 "num_base_bdevs_operational": 3, 00:24:00.625 "base_bdevs_list": [ 00:24:00.625 { 00:24:00.625 "name": "spare", 00:24:00.625 "uuid": 
"a44e35d6-3fd3-5d8a-92c5-dcb3dae34088", 00:24:00.625 "is_configured": true, 00:24:00.625 "data_offset": 2048, 00:24:00.625 "data_size": 63488 00:24:00.625 }, 00:24:00.625 { 00:24:00.625 "name": null, 00:24:00.625 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:00.625 "is_configured": false, 00:24:00.625 "data_offset": 2048, 00:24:00.625 "data_size": 63488 00:24:00.625 }, 00:24:00.625 { 00:24:00.625 "name": "BaseBdev3", 00:24:00.625 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:24:00.625 "is_configured": true, 00:24:00.625 "data_offset": 2048, 00:24:00.625 "data_size": 63488 00:24:00.625 }, 00:24:00.625 { 00:24:00.625 "name": "BaseBdev4", 00:24:00.625 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:24:00.625 "is_configured": true, 00:24:00.625 "data_offset": 2048, 00:24:00.625 "data_size": 63488 00:24:00.625 } 00:24:00.625 ] 00:24:00.625 }' 00:24:00.625 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:00.625 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:00.625 16:03:05 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:00.625 16:03:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:00.625 16:03:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:00.625 16:03:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:00.625 16:03:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:00.625 16:03:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:00.625 16:03:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:00.625 16:03:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 
00:24:00.625 16:03:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:00.625 16:03:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:00.625 16:03:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:00.625 16:03:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:00.625 16:03:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:00.625 16:03:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:00.884 16:03:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:00.884 "name": "raid_bdev1", 00:24:00.884 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:24:00.884 "strip_size_kb": 0, 00:24:00.884 "state": "online", 00:24:00.884 "raid_level": "raid1", 00:24:00.884 "superblock": true, 00:24:00.884 "num_base_bdevs": 4, 00:24:00.884 "num_base_bdevs_discovered": 3, 00:24:00.884 "num_base_bdevs_operational": 3, 00:24:00.884 "base_bdevs_list": [ 00:24:00.884 { 00:24:00.884 "name": "spare", 00:24:00.884 "uuid": "a44e35d6-3fd3-5d8a-92c5-dcb3dae34088", 00:24:00.884 "is_configured": true, 00:24:00.884 "data_offset": 2048, 00:24:00.884 "data_size": 63488 00:24:00.884 }, 00:24:00.884 { 00:24:00.884 "name": null, 00:24:00.884 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:00.884 "is_configured": false, 00:24:00.884 "data_offset": 2048, 00:24:00.884 "data_size": 63488 00:24:00.884 }, 00:24:00.884 { 00:24:00.884 "name": "BaseBdev3", 00:24:00.884 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:24:00.884 "is_configured": true, 00:24:00.884 "data_offset": 2048, 00:24:00.884 "data_size": 63488 00:24:00.884 }, 00:24:00.884 { 00:24:00.884 "name": "BaseBdev4", 00:24:00.884 "uuid": 
"e588267b-af71-5976-82fa-04c32c467d83", 00:24:00.884 "is_configured": true, 00:24:00.884 "data_offset": 2048, 00:24:00.884 "data_size": 63488 00:24:00.884 } 00:24:00.884 ] 00:24:00.884 }' 00:24:00.884 16:03:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:00.884 16:03:06 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:01.452 16:03:06 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:01.711 [2024-06-10 16:03:07.132685] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:01.711 [2024-06-10 16:03:07.132710] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:01.711 [2024-06-10 16:03:07.132765] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:01.711 [2024-06-10 16:03:07.132837] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:01.711 [2024-06-10 16:03:07.132847] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fc8260 name raid_bdev1, state offline 00:24:01.711 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:01.711 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # jq length 00:24:01.969 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:01.969 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:01.969 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:24:01.969 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 
/dev/nbd1' 00:24:01.969 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:01.969 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:24:01.969 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:01.969 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:01.969 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:01.969 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@12 -- # local i 00:24:01.969 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:01.969 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:01.969 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:24:02.228 /dev/nbd0 00:24:02.228 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:02.228 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:02.228 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:24:02.228 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:24:02.228 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:24:02.228 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:24:02.228 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:24:02.228 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:24:02.228 16:03:07 bdev_raid.raid_rebuild_test_sb -- 
common/autotest_common.sh@883 -- # (( i = 1 )) 00:24:02.228 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:24:02.228 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:02.228 1+0 records in 00:24:02.228 1+0 records out 00:24:02.228 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000224313 s, 18.3 MB/s 00:24:02.228 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:02.228 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:24:02.229 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:02.229 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:24:02.229 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:24:02.229 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:02.229 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:02.229 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:24:02.488 /dev/nbd1 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@868 -- # local i 00:24:02.488 16:03:07 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@872 -- # break 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:02.488 1+0 records in 00:24:02.488 1+0 records out 00:24:02.488 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000297965 s, 13.7 MB/s 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@885 -- # size=4096 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@888 -- # return 0 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks 
/var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@51 -- # local i 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:02.488 16:03:07 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:02.747 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:02.747 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:02.747 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:02.747 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:02.747 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:02.747 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:02.747 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:02.747 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:02.747 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:02.747 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:03.006 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 
00:24:03.006 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:03.006 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:03.006 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:03.006 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:03.006 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:03.006 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@41 -- # break 00:24:03.006 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/nbd_common.sh@45 -- # return 0 00:24:03.006 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:24:03.006 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:03.265 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:03.578 [2024-06-10 16:03:08.779632] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:03.578 [2024-06-10 16:03:08.779676] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:03.578 [2024-06-10 16:03:08.779696] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1fcbdc0 00:24:03.578 [2024-06-10 16:03:08.779716] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:03.578 [2024-06-10 16:03:08.781433] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:03.578 [2024-06-10 16:03:08.781462] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:03.578 [2024-06-10 16:03:08.781536] 
bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:03.578 [2024-06-10 16:03:08.781562] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:03.578 [2024-06-10 16:03:08.781667] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:03.578 [2024-06-10 16:03:08.781743] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:03.578 spare 00:24:03.578 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:03.578 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:03.578 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:03.579 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:03.579 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:03.579 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:03.579 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:03.579 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:03.579 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:03.579 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:03.579 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:03.579 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:03.579 [2024-06-10 16:03:08.882067] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io 
device register 0x1fcc8f0 00:24:03.579 [2024-06-10 16:03:08.882087] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:03.579 [2024-06-10 16:03:08.882313] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fcc050 00:24:03.579 [2024-06-10 16:03:08.882480] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1fcc8f0 00:24:03.579 [2024-06-10 16:03:08.882489] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1fcc8f0 00:24:03.579 [2024-06-10 16:03:08.882602] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:03.579 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:03.579 "name": "raid_bdev1", 00:24:03.579 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:24:03.579 "strip_size_kb": 0, 00:24:03.579 "state": "online", 00:24:03.579 "raid_level": "raid1", 00:24:03.579 "superblock": true, 00:24:03.579 "num_base_bdevs": 4, 00:24:03.579 "num_base_bdevs_discovered": 3, 00:24:03.579 "num_base_bdevs_operational": 3, 00:24:03.579 "base_bdevs_list": [ 00:24:03.579 { 00:24:03.579 "name": "spare", 00:24:03.579 "uuid": "a44e35d6-3fd3-5d8a-92c5-dcb3dae34088", 00:24:03.579 "is_configured": true, 00:24:03.579 "data_offset": 2048, 00:24:03.579 "data_size": 63488 00:24:03.579 }, 00:24:03.579 { 00:24:03.579 "name": null, 00:24:03.579 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:03.579 "is_configured": false, 00:24:03.579 "data_offset": 2048, 00:24:03.579 "data_size": 63488 00:24:03.579 }, 00:24:03.579 { 00:24:03.579 "name": "BaseBdev3", 00:24:03.579 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:24:03.579 "is_configured": true, 00:24:03.579 "data_offset": 2048, 00:24:03.579 "data_size": 63488 00:24:03.579 }, 00:24:03.579 { 00:24:03.579 "name": "BaseBdev4", 00:24:03.579 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:24:03.579 "is_configured": true, 00:24:03.579 "data_offset": 
2048, 00:24:03.579 "data_size": 63488 00:24:03.579 } 00:24:03.579 ] 00:24:03.579 }' 00:24:03.579 16:03:08 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:03.579 16:03:08 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:04.160 16:03:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:04.160 16:03:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:04.160 16:03:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:04.160 16:03:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:04.160 16:03:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:04.160 16:03:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.160 16:03:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:04.419 16:03:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:04.419 "name": "raid_bdev1", 00:24:04.419 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:24:04.419 "strip_size_kb": 0, 00:24:04.419 "state": "online", 00:24:04.419 "raid_level": "raid1", 00:24:04.419 "superblock": true, 00:24:04.419 "num_base_bdevs": 4, 00:24:04.419 "num_base_bdevs_discovered": 3, 00:24:04.419 "num_base_bdevs_operational": 3, 00:24:04.419 "base_bdevs_list": [ 00:24:04.419 { 00:24:04.419 "name": "spare", 00:24:04.419 "uuid": "a44e35d6-3fd3-5d8a-92c5-dcb3dae34088", 00:24:04.419 "is_configured": true, 00:24:04.419 "data_offset": 2048, 00:24:04.419 "data_size": 63488 00:24:04.419 }, 00:24:04.419 { 00:24:04.419 "name": null, 00:24:04.419 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:04.419 
"is_configured": false, 00:24:04.419 "data_offset": 2048, 00:24:04.419 "data_size": 63488 00:24:04.419 }, 00:24:04.419 { 00:24:04.419 "name": "BaseBdev3", 00:24:04.419 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:24:04.419 "is_configured": true, 00:24:04.419 "data_offset": 2048, 00:24:04.419 "data_size": 63488 00:24:04.419 }, 00:24:04.419 { 00:24:04.419 "name": "BaseBdev4", 00:24:04.419 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:24:04.419 "is_configured": true, 00:24:04.419 "data_offset": 2048, 00:24:04.419 "data_size": 63488 00:24:04.419 } 00:24:04.419 ] 00:24:04.419 }' 00:24:04.419 16:03:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:04.419 16:03:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:04.419 16:03:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:04.678 16:03:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:04.678 16:03:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:04.678 16:03:09 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:04.936 16:03:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:04.936 16:03:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:04.936 [2024-06-10 16:03:10.444317] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:05.195 16:03:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:05.195 16:03:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=raid_bdev1 00:24:05.195 16:03:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:05.195 16:03:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:05.195 16:03:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:05.195 16:03:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:05.195 16:03:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:05.195 16:03:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:05.195 16:03:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:05.195 16:03:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:05.195 16:03:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:05.195 16:03:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:05.454 16:03:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:05.454 "name": "raid_bdev1", 00:24:05.454 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:24:05.454 "strip_size_kb": 0, 00:24:05.454 "state": "online", 00:24:05.454 "raid_level": "raid1", 00:24:05.454 "superblock": true, 00:24:05.454 "num_base_bdevs": 4, 00:24:05.454 "num_base_bdevs_discovered": 2, 00:24:05.454 "num_base_bdevs_operational": 2, 00:24:05.454 "base_bdevs_list": [ 00:24:05.454 { 00:24:05.454 "name": null, 00:24:05.454 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:05.454 "is_configured": false, 00:24:05.454 "data_offset": 2048, 00:24:05.454 "data_size": 63488 00:24:05.454 }, 00:24:05.454 { 00:24:05.454 "name": null, 00:24:05.454 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:24:05.454 "is_configured": false, 00:24:05.454 "data_offset": 2048, 00:24:05.454 "data_size": 63488 00:24:05.454 }, 00:24:05.454 { 00:24:05.454 "name": "BaseBdev3", 00:24:05.454 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:24:05.454 "is_configured": true, 00:24:05.454 "data_offset": 2048, 00:24:05.454 "data_size": 63488 00:24:05.454 }, 00:24:05.454 { 00:24:05.454 "name": "BaseBdev4", 00:24:05.454 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:24:05.454 "is_configured": true, 00:24:05.454 "data_offset": 2048, 00:24:05.454 "data_size": 63488 00:24:05.454 } 00:24:05.454 ] 00:24:05.454 }' 00:24:05.454 16:03:10 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:05.454 16:03:10 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:06.022 16:03:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:06.281 [2024-06-10 16:03:11.591394] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:06.281 [2024-06-10 16:03:11.591542] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:24:06.281 [2024-06-10 16:03:11.591557] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:24:06.281 [2024-06-10 16:03:11.591582] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:06.281 [2024-06-10 16:03:11.595397] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1fcb9f0 00:24:06.281 [2024-06-10 16:03:11.597645] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:06.281 16:03:11 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:07.216 16:03:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:07.216 16:03:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:07.216 16:03:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:07.216 16:03:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:07.216 16:03:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:07.216 16:03:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.216 16:03:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:07.475 16:03:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:07.475 "name": "raid_bdev1", 00:24:07.475 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:24:07.475 "strip_size_kb": 0, 00:24:07.475 "state": "online", 00:24:07.475 "raid_level": "raid1", 00:24:07.475 "superblock": true, 00:24:07.475 "num_base_bdevs": 4, 00:24:07.475 "num_base_bdevs_discovered": 3, 00:24:07.475 "num_base_bdevs_operational": 3, 00:24:07.475 "process": { 00:24:07.475 "type": "rebuild", 00:24:07.475 "target": "spare", 00:24:07.475 "progress": { 00:24:07.475 "blocks": 24576, 00:24:07.475 "percent": 38 
00:24:07.475 } 00:24:07.475 }, 00:24:07.475 "base_bdevs_list": [ 00:24:07.475 { 00:24:07.475 "name": "spare", 00:24:07.475 "uuid": "a44e35d6-3fd3-5d8a-92c5-dcb3dae34088", 00:24:07.475 "is_configured": true, 00:24:07.475 "data_offset": 2048, 00:24:07.475 "data_size": 63488 00:24:07.475 }, 00:24:07.475 { 00:24:07.475 "name": null, 00:24:07.475 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:07.475 "is_configured": false, 00:24:07.475 "data_offset": 2048, 00:24:07.475 "data_size": 63488 00:24:07.475 }, 00:24:07.475 { 00:24:07.475 "name": "BaseBdev3", 00:24:07.475 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:24:07.475 "is_configured": true, 00:24:07.475 "data_offset": 2048, 00:24:07.475 "data_size": 63488 00:24:07.475 }, 00:24:07.475 { 00:24:07.475 "name": "BaseBdev4", 00:24:07.475 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:24:07.475 "is_configured": true, 00:24:07.475 "data_offset": 2048, 00:24:07.475 "data_size": 63488 00:24:07.475 } 00:24:07.475 ] 00:24:07.475 }' 00:24:07.475 16:03:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:07.475 16:03:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:07.475 16:03:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:07.475 16:03:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:07.475 16:03:12 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:07.735 [2024-06-10 16:03:13.197405] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:07.735 [2024-06-10 16:03:13.209830] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:07.735 [2024-06-10 16:03:13.209871] bdev_raid.c: 331:raid_bdev_destroy_cb: 
*DEBUG*: raid_bdev_destroy_cb 00:24:07.735 [2024-06-10 16:03:13.209886] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:07.735 [2024-06-10 16:03:13.209892] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:07.735 16:03:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:07.735 16:03:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:07.735 16:03:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:07.735 16:03:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:07.735 16:03:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:07.735 16:03:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:07.735 16:03:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:07.735 16:03:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:07.735 16:03:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:07.735 16:03:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:07.735 16:03:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:07.735 16:03:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:07.994 16:03:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:07.994 "name": "raid_bdev1", 00:24:07.994 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:24:07.994 "strip_size_kb": 0, 00:24:07.994 "state": "online", 00:24:07.994 
"raid_level": "raid1", 00:24:07.994 "superblock": true, 00:24:07.994 "num_base_bdevs": 4, 00:24:07.994 "num_base_bdevs_discovered": 2, 00:24:07.994 "num_base_bdevs_operational": 2, 00:24:07.994 "base_bdevs_list": [ 00:24:07.994 { 00:24:07.994 "name": null, 00:24:07.994 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:07.994 "is_configured": false, 00:24:07.994 "data_offset": 2048, 00:24:07.994 "data_size": 63488 00:24:07.994 }, 00:24:07.994 { 00:24:07.994 "name": null, 00:24:07.994 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:07.994 "is_configured": false, 00:24:07.994 "data_offset": 2048, 00:24:07.994 "data_size": 63488 00:24:07.994 }, 00:24:07.994 { 00:24:07.994 "name": "BaseBdev3", 00:24:07.994 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:24:07.994 "is_configured": true, 00:24:07.994 "data_offset": 2048, 00:24:07.994 "data_size": 63488 00:24:07.994 }, 00:24:07.994 { 00:24:07.994 "name": "BaseBdev4", 00:24:07.994 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:24:07.994 "is_configured": true, 00:24:07.994 "data_offset": 2048, 00:24:07.994 "data_size": 63488 00:24:07.994 } 00:24:07.994 ] 00:24:07.994 }' 00:24:07.994 16:03:13 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:07.994 16:03:13 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:08.930 16:03:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:08.930 [2024-06-10 16:03:14.360905] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:08.930 [2024-06-10 16:03:14.360949] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:08.930 [2024-06-10 16:03:14.360974] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e963f0 00:24:08.930 [2024-06-10 16:03:14.360984] vbdev_passthru.c: 
695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:08.930 [2024-06-10 16:03:14.361364] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:08.930 [2024-06-10 16:03:14.361381] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:08.930 [2024-06-10 16:03:14.361460] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:08.930 [2024-06-10 16:03:14.361470] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:24:08.930 [2024-06-10 16:03:14.361478] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:24:08.930 [2024-06-10 16:03:14.361494] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:08.930 [2024-06-10 16:03:14.365330] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x2045b20 00:24:08.930 spare 00:24:08.930 [2024-06-10 16:03:14.366794] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:08.930 16:03:14 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@762 -- # sleep 1 00:24:10.308 16:03:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:10.308 16:03:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:10.308 16:03:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:10.308 16:03:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:10.308 16:03:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:10.308 16:03:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:24:10.308 16:03:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:10.308 16:03:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:10.308 "name": "raid_bdev1", 00:24:10.308 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:24:10.308 "strip_size_kb": 0, 00:24:10.308 "state": "online", 00:24:10.308 "raid_level": "raid1", 00:24:10.308 "superblock": true, 00:24:10.308 "num_base_bdevs": 4, 00:24:10.308 "num_base_bdevs_discovered": 3, 00:24:10.308 "num_base_bdevs_operational": 3, 00:24:10.308 "process": { 00:24:10.308 "type": "rebuild", 00:24:10.308 "target": "spare", 00:24:10.308 "progress": { 00:24:10.308 "blocks": 24576, 00:24:10.308 "percent": 38 00:24:10.308 } 00:24:10.308 }, 00:24:10.308 "base_bdevs_list": [ 00:24:10.308 { 00:24:10.308 "name": "spare", 00:24:10.308 "uuid": "a44e35d6-3fd3-5d8a-92c5-dcb3dae34088", 00:24:10.308 "is_configured": true, 00:24:10.308 "data_offset": 2048, 00:24:10.308 "data_size": 63488 00:24:10.308 }, 00:24:10.308 { 00:24:10.308 "name": null, 00:24:10.308 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:10.308 "is_configured": false, 00:24:10.308 "data_offset": 2048, 00:24:10.308 "data_size": 63488 00:24:10.308 }, 00:24:10.308 { 00:24:10.308 "name": "BaseBdev3", 00:24:10.308 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:24:10.308 "is_configured": true, 00:24:10.308 "data_offset": 2048, 00:24:10.308 "data_size": 63488 00:24:10.308 }, 00:24:10.308 { 00:24:10.308 "name": "BaseBdev4", 00:24:10.308 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:24:10.308 "is_configured": true, 00:24:10.308 "data_offset": 2048, 00:24:10.308 "data_size": 63488 00:24:10.308 } 00:24:10.308 ] 00:24:10.308 }' 00:24:10.308 16:03:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:10.308 16:03:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:10.308 
16:03:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:10.308 16:03:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:10.308 16:03:15 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:10.567 [2024-06-10 16:03:15.970649] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:10.567 [2024-06-10 16:03:15.979075] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:10.567 [2024-06-10 16:03:15.979113] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:10.567 [2024-06-10 16:03:15.979128] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:10.567 [2024-06-10 16:03:15.979134] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:10.567 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:10.567 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:10.567 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:10.567 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:10.567 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:10.567 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:10.567 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:10.567 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:10.567 16:03:16 bdev_raid.raid_rebuild_test_sb -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:10.567 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:10.567 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:10.567 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:10.826 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:10.826 "name": "raid_bdev1", 00:24:10.826 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:24:10.826 "strip_size_kb": 0, 00:24:10.826 "state": "online", 00:24:10.826 "raid_level": "raid1", 00:24:10.826 "superblock": true, 00:24:10.826 "num_base_bdevs": 4, 00:24:10.826 "num_base_bdevs_discovered": 2, 00:24:10.826 "num_base_bdevs_operational": 2, 00:24:10.826 "base_bdevs_list": [ 00:24:10.826 { 00:24:10.826 "name": null, 00:24:10.826 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:10.826 "is_configured": false, 00:24:10.826 "data_offset": 2048, 00:24:10.826 "data_size": 63488 00:24:10.826 }, 00:24:10.826 { 00:24:10.826 "name": null, 00:24:10.826 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:10.826 "is_configured": false, 00:24:10.826 "data_offset": 2048, 00:24:10.826 "data_size": 63488 00:24:10.826 }, 00:24:10.826 { 00:24:10.826 "name": "BaseBdev3", 00:24:10.826 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:24:10.826 "is_configured": true, 00:24:10.826 "data_offset": 2048, 00:24:10.826 "data_size": 63488 00:24:10.826 }, 00:24:10.826 { 00:24:10.826 "name": "BaseBdev4", 00:24:10.826 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:24:10.826 "is_configured": true, 00:24:10.826 "data_offset": 2048, 00:24:10.826 "data_size": 63488 00:24:10.826 } 00:24:10.826 ] 00:24:10.826 }' 00:24:10.826 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:24:10.826 16:03:16 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:11.393 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:11.393 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:11.393 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:11.393 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:11.393 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:11.393 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:11.393 16:03:16 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:11.652 16:03:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:11.652 "name": "raid_bdev1", 00:24:11.652 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:24:11.652 "strip_size_kb": 0, 00:24:11.652 "state": "online", 00:24:11.652 "raid_level": "raid1", 00:24:11.652 "superblock": true, 00:24:11.652 "num_base_bdevs": 4, 00:24:11.652 "num_base_bdevs_discovered": 2, 00:24:11.652 "num_base_bdevs_operational": 2, 00:24:11.652 "base_bdevs_list": [ 00:24:11.652 { 00:24:11.652 "name": null, 00:24:11.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:11.652 "is_configured": false, 00:24:11.652 "data_offset": 2048, 00:24:11.652 "data_size": 63488 00:24:11.652 }, 00:24:11.652 { 00:24:11.652 "name": null, 00:24:11.652 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:11.652 "is_configured": false, 00:24:11.652 "data_offset": 2048, 00:24:11.652 "data_size": 63488 00:24:11.652 }, 00:24:11.652 { 00:24:11.652 "name": "BaseBdev3", 00:24:11.652 "uuid": 
"e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:24:11.652 "is_configured": true, 00:24:11.652 "data_offset": 2048, 00:24:11.652 "data_size": 63488 00:24:11.653 }, 00:24:11.653 { 00:24:11.653 "name": "BaseBdev4", 00:24:11.653 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:24:11.653 "is_configured": true, 00:24:11.653 "data_offset": 2048, 00:24:11.653 "data_size": 63488 00:24:11.653 } 00:24:11.653 ] 00:24:11.653 }' 00:24:11.653 16:03:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:11.911 16:03:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:11.911 16:03:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:11.911 16:03:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:11.911 16:03:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:24:12.170 16:03:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:12.429 [2024-06-10 16:03:17.727640] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:12.429 [2024-06-10 16:03:17.727681] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:12.429 [2024-06-10 16:03:17.727699] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1e8d8b0 00:24:12.429 [2024-06-10 16:03:17.727709] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:12.429 [2024-06-10 16:03:17.728065] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:12.429 [2024-06-10 16:03:17.728083] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: 
created pt_bdev for: BaseBdev1 00:24:12.429 [2024-06-10 16:03:17.728145] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:24:12.429 [2024-06-10 16:03:17.728155] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:24:12.429 [2024-06-10 16:03:17.728163] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:12.429 BaseBdev1 00:24:12.429 16:03:17 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@773 -- # sleep 1 00:24:13.364 16:03:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:13.364 16:03:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:13.364 16:03:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:13.364 16:03:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:13.364 16:03:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:13.364 16:03:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:13.364 16:03:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:13.364 16:03:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:13.364 16:03:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:13.364 16:03:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:13.364 16:03:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:13.364 16:03:18 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | 
select(.name == "raid_bdev1")' 00:24:13.623 16:03:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:13.623 "name": "raid_bdev1", 00:24:13.623 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:24:13.623 "strip_size_kb": 0, 00:24:13.623 "state": "online", 00:24:13.623 "raid_level": "raid1", 00:24:13.623 "superblock": true, 00:24:13.623 "num_base_bdevs": 4, 00:24:13.623 "num_base_bdevs_discovered": 2, 00:24:13.623 "num_base_bdevs_operational": 2, 00:24:13.623 "base_bdevs_list": [ 00:24:13.623 { 00:24:13.623 "name": null, 00:24:13.623 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:13.623 "is_configured": false, 00:24:13.623 "data_offset": 2048, 00:24:13.623 "data_size": 63488 00:24:13.623 }, 00:24:13.623 { 00:24:13.623 "name": null, 00:24:13.623 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:13.623 "is_configured": false, 00:24:13.623 "data_offset": 2048, 00:24:13.623 "data_size": 63488 00:24:13.623 }, 00:24:13.623 { 00:24:13.623 "name": "BaseBdev3", 00:24:13.623 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:24:13.623 "is_configured": true, 00:24:13.623 "data_offset": 2048, 00:24:13.623 "data_size": 63488 00:24:13.623 }, 00:24:13.623 { 00:24:13.623 "name": "BaseBdev4", 00:24:13.623 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:24:13.623 "is_configured": true, 00:24:13.623 "data_offset": 2048, 00:24:13.623 "data_size": 63488 00:24:13.623 } 00:24:13.623 ] 00:24:13.623 }' 00:24:13.623 16:03:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:13.623 16:03:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:14.190 16:03:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:14.190 16:03:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:14.190 16:03:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 
00:24:14.190 16:03:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:14.190 16:03:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:14.190 16:03:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:14.190 16:03:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:14.449 16:03:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:14.449 "name": "raid_bdev1", 00:24:14.449 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:24:14.449 "strip_size_kb": 0, 00:24:14.449 "state": "online", 00:24:14.449 "raid_level": "raid1", 00:24:14.449 "superblock": true, 00:24:14.449 "num_base_bdevs": 4, 00:24:14.449 "num_base_bdevs_discovered": 2, 00:24:14.449 "num_base_bdevs_operational": 2, 00:24:14.449 "base_bdevs_list": [ 00:24:14.449 { 00:24:14.449 "name": null, 00:24:14.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:14.449 "is_configured": false, 00:24:14.449 "data_offset": 2048, 00:24:14.449 "data_size": 63488 00:24:14.449 }, 00:24:14.449 { 00:24:14.449 "name": null, 00:24:14.449 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:14.449 "is_configured": false, 00:24:14.449 "data_offset": 2048, 00:24:14.449 "data_size": 63488 00:24:14.449 }, 00:24:14.449 { 00:24:14.449 "name": "BaseBdev3", 00:24:14.449 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:24:14.449 "is_configured": true, 00:24:14.449 "data_offset": 2048, 00:24:14.449 "data_size": 63488 00:24:14.449 }, 00:24:14.449 { 00:24:14.449 "name": "BaseBdev4", 00:24:14.449 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:24:14.449 "is_configured": true, 00:24:14.449 "data_offset": 2048, 00:24:14.449 "data_size": 63488 00:24:14.449 } 00:24:14.449 ] 00:24:14.449 }' 00:24:14.449 16:03:19 
bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:14.449 16:03:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:14.449 16:03:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:14.449 16:03:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:14.449 16:03:19 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:14.449 16:03:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@649 -- # local es=0 00:24:14.449 16:03:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:14.449 16:03:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:14.449 16:03:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:14.449 16:03:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:14.449 16:03:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:14.449 16:03:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:24:14.449 16:03:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:24:14.449 16:03:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
00:24:14.449 16:03:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:24:14.449 16:03:19 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:24:14.708 [2024-06-10 16:03:20.150167] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:14.708 [2024-06-10 16:03:20.150284] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:24:14.708 [2024-06-10 16:03:20.150297] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:24:14.708 request: 00:24:14.708 { 00:24:14.708 "raid_bdev": "raid_bdev1", 00:24:14.708 "base_bdev": "BaseBdev1", 00:24:14.708 "method": "bdev_raid_add_base_bdev", 00:24:14.708 "req_id": 1 00:24:14.708 } 00:24:14.708 Got JSON-RPC error response 00:24:14.708 response: 00:24:14.708 { 00:24:14.708 "code": -22, 00:24:14.708 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:24:14.708 } 00:24:14.708 16:03:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@652 -- # es=1 00:24:14.708 16:03:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:24:14.708 16:03:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:24:14.708 16:03:20 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:24:14.708 16:03:20 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@777 -- # sleep 1 00:24:16.086 16:03:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:16.086 16:03:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:24:16.086 16:03:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:16.086 16:03:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:16.086 16:03:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:16.086 16:03:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:16.086 16:03:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:16.086 16:03:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:16.086 16:03:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:16.086 16:03:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:16.086 16:03:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.086 16:03:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:16.086 16:03:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:16.086 "name": "raid_bdev1", 00:24:16.086 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:24:16.086 "strip_size_kb": 0, 00:24:16.086 "state": "online", 00:24:16.086 "raid_level": "raid1", 00:24:16.086 "superblock": true, 00:24:16.086 "num_base_bdevs": 4, 00:24:16.086 "num_base_bdevs_discovered": 2, 00:24:16.086 "num_base_bdevs_operational": 2, 00:24:16.087 "base_bdevs_list": [ 00:24:16.087 { 00:24:16.087 "name": null, 00:24:16.087 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:16.087 "is_configured": false, 00:24:16.087 "data_offset": 2048, 00:24:16.087 "data_size": 63488 00:24:16.087 }, 00:24:16.087 { 00:24:16.087 "name": null, 00:24:16.087 "uuid": "00000000-0000-0000-0000-000000000000", 
00:24:16.087 "is_configured": false, 00:24:16.087 "data_offset": 2048, 00:24:16.087 "data_size": 63488 00:24:16.087 }, 00:24:16.087 { 00:24:16.087 "name": "BaseBdev3", 00:24:16.087 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:24:16.087 "is_configured": true, 00:24:16.087 "data_offset": 2048, 00:24:16.087 "data_size": 63488 00:24:16.087 }, 00:24:16.087 { 00:24:16.087 "name": "BaseBdev4", 00:24:16.087 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:24:16.087 "is_configured": true, 00:24:16.087 "data_offset": 2048, 00:24:16.087 "data_size": 63488 00:24:16.087 } 00:24:16.087 ] 00:24:16.087 }' 00:24:16.087 16:03:21 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:16.087 16:03:21 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:16.654 16:03:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:16.654 16:03:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:16.654 16:03:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:16.654 16:03:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:16.654 16:03:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:16.654 16:03:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:16.654 16:03:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:16.913 16:03:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:16.913 "name": "raid_bdev1", 00:24:16.913 "uuid": "22a74ba5-9601-470b-8547-ef6af98aa444", 00:24:16.913 "strip_size_kb": 0, 00:24:16.913 "state": "online", 00:24:16.913 "raid_level": "raid1", 00:24:16.913 
"superblock": true, 00:24:16.913 "num_base_bdevs": 4, 00:24:16.913 "num_base_bdevs_discovered": 2, 00:24:16.913 "num_base_bdevs_operational": 2, 00:24:16.913 "base_bdevs_list": [ 00:24:16.913 { 00:24:16.913 "name": null, 00:24:16.913 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:16.913 "is_configured": false, 00:24:16.913 "data_offset": 2048, 00:24:16.913 "data_size": 63488 00:24:16.913 }, 00:24:16.913 { 00:24:16.913 "name": null, 00:24:16.913 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:16.913 "is_configured": false, 00:24:16.913 "data_offset": 2048, 00:24:16.913 "data_size": 63488 00:24:16.913 }, 00:24:16.913 { 00:24:16.913 "name": "BaseBdev3", 00:24:16.913 "uuid": "e3b93a7d-0c59-508b-be06-1ebbb04031fc", 00:24:16.913 "is_configured": true, 00:24:16.913 "data_offset": 2048, 00:24:16.913 "data_size": 63488 00:24:16.913 }, 00:24:16.913 { 00:24:16.913 "name": "BaseBdev4", 00:24:16.913 "uuid": "e588267b-af71-5976-82fa-04c32c467d83", 00:24:16.913 "is_configured": true, 00:24:16.913 "data_offset": 2048, 00:24:16.913 "data_size": 63488 00:24:16.913 } 00:24:16.913 ] 00:24:16.913 }' 00:24:16.913 16:03:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:16.913 16:03:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:16.913 16:03:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:17.172 16:03:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:17.172 16:03:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@782 -- # killprocess 2785932 00:24:17.172 16:03:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@949 -- # '[' -z 2785932 ']' 00:24:17.172 16:03:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@953 -- # kill -0 2785932 00:24:17.172 16:03:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # uname 00:24:17.172 16:03:22 
bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:24:17.172 16:03:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2785932 00:24:17.172 16:03:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:24:17.172 16:03:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:24:17.172 16:03:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2785932' 00:24:17.172 killing process with pid 2785932 00:24:17.172 16:03:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@968 -- # kill 2785932 00:24:17.172 Received shutdown signal, test time was about 60.000000 seconds 00:24:17.172 00:24:17.172 Latency(us) 00:24:17.172 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:17.172 =================================================================================================================== 00:24:17.172 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:24:17.172 [2024-06-10 16:03:22.477906] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:17.172 [2024-06-10 16:03:22.478005] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:17.172 [2024-06-10 16:03:22.478063] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:17.172 [2024-06-10 16:03:22.478073] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1fcc8f0 name raid_bdev1, state offline 00:24:17.172 16:03:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@973 -- # wait 2785932 00:24:17.172 [2024-06-10 16:03:22.521032] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_sb -- bdev/bdev_raid.sh@784 -- # return 0 00:24:17.431 00:24:17.431 real 0m38.178s 
00:24:17.431 user 0m56.841s 00:24:17.431 sys 0m5.548s 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_sb -- common/autotest_common.sh@10 -- # set +x 00:24:17.431 ************************************ 00:24:17.431 END TEST raid_rebuild_test_sb 00:24:17.431 ************************************ 00:24:17.431 16:03:22 bdev_raid -- bdev/bdev_raid.sh@879 -- # run_test raid_rebuild_test_io raid_rebuild_test raid1 4 false true true 00:24:17.431 16:03:22 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:24:17.431 16:03:22 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:17.431 16:03:22 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:17.431 ************************************ 00:24:17.431 START TEST raid_rebuild_test_io 00:24:17.431 ************************************ 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 false true true 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@570 -- # local superblock=false 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:17.431 
16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:17.431 16:03:22 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@591 -- # '[' false = true ']' 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2793009 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2793009 /var/tmp/spdk-raid.sock 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@830 -- # '[' -z 2793009 ']' 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:17.431 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:17.431 16:03:22 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:17.431 [2024-06-10 16:03:22.858491] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:24:17.431 [2024-06-10 16:03:22.858546] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2793009 ] 00:24:17.431 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:17.431 Zero copy mechanism will not be used. 
00:24:17.689 [2024-06-10 16:03:22.957835] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:17.689 [2024-06-10 16:03:23.052873] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:17.689 [2024-06-10 16:03:23.117608] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:17.689 [2024-06-10 16:03:23.117636] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:18.626 16:03:23 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:18.626 16:03:23 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@863 -- # return 0 00:24:18.626 16:03:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:18.626 16:03:23 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:18.626 BaseBdev1_malloc 00:24:18.626 16:03:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:18.921 [2024-06-10 16:03:24.286887] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:18.921 [2024-06-10 16:03:24.286930] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:18.921 [2024-06-10 16:03:24.286949] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bc1e90 00:24:18.921 [2024-06-10 16:03:24.286966] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:18.921 [2024-06-10 16:03:24.288668] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:18.921 [2024-06-10 16:03:24.288695] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:18.921 BaseBdev1 
00:24:18.921 16:03:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:18.921 16:03:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:19.198 BaseBdev2_malloc 00:24:19.198 16:03:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:19.457 [2024-06-10 16:03:24.809340] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:19.457 [2024-06-10 16:03:24.809382] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:19.457 [2024-06-10 16:03:24.809403] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1bc29e0 00:24:19.457 [2024-06-10 16:03:24.809412] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:19.457 [2024-06-10 16:03:24.810973] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:19.457 [2024-06-10 16:03:24.810998] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:19.457 BaseBdev2 00:24:19.457 16:03:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:19.457 16:03:24 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:19.717 BaseBdev3_malloc 00:24:19.717 16:03:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:24:19.976 [2024-06-10 16:03:25.315335] vbdev_passthru.c: 
607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:24:19.976 [2024-06-10 16:03:25.315377] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:19.976 [2024-06-10 16:03:25.315394] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d6ee70 00:24:19.976 [2024-06-10 16:03:25.315403] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:19.976 [2024-06-10 16:03:25.316962] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:19.976 [2024-06-10 16:03:25.316988] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:19.976 BaseBdev3 00:24:19.976 16:03:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:19.976 16:03:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:20.235 BaseBdev4_malloc 00:24:20.235 16:03:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:24:20.494 [2024-06-10 16:03:25.829319] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:24:20.494 [2024-06-10 16:03:25.829359] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:20.494 [2024-06-10 16:03:25.829376] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d6d700 00:24:20.494 [2024-06-10 16:03:25.829385] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:20.494 [2024-06-10 16:03:25.830924] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:20.494 [2024-06-10 16:03:25.830952] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created 
pt_bdev for: BaseBdev4 00:24:20.494 BaseBdev4 00:24:20.494 16:03:25 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:20.752 spare_malloc 00:24:20.752 16:03:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:21.010 spare_delay 00:24:21.010 16:03:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:21.269 [2024-06-10 16:03:26.583792] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:21.269 [2024-06-10 16:03:26.583832] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:21.269 [2024-06-10 16:03:26.583850] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1d73710 00:24:21.269 [2024-06-10 16:03:26.583860] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:21.269 [2024-06-10 16:03:26.585461] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:21.269 [2024-06-10 16:03:26.585488] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:21.269 spare 00:24:21.269 16:03:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:24:21.528 [2024-06-10 16:03:26.836483] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:21.528 [2024-06-10 16:03:26.837813] bdev_raid.c:3198:raid_bdev_configure_base_bdev: 
*DEBUG*: bdev BaseBdev2 is claimed 00:24:21.528 [2024-06-10 16:03:26.837870] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:21.528 [2024-06-10 16:03:26.837917] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:21.528 [2024-06-10 16:03:26.838004] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1cf3260 00:24:21.528 [2024-06-10 16:03:26.838012] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 65536, blocklen 512 00:24:21.528 [2024-06-10 16:03:26.838228] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cf6610 00:24:21.528 [2024-06-10 16:03:26.838384] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1cf3260 00:24:21.528 [2024-06-10 16:03:26.838393] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1cf3260 00:24:21.528 [2024-06-10 16:03:26.838508] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:21.528 16:03:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:21.528 16:03:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:21.528 16:03:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:21.528 16:03:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:21.528 16:03:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:21.528 16:03:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:21.528 16:03:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:21.528 16:03:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:21.528 16:03:26 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:21.528 16:03:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:21.528 16:03:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:21.528 16:03:26 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:21.787 16:03:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:21.787 "name": "raid_bdev1", 00:24:21.787 "uuid": "59387a77-2c3b-44b3-a504-7cd8a8010e60", 00:24:21.787 "strip_size_kb": 0, 00:24:21.787 "state": "online", 00:24:21.787 "raid_level": "raid1", 00:24:21.787 "superblock": false, 00:24:21.787 "num_base_bdevs": 4, 00:24:21.787 "num_base_bdevs_discovered": 4, 00:24:21.787 "num_base_bdevs_operational": 4, 00:24:21.787 "base_bdevs_list": [ 00:24:21.787 { 00:24:21.787 "name": "BaseBdev1", 00:24:21.787 "uuid": "0b6bd429-1d11-55e2-bcf7-6f54f8de50a5", 00:24:21.787 "is_configured": true, 00:24:21.787 "data_offset": 0, 00:24:21.787 "data_size": 65536 00:24:21.787 }, 00:24:21.787 { 00:24:21.787 "name": "BaseBdev2", 00:24:21.787 "uuid": "c64a5552-6e55-5b4a-bf94-381973bb1475", 00:24:21.787 "is_configured": true, 00:24:21.787 "data_offset": 0, 00:24:21.787 "data_size": 65536 00:24:21.787 }, 00:24:21.787 { 00:24:21.787 "name": "BaseBdev3", 00:24:21.787 "uuid": "e99e2999-7b64-5e2a-87ba-4dad7bfbef43", 00:24:21.787 "is_configured": true, 00:24:21.787 "data_offset": 0, 00:24:21.787 "data_size": 65536 00:24:21.787 }, 00:24:21.787 { 00:24:21.787 "name": "BaseBdev4", 00:24:21.787 "uuid": "9b1a93fe-8fd4-567c-a57b-fcce2467872b", 00:24:21.787 "is_configured": true, 00:24:21.787 "data_offset": 0, 00:24:21.787 "data_size": 65536 00:24:21.787 } 00:24:21.787 ] 00:24:21.787 }' 00:24:21.787 16:03:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 
00:24:21.787 16:03:27 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:22.355 16:03:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:22.355 16:03:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:22.614 [2024-06-10 16:03:27.967800] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:22.614 16:03:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=65536 00:24:22.614 16:03:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:22.614 16:03:27 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:22.873 16:03:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@618 -- # data_offset=0 00:24:22.873 16:03:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:24:22.873 16:03:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:22.873 16:03:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:22.873 [2024-06-10 16:03:28.370585] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cf8c70 00:24:22.873 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:22.873 Zero copy mechanism will not be used. 00:24:22.873 Running I/O for 60 seconds... 
00:24:23.133 [2024-06-10 16:03:28.490633] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:23.133 [2024-06-10 16:03:28.499557] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1cf8c70 00:24:23.133 16:03:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:23.133 16:03:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:23.133 16:03:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:23.133 16:03:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:23.133 16:03:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:23.133 16:03:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:23.133 16:03:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:23.133 16:03:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:23.133 16:03:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:23.133 16:03:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:23.133 16:03:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:23.133 16:03:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:23.393 16:03:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:23.393 "name": "raid_bdev1", 00:24:23.393 "uuid": "59387a77-2c3b-44b3-a504-7cd8a8010e60", 00:24:23.393 "strip_size_kb": 0, 00:24:23.393 "state": "online", 00:24:23.393 "raid_level": "raid1", 00:24:23.393 "superblock": false, 
00:24:23.393 "num_base_bdevs": 4, 00:24:23.393 "num_base_bdevs_discovered": 3, 00:24:23.393 "num_base_bdevs_operational": 3, 00:24:23.393 "base_bdevs_list": [ 00:24:23.393 { 00:24:23.393 "name": null, 00:24:23.393 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:23.393 "is_configured": false, 00:24:23.393 "data_offset": 0, 00:24:23.393 "data_size": 65536 00:24:23.393 }, 00:24:23.393 { 00:24:23.393 "name": "BaseBdev2", 00:24:23.393 "uuid": "c64a5552-6e55-5b4a-bf94-381973bb1475", 00:24:23.393 "is_configured": true, 00:24:23.393 "data_offset": 0, 00:24:23.393 "data_size": 65536 00:24:23.393 }, 00:24:23.393 { 00:24:23.393 "name": "BaseBdev3", 00:24:23.393 "uuid": "e99e2999-7b64-5e2a-87ba-4dad7bfbef43", 00:24:23.393 "is_configured": true, 00:24:23.393 "data_offset": 0, 00:24:23.393 "data_size": 65536 00:24:23.393 }, 00:24:23.393 { 00:24:23.393 "name": "BaseBdev4", 00:24:23.393 "uuid": "9b1a93fe-8fd4-567c-a57b-fcce2467872b", 00:24:23.393 "is_configured": true, 00:24:23.393 "data_offset": 0, 00:24:23.393 "data_size": 65536 00:24:23.393 } 00:24:23.393 ] 00:24:23.393 }' 00:24:23.393 16:03:28 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:23.393 16:03:28 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:23.961 16:03:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:24.220 [2024-06-10 16:03:29.617095] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:24.220 16:03:29 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:24.220 [2024-06-10 16:03:29.695034] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1cf6730 00:24:24.220 [2024-06-10 16:03:29.697425] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:24.479 [2024-06-10 
16:03:29.809847] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:24.479 [2024-06-10 16:03:29.811013] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:24.738 [2024-06-10 16:03:30.046528] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:24.738 [2024-06-10 16:03:30.047126] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:24.997 [2024-06-10 16:03:30.395308] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:24.997 [2024-06-10 16:03:30.498170] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:25.256 16:03:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:25.256 16:03:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:25.256 16:03:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:25.256 16:03:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:25.256 16:03:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:25.256 16:03:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:25.256 16:03:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:25.515 [2024-06-10 16:03:30.833736] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:25.515 
[2024-06-10 16:03:30.834054] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:25.516 16:03:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:25.516 "name": "raid_bdev1", 00:24:25.516 "uuid": "59387a77-2c3b-44b3-a504-7cd8a8010e60", 00:24:25.516 "strip_size_kb": 0, 00:24:25.516 "state": "online", 00:24:25.516 "raid_level": "raid1", 00:24:25.516 "superblock": false, 00:24:25.516 "num_base_bdevs": 4, 00:24:25.516 "num_base_bdevs_discovered": 4, 00:24:25.516 "num_base_bdevs_operational": 4, 00:24:25.516 "process": { 00:24:25.516 "type": "rebuild", 00:24:25.516 "target": "spare", 00:24:25.516 "progress": { 00:24:25.516 "blocks": 14336, 00:24:25.516 "percent": 21 00:24:25.516 } 00:24:25.516 }, 00:24:25.516 "base_bdevs_list": [ 00:24:25.516 { 00:24:25.516 "name": "spare", 00:24:25.516 "uuid": "44a0369d-bde5-5de4-bab7-a437bf5df1bd", 00:24:25.516 "is_configured": true, 00:24:25.516 "data_offset": 0, 00:24:25.516 "data_size": 65536 00:24:25.516 }, 00:24:25.516 { 00:24:25.516 "name": "BaseBdev2", 00:24:25.516 "uuid": "c64a5552-6e55-5b4a-bf94-381973bb1475", 00:24:25.516 "is_configured": true, 00:24:25.516 "data_offset": 0, 00:24:25.516 "data_size": 65536 00:24:25.516 }, 00:24:25.516 { 00:24:25.516 "name": "BaseBdev3", 00:24:25.516 "uuid": "e99e2999-7b64-5e2a-87ba-4dad7bfbef43", 00:24:25.516 "is_configured": true, 00:24:25.516 "data_offset": 0, 00:24:25.516 "data_size": 65536 00:24:25.516 }, 00:24:25.516 { 00:24:25.516 "name": "BaseBdev4", 00:24:25.516 "uuid": "9b1a93fe-8fd4-567c-a57b-fcce2467872b", 00:24:25.516 "is_configured": true, 00:24:25.516 "data_offset": 0, 00:24:25.516 "data_size": 65536 00:24:25.516 } 00:24:25.516 ] 00:24:25.516 }' 00:24:25.516 16:03:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:25.516 16:03:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d 
]] 00:24:25.516 16:03:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:25.516 [2024-06-10 16:03:30.947363] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:25.516 16:03:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:25.516 16:03:30 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:25.775 [2024-06-10 16:03:31.224063] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:26.035 [2024-06-10 16:03:31.423346] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:26.035 [2024-06-10 16:03:31.434996] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:26.035 [2024-06-10 16:03:31.435025] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:26.035 [2024-06-10 16:03:31.435033] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:26.035 [2024-06-10 16:03:31.469160] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1cf8c70 00:24:26.035 16:03:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:26.035 16:03:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:26.035 16:03:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:26.035 16:03:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:26.035 16:03:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:26.035 16:03:31 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:26.035 16:03:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:26.035 16:03:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:26.035 16:03:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:26.035 16:03:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:26.035 16:03:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:26.035 16:03:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:26.294 16:03:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:26.294 "name": "raid_bdev1", 00:24:26.294 "uuid": "59387a77-2c3b-44b3-a504-7cd8a8010e60", 00:24:26.294 "strip_size_kb": 0, 00:24:26.294 "state": "online", 00:24:26.294 "raid_level": "raid1", 00:24:26.294 "superblock": false, 00:24:26.294 "num_base_bdevs": 4, 00:24:26.294 "num_base_bdevs_discovered": 3, 00:24:26.294 "num_base_bdevs_operational": 3, 00:24:26.294 "base_bdevs_list": [ 00:24:26.294 { 00:24:26.294 "name": null, 00:24:26.294 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:26.294 "is_configured": false, 00:24:26.294 "data_offset": 0, 00:24:26.294 "data_size": 65536 00:24:26.294 }, 00:24:26.294 { 00:24:26.294 "name": "BaseBdev2", 00:24:26.294 "uuid": "c64a5552-6e55-5b4a-bf94-381973bb1475", 00:24:26.294 "is_configured": true, 00:24:26.294 "data_offset": 0, 00:24:26.294 "data_size": 65536 00:24:26.294 }, 00:24:26.294 { 00:24:26.294 "name": "BaseBdev3", 00:24:26.294 "uuid": "e99e2999-7b64-5e2a-87ba-4dad7bfbef43", 00:24:26.294 "is_configured": true, 00:24:26.294 "data_offset": 0, 00:24:26.294 "data_size": 65536 00:24:26.294 }, 00:24:26.294 { 00:24:26.294 "name": 
"BaseBdev4", 00:24:26.294 "uuid": "9b1a93fe-8fd4-567c-a57b-fcce2467872b", 00:24:26.294 "is_configured": true, 00:24:26.294 "data_offset": 0, 00:24:26.294 "data_size": 65536 00:24:26.294 } 00:24:26.295 ] 00:24:26.295 }' 00:24:26.295 16:03:31 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:26.295 16:03:31 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:27.228 16:03:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:27.228 16:03:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:27.228 16:03:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:27.228 16:03:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:27.228 16:03:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:27.228 16:03:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:27.229 16:03:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:27.229 16:03:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:27.229 "name": "raid_bdev1", 00:24:27.229 "uuid": "59387a77-2c3b-44b3-a504-7cd8a8010e60", 00:24:27.229 "strip_size_kb": 0, 00:24:27.229 "state": "online", 00:24:27.229 "raid_level": "raid1", 00:24:27.229 "superblock": false, 00:24:27.229 "num_base_bdevs": 4, 00:24:27.229 "num_base_bdevs_discovered": 3, 00:24:27.229 "num_base_bdevs_operational": 3, 00:24:27.229 "base_bdevs_list": [ 00:24:27.229 { 00:24:27.229 "name": null, 00:24:27.229 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:27.229 "is_configured": false, 00:24:27.229 "data_offset": 0, 00:24:27.229 "data_size": 65536 
00:24:27.229 }, 00:24:27.229 { 00:24:27.229 "name": "BaseBdev2", 00:24:27.229 "uuid": "c64a5552-6e55-5b4a-bf94-381973bb1475", 00:24:27.229 "is_configured": true, 00:24:27.229 "data_offset": 0, 00:24:27.229 "data_size": 65536 00:24:27.229 }, 00:24:27.229 { 00:24:27.229 "name": "BaseBdev3", 00:24:27.229 "uuid": "e99e2999-7b64-5e2a-87ba-4dad7bfbef43", 00:24:27.229 "is_configured": true, 00:24:27.229 "data_offset": 0, 00:24:27.229 "data_size": 65536 00:24:27.229 }, 00:24:27.229 { 00:24:27.229 "name": "BaseBdev4", 00:24:27.229 "uuid": "9b1a93fe-8fd4-567c-a57b-fcce2467872b", 00:24:27.229 "is_configured": true, 00:24:27.229 "data_offset": 0, 00:24:27.229 "data_size": 65536 00:24:27.229 } 00:24:27.229 ] 00:24:27.229 }' 00:24:27.229 16:03:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:27.229 16:03:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:27.229 16:03:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:27.229 16:03:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:27.229 16:03:32 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:27.487 [2024-06-10 16:03:32.967335] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:27.745 16:03:33 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:27.745 [2024-06-10 16:03:33.056226] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1d6aaf0 00:24:27.745 [2024-06-10 16:03:33.057783] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:27.745 [2024-06-10 16:03:33.168395] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 
offset_end: 6144 00:24:27.745 [2024-06-10 16:03:33.169683] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:28.004 [2024-06-10 16:03:33.398419] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:28.004 [2024-06-10 16:03:33.398696] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:28.262 [2024-06-10 16:03:33.727788] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:28.262 [2024-06-10 16:03:33.728102] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:28.521 [2024-06-10 16:03:33.841001] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:28.779 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:28.779 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:28.779 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:28.779 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:28.779 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:28.779 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:28.779 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:28.780 [2024-06-10 16:03:34.100427] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 
offset_begin: 12288 offset_end: 18432 00:24:28.780 [2024-06-10 16:03:34.100742] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 14336 offset_begin: 12288 offset_end: 18432 00:24:29.039 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:29.039 "name": "raid_bdev1", 00:24:29.039 "uuid": "59387a77-2c3b-44b3-a504-7cd8a8010e60", 00:24:29.039 "strip_size_kb": 0, 00:24:29.039 "state": "online", 00:24:29.039 "raid_level": "raid1", 00:24:29.039 "superblock": false, 00:24:29.039 "num_base_bdevs": 4, 00:24:29.039 "num_base_bdevs_discovered": 4, 00:24:29.039 "num_base_bdevs_operational": 4, 00:24:29.039 "process": { 00:24:29.039 "type": "rebuild", 00:24:29.039 "target": "spare", 00:24:29.039 "progress": { 00:24:29.039 "blocks": 14336, 00:24:29.039 "percent": 21 00:24:29.039 } 00:24:29.039 }, 00:24:29.039 "base_bdevs_list": [ 00:24:29.039 { 00:24:29.039 "name": "spare", 00:24:29.039 "uuid": "44a0369d-bde5-5de4-bab7-a437bf5df1bd", 00:24:29.039 "is_configured": true, 00:24:29.039 "data_offset": 0, 00:24:29.039 "data_size": 65536 00:24:29.039 }, 00:24:29.039 { 00:24:29.039 "name": "BaseBdev2", 00:24:29.039 "uuid": "c64a5552-6e55-5b4a-bf94-381973bb1475", 00:24:29.039 "is_configured": true, 00:24:29.039 "data_offset": 0, 00:24:29.039 "data_size": 65536 00:24:29.039 }, 00:24:29.039 { 00:24:29.039 "name": "BaseBdev3", 00:24:29.039 "uuid": "e99e2999-7b64-5e2a-87ba-4dad7bfbef43", 00:24:29.039 "is_configured": true, 00:24:29.039 "data_offset": 0, 00:24:29.039 "data_size": 65536 00:24:29.039 }, 00:24:29.039 { 00:24:29.039 "name": "BaseBdev4", 00:24:29.039 "uuid": "9b1a93fe-8fd4-567c-a57b-fcce2467872b", 00:24:29.039 "is_configured": true, 00:24:29.039 "data_offset": 0, 00:24:29.039 "data_size": 65536 00:24:29.039 } 00:24:29.039 ] 00:24:29.039 }' 00:24:29.039 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:29.039 [2024-06-10 16:03:34.354354] bdev_raid.c: 
839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:29.039 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:29.039 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:29.039 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:29.039 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@665 -- # '[' false = true ']' 00:24:29.039 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:24:29.039 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:29.039 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:24:29.039 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@694 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:24:29.298 [2024-06-10 16:03:34.670279] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:29.298 [2024-06-10 16:03:34.761602] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:29.298 [2024-06-10 16:03:34.761782] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:29.557 [2024-06-10 16:03:34.864369] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1cf8c70 00:24:29.557 [2024-06-10 16:03:34.864398] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1d6aaf0 00:24:29.557 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:24:29.557 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 
00:24:29.557 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:29.557 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:29.557 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:29.557 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:29.557 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:29.557 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:29.557 16:03:34 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:29.815 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:29.815 "name": "raid_bdev1", 00:24:29.815 "uuid": "59387a77-2c3b-44b3-a504-7cd8a8010e60", 00:24:29.815 "strip_size_kb": 0, 00:24:29.815 "state": "online", 00:24:29.815 "raid_level": "raid1", 00:24:29.815 "superblock": false, 00:24:29.815 "num_base_bdevs": 4, 00:24:29.815 "num_base_bdevs_discovered": 3, 00:24:29.815 "num_base_bdevs_operational": 3, 00:24:29.815 "process": { 00:24:29.815 "type": "rebuild", 00:24:29.815 "target": "spare", 00:24:29.815 "progress": { 00:24:29.815 "blocks": 26624, 00:24:29.815 "percent": 40 00:24:29.815 } 00:24:29.815 }, 00:24:29.815 "base_bdevs_list": [ 00:24:29.815 { 00:24:29.815 "name": "spare", 00:24:29.815 "uuid": "44a0369d-bde5-5de4-bab7-a437bf5df1bd", 00:24:29.816 "is_configured": true, 00:24:29.816 "data_offset": 0, 00:24:29.816 "data_size": 65536 00:24:29.816 }, 00:24:29.816 { 00:24:29.816 "name": null, 00:24:29.816 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:29.816 "is_configured": false, 00:24:29.816 "data_offset": 0, 00:24:29.816 
"data_size": 65536 00:24:29.816 }, 00:24:29.816 { 00:24:29.816 "name": "BaseBdev3", 00:24:29.816 "uuid": "e99e2999-7b64-5e2a-87ba-4dad7bfbef43", 00:24:29.816 "is_configured": true, 00:24:29.816 "data_offset": 0, 00:24:29.816 "data_size": 65536 00:24:29.816 }, 00:24:29.816 { 00:24:29.816 "name": "BaseBdev4", 00:24:29.816 "uuid": "9b1a93fe-8fd4-567c-a57b-fcce2467872b", 00:24:29.816 "is_configured": true, 00:24:29.816 "data_offset": 0, 00:24:29.816 "data_size": 65536 00:24:29.816 } 00:24:29.816 ] 00:24:29.816 }' 00:24:29.816 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:29.816 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:29.816 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:29.816 [2024-06-10 16:03:35.222676] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:24:29.816 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:29.816 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@705 -- # local timeout=937 00:24:29.816 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:29.816 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:29.816 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:29.816 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:29.816 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:29.816 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:29.816 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r 
'.[] | select(.name == "raid_bdev1")' 00:24:29.816 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:30.074 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:30.074 "name": "raid_bdev1", 00:24:30.074 "uuid": "59387a77-2c3b-44b3-a504-7cd8a8010e60", 00:24:30.074 "strip_size_kb": 0, 00:24:30.074 "state": "online", 00:24:30.074 "raid_level": "raid1", 00:24:30.074 "superblock": false, 00:24:30.074 "num_base_bdevs": 4, 00:24:30.074 "num_base_bdevs_discovered": 3, 00:24:30.074 "num_base_bdevs_operational": 3, 00:24:30.074 "process": { 00:24:30.074 "type": "rebuild", 00:24:30.074 "target": "spare", 00:24:30.074 "progress": { 00:24:30.074 "blocks": 30720, 00:24:30.074 "percent": 46 00:24:30.074 } 00:24:30.074 }, 00:24:30.074 "base_bdevs_list": [ 00:24:30.074 { 00:24:30.074 "name": "spare", 00:24:30.074 "uuid": "44a0369d-bde5-5de4-bab7-a437bf5df1bd", 00:24:30.074 "is_configured": true, 00:24:30.074 "data_offset": 0, 00:24:30.074 "data_size": 65536 00:24:30.074 }, 00:24:30.074 { 00:24:30.074 "name": null, 00:24:30.074 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:30.074 "is_configured": false, 00:24:30.074 "data_offset": 0, 00:24:30.074 "data_size": 65536 00:24:30.074 }, 00:24:30.074 { 00:24:30.074 "name": "BaseBdev3", 00:24:30.074 "uuid": "e99e2999-7b64-5e2a-87ba-4dad7bfbef43", 00:24:30.074 "is_configured": true, 00:24:30.074 "data_offset": 0, 00:24:30.074 "data_size": 65536 00:24:30.074 }, 00:24:30.074 { 00:24:30.074 "name": "BaseBdev4", 00:24:30.074 "uuid": "9b1a93fe-8fd4-567c-a57b-fcce2467872b", 00:24:30.074 "is_configured": true, 00:24:30.074 "data_offset": 0, 00:24:30.074 "data_size": 65536 00:24:30.074 } 00:24:30.074 ] 00:24:30.074 }' 00:24:30.074 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:30.074 16:03:35 
bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:30.333 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:30.333 [2024-06-10 16:03:35.628892] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 32768 offset_begin: 30720 offset_end: 36864 00:24:30.333 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:30.333 16:03:35 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:30.591 [2024-06-10 16:03:36.093379] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 40960 offset_begin: 36864 offset_end: 43008 00:24:30.849 [2024-06-10 16:03:36.316389] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:24:31.108 [2024-06-10 16:03:36.540085] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:24:31.366 16:03:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:31.366 16:03:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:31.366 16:03:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:31.366 16:03:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:31.366 16:03:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:31.366 16:03:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:31.366 16:03:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:31.366 16:03:36 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:31.625 16:03:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:31.625 "name": "raid_bdev1", 00:24:31.625 "uuid": "59387a77-2c3b-44b3-a504-7cd8a8010e60", 00:24:31.625 "strip_size_kb": 0, 00:24:31.625 "state": "online", 00:24:31.625 "raid_level": "raid1", 00:24:31.625 "superblock": false, 00:24:31.625 "num_base_bdevs": 4, 00:24:31.625 "num_base_bdevs_discovered": 3, 00:24:31.625 "num_base_bdevs_operational": 3, 00:24:31.625 "process": { 00:24:31.625 "type": "rebuild", 00:24:31.625 "target": "spare", 00:24:31.625 "progress": { 00:24:31.625 "blocks": 51200, 00:24:31.625 "percent": 78 00:24:31.625 } 00:24:31.625 }, 00:24:31.625 "base_bdevs_list": [ 00:24:31.625 { 00:24:31.625 "name": "spare", 00:24:31.625 "uuid": "44a0369d-bde5-5de4-bab7-a437bf5df1bd", 00:24:31.625 "is_configured": true, 00:24:31.625 "data_offset": 0, 00:24:31.625 "data_size": 65536 00:24:31.625 }, 00:24:31.625 { 00:24:31.625 "name": null, 00:24:31.625 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:31.625 "is_configured": false, 00:24:31.625 "data_offset": 0, 00:24:31.625 "data_size": 65536 00:24:31.625 }, 00:24:31.625 { 00:24:31.625 "name": "BaseBdev3", 00:24:31.625 "uuid": "e99e2999-7b64-5e2a-87ba-4dad7bfbef43", 00:24:31.625 "is_configured": true, 00:24:31.625 "data_offset": 0, 00:24:31.625 "data_size": 65536 00:24:31.625 }, 00:24:31.625 { 00:24:31.625 "name": "BaseBdev4", 00:24:31.625 "uuid": "9b1a93fe-8fd4-567c-a57b-fcce2467872b", 00:24:31.625 "is_configured": true, 00:24:31.625 "data_offset": 0, 00:24:31.625 "data_size": 65536 00:24:31.625 } 00:24:31.625 ] 00:24:31.625 }' 00:24:31.625 16:03:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:31.625 16:03:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:31.625 16:03:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # 
jq -r '.process.target // "none"' 00:24:31.625 16:03:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:31.625 16:03:36 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:31.625 [2024-06-10 16:03:37.105536] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 57344 offset_begin: 55296 offset_end: 61440 00:24:32.192 [2024-06-10 16:03:37.612268] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:32.192 [2024-06-10 16:03:37.663769] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:32.192 [2024-06-10 16:03:37.664950] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:32.758 16:03:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:32.758 16:03:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:32.758 16:03:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:32.758 16:03:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:32.758 16:03:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:32.758 16:03:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:32.758 16:03:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:32.758 16:03:37 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.027 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:33.027 "name": "raid_bdev1", 00:24:33.027 "uuid": "59387a77-2c3b-44b3-a504-7cd8a8010e60", 00:24:33.027 
"strip_size_kb": 0, 00:24:33.027 "state": "online", 00:24:33.027 "raid_level": "raid1", 00:24:33.027 "superblock": false, 00:24:33.027 "num_base_bdevs": 4, 00:24:33.027 "num_base_bdevs_discovered": 3, 00:24:33.027 "num_base_bdevs_operational": 3, 00:24:33.027 "base_bdevs_list": [ 00:24:33.027 { 00:24:33.027 "name": "spare", 00:24:33.027 "uuid": "44a0369d-bde5-5de4-bab7-a437bf5df1bd", 00:24:33.027 "is_configured": true, 00:24:33.027 "data_offset": 0, 00:24:33.027 "data_size": 65536 00:24:33.027 }, 00:24:33.027 { 00:24:33.027 "name": null, 00:24:33.027 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:33.027 "is_configured": false, 00:24:33.027 "data_offset": 0, 00:24:33.027 "data_size": 65536 00:24:33.027 }, 00:24:33.027 { 00:24:33.027 "name": "BaseBdev3", 00:24:33.027 "uuid": "e99e2999-7b64-5e2a-87ba-4dad7bfbef43", 00:24:33.027 "is_configured": true, 00:24:33.027 "data_offset": 0, 00:24:33.027 "data_size": 65536 00:24:33.027 }, 00:24:33.027 { 00:24:33.027 "name": "BaseBdev4", 00:24:33.027 "uuid": "9b1a93fe-8fd4-567c-a57b-fcce2467872b", 00:24:33.027 "is_configured": true, 00:24:33.027 "data_offset": 0, 00:24:33.027 "data_size": 65536 00:24:33.027 } 00:24:33.027 ] 00:24:33.027 }' 00:24:33.027 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:33.027 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:33.027 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:33.027 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:33.027 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@708 -- # break 00:24:33.027 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:33.027 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:33.027 
16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:33.027 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:33.027 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:33.027 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.027 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.340 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:33.340 "name": "raid_bdev1", 00:24:33.340 "uuid": "59387a77-2c3b-44b3-a504-7cd8a8010e60", 00:24:33.340 "strip_size_kb": 0, 00:24:33.340 "state": "online", 00:24:33.340 "raid_level": "raid1", 00:24:33.340 "superblock": false, 00:24:33.340 "num_base_bdevs": 4, 00:24:33.340 "num_base_bdevs_discovered": 3, 00:24:33.340 "num_base_bdevs_operational": 3, 00:24:33.340 "base_bdevs_list": [ 00:24:33.340 { 00:24:33.340 "name": "spare", 00:24:33.340 "uuid": "44a0369d-bde5-5de4-bab7-a437bf5df1bd", 00:24:33.340 "is_configured": true, 00:24:33.340 "data_offset": 0, 00:24:33.340 "data_size": 65536 00:24:33.340 }, 00:24:33.340 { 00:24:33.340 "name": null, 00:24:33.340 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:33.340 "is_configured": false, 00:24:33.340 "data_offset": 0, 00:24:33.340 "data_size": 65536 00:24:33.340 }, 00:24:33.340 { 00:24:33.340 "name": "BaseBdev3", 00:24:33.340 "uuid": "e99e2999-7b64-5e2a-87ba-4dad7bfbef43", 00:24:33.340 "is_configured": true, 00:24:33.340 "data_offset": 0, 00:24:33.340 "data_size": 65536 00:24:33.340 }, 00:24:33.340 { 00:24:33.340 "name": "BaseBdev4", 00:24:33.340 "uuid": "9b1a93fe-8fd4-567c-a57b-fcce2467872b", 00:24:33.340 "is_configured": true, 00:24:33.340 "data_offset": 0, 00:24:33.340 "data_size": 65536 
00:24:33.340 } 00:24:33.340 ] 00:24:33.340 }' 00:24:33.340 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:33.340 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:33.340 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:33.340 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:33.340 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:33.340 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:33.340 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:33.340 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:33.340 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:33.340 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:33.340 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:33.340 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:33.340 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:33.340 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:33.340 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:33.340 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:33.599 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:24:33.599 "name": "raid_bdev1", 00:24:33.599 "uuid": "59387a77-2c3b-44b3-a504-7cd8a8010e60", 00:24:33.599 "strip_size_kb": 0, 00:24:33.599 "state": "online", 00:24:33.599 "raid_level": "raid1", 00:24:33.599 "superblock": false, 00:24:33.599 "num_base_bdevs": 4, 00:24:33.599 "num_base_bdevs_discovered": 3, 00:24:33.599 "num_base_bdevs_operational": 3, 00:24:33.599 "base_bdevs_list": [ 00:24:33.599 { 00:24:33.599 "name": "spare", 00:24:33.599 "uuid": "44a0369d-bde5-5de4-bab7-a437bf5df1bd", 00:24:33.599 "is_configured": true, 00:24:33.599 "data_offset": 0, 00:24:33.599 "data_size": 65536 00:24:33.599 }, 00:24:33.599 { 00:24:33.599 "name": null, 00:24:33.599 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:33.599 "is_configured": false, 00:24:33.599 "data_offset": 0, 00:24:33.599 "data_size": 65536 00:24:33.599 }, 00:24:33.599 { 00:24:33.599 "name": "BaseBdev3", 00:24:33.599 "uuid": "e99e2999-7b64-5e2a-87ba-4dad7bfbef43", 00:24:33.599 "is_configured": true, 00:24:33.599 "data_offset": 0, 00:24:33.599 "data_size": 65536 00:24:33.599 }, 00:24:33.599 { 00:24:33.599 "name": "BaseBdev4", 00:24:33.599 "uuid": "9b1a93fe-8fd4-567c-a57b-fcce2467872b", 00:24:33.599 "is_configured": true, 00:24:33.599 "data_offset": 0, 00:24:33.599 "data_size": 65536 00:24:33.599 } 00:24:33.599 ] 00:24:33.599 }' 00:24:33.599 16:03:38 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:33.599 16:03:38 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:34.166 16:03:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:34.424 [2024-06-10 16:03:39.754592] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:34.424 [2024-06-10 16:03:39.754623] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:34.424 
00:24:34.424 Latency(us) 00:24:34.424 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:34.424 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:24:34.424 raid_bdev1 : 11.39 94.56 283.67 0.00 0.00 14079.95 300.37 122833.19 00:24:34.424 =================================================================================================================== 00:24:34.424 Total : 94.56 283.67 0.00 0.00 14079.95 300.37 122833.19 00:24:34.424 [2024-06-10 16:03:39.794853] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:34.424 [2024-06-10 16:03:39.794881] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:34.424 [2024-06-10 16:03:39.794990] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:34.424 [2024-06-10 16:03:39.795000] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1cf3260 name raid_bdev1, state offline 00:24:34.424 0 00:24:34.424 16:03:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:34.424 16:03:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # jq length 00:24:34.683 16:03:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:34.683 16:03:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:34.683 16:03:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:24:34.683 16:03:39 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:24:34.683 16:03:39 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:34.683 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 
00:24:34.683 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:34.683 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:34.683 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:34.683 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:24:34.683 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:34.683 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:34.683 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:24:34.942 /dev/nbd0 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:34.942 1+0 records in 00:24:34.942 1+0 records out 00:24:34.942 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000238999 s, 17.1 MB/s 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@727 -- # continue 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev3') 00:24:34.942 16:03:40 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:34.942 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:24:35.201 /dev/nbd1 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:35.201 1+0 records in 00:24:35.201 1+0 records out 00:24:35.201 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000241271 s, 17.0 MB/s 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:35.201 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@12 -- # local i 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- 
bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:35.460 16:03:40 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:24:35.719 /dev/nbd1 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@868 -- # local i 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@872 -- # break 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:35.719 1+0 records in 00:24:35.719 1+0 records out 00:24:35.719 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000235013 s, 17.4 MB/s 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@885 -- # size=4096 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- 
common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@888 -- # return 0 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@730 -- # cmp -i 0 /dev/nbd0 /dev/nbd1 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:35.719 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:35.978 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:35.978 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:35.978 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:35.978 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:35.978 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:35.978 16:03:41 
bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:35.978 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:24:35.978 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:35.978 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:35.978 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:35.978 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:35.978 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:35.978 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@51 -- # local i 00:24:35.978 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:35.978 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:36.237 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:36.237 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:36.237 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:36.237 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:36.237 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:36.237 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:24:36.237 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@41 -- # break 00:24:36.237 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:36.237 16:03:41 bdev_raid.raid_rebuild_test_io -- 
bdev/bdev_raid.sh@742 -- # '[' false = true ']' 00:24:36.237 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@782 -- # killprocess 2793009 00:24:36.237 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@949 -- # '[' -z 2793009 ']' 00:24:36.237 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@953 -- # kill -0 2793009 00:24:36.237 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # uname 00:24:36.237 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:24:36.237 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2793009 00:24:36.237 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:24:36.237 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:24:36.238 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2793009' 00:24:36.238 killing process with pid 2793009 00:24:36.238 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@968 -- # kill 2793009 00:24:36.238 Received shutdown signal, test time was about 13.278918 seconds 00:24:36.238 00:24:36.238 Latency(us) 00:24:36.238 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:24:36.238 =================================================================================================================== 00:24:36.238 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:24:36.238 [2024-06-10 16:03:41.684227] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:24:36.238 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@973 -- # wait 2793009 00:24:36.238 [2024-06-10 16:03:41.721197] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_io -- bdev/bdev_raid.sh@784 -- # 
return 0 00:24:36.497 00:24:36.497 real 0m19.135s 00:24:36.497 user 0m30.373s 00:24:36.497 sys 0m2.676s 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_io -- common/autotest_common.sh@10 -- # set +x 00:24:36.497 ************************************ 00:24:36.497 END TEST raid_rebuild_test_io 00:24:36.497 ************************************ 00:24:36.497 16:03:41 bdev_raid -- bdev/bdev_raid.sh@880 -- # run_test raid_rebuild_test_sb_io raid_rebuild_test raid1 4 true true true 00:24:36.497 16:03:41 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:24:36.497 16:03:41 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:24:36.497 16:03:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:24:36.497 ************************************ 00:24:36.497 START TEST raid_rebuild_test_sb_io 00:24:36.497 ************************************ 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 4 true true true 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=4 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@571 -- # local background_io=true 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@572 -- # local verify=true 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:24:36.497 16:03:41 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev3 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # echo BaseBdev4 00:24:36.497 16:03:41 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:24:36.497 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:24:36.497 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2' 'BaseBdev3' 'BaseBdev4') 00:24:36.497 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:24:36.497 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:24:36.497 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@575 -- # local strip_size 00:24:36.497 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@576 -- # local create_arg 00:24:36.497 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:24:36.497 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@578 -- # local data_offset 00:24:36.497 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' 
raid1 ']' 00:24:36.497 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:24:36.497 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:24:36.497 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:24:36.497 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@596 -- # raid_pid=2796429 00:24:36.497 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@597 -- # waitforlisten 2796429 /var/tmp/spdk-raid.sock 00:24:36.497 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:24:36.756 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@830 -- # '[' -z 2796429 ']' 00:24:36.756 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:24:36.756 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@835 -- # local max_retries=100 00:24:36.756 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:24:36.756 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:24:36.756 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@839 -- # xtrace_disable 00:24:36.756 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:36.756 [2024-06-10 16:03:42.061386] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:24:36.757 [2024-06-10 16:03:42.061440] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2796429 ] 00:24:36.757 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:36.757 Zero copy mechanism will not be used. 00:24:36.757 [2024-06-10 16:03:42.155123] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:24:36.757 [2024-06-10 16:03:42.249234] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:24:37.016 [2024-06-10 16:03:42.315754] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:37.016 [2024-06-10 16:03:42.315787] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:24:37.584 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:24:37.584 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@863 -- # return 0 00:24:37.584 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:37.584 16:03:42 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev1_malloc 00:24:37.843 BaseBdev1_malloc 00:24:37.843 16:03:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:24:37.843 [2024-06-10 16:03:43.272433] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:24:37.843 [2024-06-10 16:03:43.272480] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:37.843 [2024-06-10 16:03:43.272498] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: 
io_device created at: 0x0x1271e90 00:24:37.843 [2024-06-10 16:03:43.272507] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:37.843 [2024-06-10 16:03:43.274132] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:37.843 [2024-06-10 16:03:43.274162] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:24:37.843 BaseBdev1 00:24:37.844 16:03:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:37.844 16:03:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev2_malloc 00:24:38.102 BaseBdev2_malloc 00:24:38.102 16:03:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:24:38.102 [2024-06-10 16:03:43.609960] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:24:38.102 [2024-06-10 16:03:43.610001] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:38.102 [2024-06-10 16:03:43.610019] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x12729e0 00:24:38.102 [2024-06-10 16:03:43.610028] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:38.102 [2024-06-10 16:03:43.611468] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:38.102 [2024-06-10 16:03:43.611495] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:24:38.360 BaseBdev2 00:24:38.360 16:03:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:38.360 16:03:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev3_malloc 00:24:38.360 BaseBdev3_malloc 00:24:38.360 16:03:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev3_malloc -p BaseBdev3 00:24:38.619 [2024-06-10 16:03:43.947093] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev3_malloc 00:24:38.619 [2024-06-10 16:03:43.947131] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:38.619 [2024-06-10 16:03:43.947146] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x141ee70 00:24:38.619 [2024-06-10 16:03:43.947155] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:38.619 [2024-06-10 16:03:43.948601] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:38.619 [2024-06-10 16:03:43.948627] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev3 00:24:38.619 BaseBdev3 00:24:38.620 16:03:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:24:38.620 16:03:43 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b BaseBdev4_malloc 00:24:38.620 BaseBdev4_malloc 00:24:38.878 16:03:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev4_malloc -p BaseBdev4 00:24:38.878 [2024-06-10 16:03:44.284242] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev4_malloc 00:24:38.878 [2024-06-10 16:03:44.284280] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 
00:24:38.878 [2024-06-10 16:03:44.284295] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x141d700 00:24:38.878 [2024-06-10 16:03:44.284304] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:38.878 [2024-06-10 16:03:44.285740] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:38.878 [2024-06-10 16:03:44.285765] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev4 00:24:38.878 BaseBdev4 00:24:38.878 16:03:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 512 -b spare_malloc 00:24:39.137 spare_malloc 00:24:39.137 16:03:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:24:39.396 spare_delay 00:24:39.396 16:03:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:39.654 [2024-06-10 16:03:44.970434] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:39.654 [2024-06-10 16:03:44.970472] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:39.654 [2024-06-10 16:03:44.970488] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1423710 00:24:39.654 [2024-06-10 16:03:44.970498] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:39.654 [2024-06-10 16:03:44.971978] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:39.654 [2024-06-10 16:03:44.972003] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:39.654 
spare 00:24:39.654 16:03:44 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2 BaseBdev3 BaseBdev4' -n raid_bdev1 00:24:39.654 [2024-06-10 16:03:45.134905] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:24:39.654 [2024-06-10 16:03:45.136126] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:24:39.654 [2024-06-10 16:03:45.136188] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:39.654 [2024-06-10 16:03:45.136233] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:39.654 [2024-06-10 16:03:45.136420] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x13a3260 00:24:39.654 [2024-06-10 16:03:45.136430] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:39.654 [2024-06-10 16:03:45.136615] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13a3200 00:24:39.654 [2024-06-10 16:03:45.136763] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x13a3260 00:24:39.654 [2024-06-10 16:03:45.136771] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x13a3260 00:24:39.654 [2024-06-10 16:03:45.136864] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:39.655 16:03:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 4 00:24:39.655 16:03:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:39.655 16:03:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:39.655 16:03:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:24:39.655 16:03:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:39.655 16:03:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=4 00:24:39.655 16:03:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:39.655 16:03:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:39.655 16:03:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:39.655 16:03:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:39.655 16:03:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:39.655 16:03:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:39.913 16:03:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:39.913 "name": "raid_bdev1", 00:24:39.913 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:24:39.913 "strip_size_kb": 0, 00:24:39.913 "state": "online", 00:24:39.913 "raid_level": "raid1", 00:24:39.913 "superblock": true, 00:24:39.913 "num_base_bdevs": 4, 00:24:39.913 "num_base_bdevs_discovered": 4, 00:24:39.913 "num_base_bdevs_operational": 4, 00:24:39.913 "base_bdevs_list": [ 00:24:39.913 { 00:24:39.913 "name": "BaseBdev1", 00:24:39.913 "uuid": "837328d7-0230-5770-86da-6de9b603b618", 00:24:39.913 "is_configured": true, 00:24:39.913 "data_offset": 2048, 00:24:39.913 "data_size": 63488 00:24:39.913 }, 00:24:39.913 { 00:24:39.913 "name": "BaseBdev2", 00:24:39.913 "uuid": "3f9e250d-ba3d-5140-b398-c934fa6eba75", 00:24:39.913 "is_configured": true, 00:24:39.913 "data_offset": 2048, 00:24:39.913 "data_size": 63488 00:24:39.913 }, 00:24:39.913 { 00:24:39.913 "name": "BaseBdev3", 
00:24:39.913 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:24:39.913 "is_configured": true, 00:24:39.913 "data_offset": 2048, 00:24:39.913 "data_size": 63488 00:24:39.913 }, 00:24:39.913 { 00:24:39.913 "name": "BaseBdev4", 00:24:39.913 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:24:39.913 "is_configured": true, 00:24:39.913 "data_offset": 2048, 00:24:39.913 "data_size": 63488 00:24:39.913 } 00:24:39.913 ] 00:24:39.913 }' 00:24:39.913 16:03:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:39.913 16:03:45 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:40.481 16:03:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:24:40.481 16:03:45 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:24:40.740 [2024-06-10 16:03:46.202042] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:24:40.740 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=63488 00:24:40.740 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:40.740 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:24:40.999 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@618 -- # data_offset=2048 00:24:40.999 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@620 -- # '[' true = true ']' 00:24:40.999 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:24:40.999 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@622 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/spdk-raid.sock perform_tests 00:24:41.258 [2024-06-10 16:03:46.592849] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1270d40 00:24:41.258 I/O size of 3145728 is greater than zero copy threshold (65536). 00:24:41.258 Zero copy mechanism will not be used. 00:24:41.258 Running I/O for 60 seconds... 00:24:41.258 [2024-06-10 16:03:46.716706] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:24:41.258 [2024-06-10 16:03:46.716944] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1270d40 00:24:41.258 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:41.258 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:41.258 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:41.258 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:41.258 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:41.258 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:41.258 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:41.258 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:41.258 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:41.258 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:41.258 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:24:41.258 16:03:46 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:41.826 16:03:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:41.826 "name": "raid_bdev1", 00:24:41.826 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:24:41.826 "strip_size_kb": 0, 00:24:41.826 "state": "online", 00:24:41.826 "raid_level": "raid1", 00:24:41.826 "superblock": true, 00:24:41.826 "num_base_bdevs": 4, 00:24:41.826 "num_base_bdevs_discovered": 3, 00:24:41.826 "num_base_bdevs_operational": 3, 00:24:41.826 "base_bdevs_list": [ 00:24:41.826 { 00:24:41.826 "name": null, 00:24:41.826 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:41.826 "is_configured": false, 00:24:41.826 "data_offset": 2048, 00:24:41.826 "data_size": 63488 00:24:41.826 }, 00:24:41.826 { 00:24:41.826 "name": "BaseBdev2", 00:24:41.826 "uuid": "3f9e250d-ba3d-5140-b398-c934fa6eba75", 00:24:41.826 "is_configured": true, 00:24:41.826 "data_offset": 2048, 00:24:41.826 "data_size": 63488 00:24:41.826 }, 00:24:41.826 { 00:24:41.826 "name": "BaseBdev3", 00:24:41.826 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:24:41.826 "is_configured": true, 00:24:41.826 "data_offset": 2048, 00:24:41.826 "data_size": 63488 00:24:41.826 }, 00:24:41.826 { 00:24:41.826 "name": "BaseBdev4", 00:24:41.826 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:24:41.826 "is_configured": true, 00:24:41.826 "data_offset": 2048, 00:24:41.826 "data_size": 63488 00:24:41.826 } 00:24:41.826 ] 00:24:41.826 }' 00:24:41.826 16:03:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:41.826 16:03:47 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:42.394 16:03:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 
spare 00:24:42.653 [2024-06-10 16:03:47.923249] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:42.653 [2024-06-10 16:03:47.981630] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1309110 00:24:42.653 [2024-06-10 16:03:47.983898] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:42.653 16:03:47 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@646 -- # sleep 1 00:24:42.653 [2024-06-10 16:03:48.095557] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:42.653 [2024-06-10 16:03:48.096058] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:42.912 [2024-06-10 16:03:48.320655] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:42.912 [2024-06-10 16:03:48.320831] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:43.171 [2024-06-10 16:03:48.658407] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:43.171 [2024-06-10 16:03:48.658917] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 offset_begin: 6144 offset_end: 12288 00:24:43.430 [2024-06-10 16:03:48.771819] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:43.430 [2024-06-10 16:03:48.772047] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:43.688 16:03:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:43.688 16:03:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:24:43.688 16:03:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:43.688 16:03:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:43.688 16:03:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:43.688 16:03:48 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:43.688 16:03:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:43.688 [2024-06-10 16:03:49.153792] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:43.688 [2024-06-10 16:03:49.153982] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 16384 offset_begin: 12288 offset_end: 18432 00:24:43.947 16:03:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:43.947 "name": "raid_bdev1", 00:24:43.947 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:24:43.947 "strip_size_kb": 0, 00:24:43.947 "state": "online", 00:24:43.947 "raid_level": "raid1", 00:24:43.947 "superblock": true, 00:24:43.947 "num_base_bdevs": 4, 00:24:43.947 "num_base_bdevs_discovered": 4, 00:24:43.947 "num_base_bdevs_operational": 4, 00:24:43.947 "process": { 00:24:43.947 "type": "rebuild", 00:24:43.947 "target": "spare", 00:24:43.947 "progress": { 00:24:43.947 "blocks": 16384, 00:24:43.947 "percent": 25 00:24:43.947 } 00:24:43.947 }, 00:24:43.947 "base_bdevs_list": [ 00:24:43.947 { 00:24:43.947 "name": "spare", 00:24:43.947 "uuid": "c94562cb-e3fa-5c12-831d-324f0c4a37be", 00:24:43.947 "is_configured": true, 00:24:43.947 "data_offset": 2048, 00:24:43.947 "data_size": 63488 00:24:43.947 }, 00:24:43.947 { 00:24:43.947 "name": "BaseBdev2", 00:24:43.947 
"uuid": "3f9e250d-ba3d-5140-b398-c934fa6eba75", 00:24:43.947 "is_configured": true, 00:24:43.947 "data_offset": 2048, 00:24:43.947 "data_size": 63488 00:24:43.947 }, 00:24:43.947 { 00:24:43.947 "name": "BaseBdev3", 00:24:43.947 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:24:43.947 "is_configured": true, 00:24:43.947 "data_offset": 2048, 00:24:43.947 "data_size": 63488 00:24:43.947 }, 00:24:43.947 { 00:24:43.947 "name": "BaseBdev4", 00:24:43.947 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:24:43.947 "is_configured": true, 00:24:43.947 "data_offset": 2048, 00:24:43.947 "data_size": 63488 00:24:43.947 } 00:24:43.947 ] 00:24:43.947 }' 00:24:43.947 16:03:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:43.947 16:03:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:43.947 16:03:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:43.947 16:03:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:43.948 16:03:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:43.948 [2024-06-10 16:03:49.392613] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:43.948 [2024-06-10 16:03:49.393012] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:44.206 [2024-06-10 16:03:49.588836] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:44.206 [2024-06-10 16:03:49.618672] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:44.206 [2024-06-10 16:03:49.661041] 
bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:44.206 [2024-06-10 16:03:49.683037] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:44.206 [2024-06-10 16:03:49.683069] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:44.206 [2024-06-10 16:03:49.683079] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:44.206 [2024-06-10 16:03:49.706569] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 0 raid_ch: 0x1270d40 00:24:44.465 16:03:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:44.465 16:03:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:44.465 16:03:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:44.465 16:03:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:44.465 16:03:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:44.465 16:03:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:44.465 16:03:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:44.465 16:03:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:44.465 16:03:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:44.465 16:03:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:44.465 16:03:49 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:44.465 16:03:49 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:44.724 16:03:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:44.724 "name": "raid_bdev1", 00:24:44.724 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:24:44.724 "strip_size_kb": 0, 00:24:44.724 "state": "online", 00:24:44.724 "raid_level": "raid1", 00:24:44.724 "superblock": true, 00:24:44.724 "num_base_bdevs": 4, 00:24:44.724 "num_base_bdevs_discovered": 3, 00:24:44.724 "num_base_bdevs_operational": 3, 00:24:44.724 "base_bdevs_list": [ 00:24:44.724 { 00:24:44.724 "name": null, 00:24:44.724 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:44.724 "is_configured": false, 00:24:44.724 "data_offset": 2048, 00:24:44.724 "data_size": 63488 00:24:44.724 }, 00:24:44.724 { 00:24:44.724 "name": "BaseBdev2", 00:24:44.724 "uuid": "3f9e250d-ba3d-5140-b398-c934fa6eba75", 00:24:44.724 "is_configured": true, 00:24:44.724 "data_offset": 2048, 00:24:44.724 "data_size": 63488 00:24:44.724 }, 00:24:44.724 { 00:24:44.724 "name": "BaseBdev3", 00:24:44.724 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:24:44.724 "is_configured": true, 00:24:44.724 "data_offset": 2048, 00:24:44.724 "data_size": 63488 00:24:44.724 }, 00:24:44.724 { 00:24:44.724 "name": "BaseBdev4", 00:24:44.724 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:24:44.724 "is_configured": true, 00:24:44.724 "data_offset": 2048, 00:24:44.724 "data_size": 63488 00:24:44.724 } 00:24:44.724 ] 00:24:44.724 }' 00:24:44.724 16:03:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:44.724 16:03:50 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:45.292 16:03:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:45.292 16:03:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 
00:24:45.292 16:03:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:45.292 16:03:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:45.292 16:03:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:45.292 16:03:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:45.292 16:03:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:45.550 16:03:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:45.550 "name": "raid_bdev1", 00:24:45.550 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:24:45.550 "strip_size_kb": 0, 00:24:45.550 "state": "online", 00:24:45.550 "raid_level": "raid1", 00:24:45.550 "superblock": true, 00:24:45.550 "num_base_bdevs": 4, 00:24:45.550 "num_base_bdevs_discovered": 3, 00:24:45.550 "num_base_bdevs_operational": 3, 00:24:45.550 "base_bdevs_list": [ 00:24:45.550 { 00:24:45.550 "name": null, 00:24:45.550 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:45.550 "is_configured": false, 00:24:45.550 "data_offset": 2048, 00:24:45.550 "data_size": 63488 00:24:45.550 }, 00:24:45.550 { 00:24:45.550 "name": "BaseBdev2", 00:24:45.550 "uuid": "3f9e250d-ba3d-5140-b398-c934fa6eba75", 00:24:45.550 "is_configured": true, 00:24:45.550 "data_offset": 2048, 00:24:45.550 "data_size": 63488 00:24:45.550 }, 00:24:45.550 { 00:24:45.550 "name": "BaseBdev3", 00:24:45.550 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:24:45.550 "is_configured": true, 00:24:45.550 "data_offset": 2048, 00:24:45.550 "data_size": 63488 00:24:45.550 }, 00:24:45.550 { 00:24:45.550 "name": "BaseBdev4", 00:24:45.551 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:24:45.551 "is_configured": true, 00:24:45.551 "data_offset": 
2048, 00:24:45.551 "data_size": 63488 00:24:45.551 } 00:24:45.551 ] 00:24:45.551 }' 00:24:45.551 16:03:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:45.551 16:03:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:45.551 16:03:50 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:45.551 16:03:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:45.551 16:03:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:45.810 [2024-06-10 16:03:51.284879] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:46.069 [2024-06-10 16:03:51.334087] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1416ce0 00:24:46.069 [2024-06-10 16:03:51.335699] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:46.069 16:03:51 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@662 -- # sleep 1 00:24:46.069 [2024-06-10 16:03:51.465110] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:46.069 [2024-06-10 16:03:51.474380] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 2048 offset_begin: 0 offset_end: 6144 00:24:46.328 [2024-06-10 16:03:51.697622] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:46.328 [2024-06-10 16:03:51.698236] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 4096 offset_begin: 0 offset_end: 6144 00:24:46.587 [2024-06-10 16:03:52.074950] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 8192 
offset_begin: 6144 offset_end: 12288 00:24:46.846 [2024-06-10 16:03:52.196983] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 10240 offset_begin: 6144 offset_end: 12288 00:24:46.846 16:03:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:46.846 16:03:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:46.846 16:03:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:46.846 16:03:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:46.846 16:03:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:46.846 16:03:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:46.846 16:03:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:47.164 16:03:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:47.164 "name": "raid_bdev1", 00:24:47.164 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:24:47.164 "strip_size_kb": 0, 00:24:47.164 "state": "online", 00:24:47.164 "raid_level": "raid1", 00:24:47.164 "superblock": true, 00:24:47.164 "num_base_bdevs": 4, 00:24:47.164 "num_base_bdevs_discovered": 4, 00:24:47.164 "num_base_bdevs_operational": 4, 00:24:47.164 "process": { 00:24:47.164 "type": "rebuild", 00:24:47.164 "target": "spare", 00:24:47.164 "progress": { 00:24:47.164 "blocks": 14336, 00:24:47.164 "percent": 22 00:24:47.164 } 00:24:47.164 }, 00:24:47.164 "base_bdevs_list": [ 00:24:47.164 { 00:24:47.164 "name": "spare", 00:24:47.164 "uuid": "c94562cb-e3fa-5c12-831d-324f0c4a37be", 00:24:47.164 "is_configured": true, 00:24:47.164 "data_offset": 2048, 00:24:47.164 
"data_size": 63488 00:24:47.164 }, 00:24:47.164 { 00:24:47.164 "name": "BaseBdev2", 00:24:47.164 "uuid": "3f9e250d-ba3d-5140-b398-c934fa6eba75", 00:24:47.164 "is_configured": true, 00:24:47.164 "data_offset": 2048, 00:24:47.164 "data_size": 63488 00:24:47.164 }, 00:24:47.164 { 00:24:47.164 "name": "BaseBdev3", 00:24:47.164 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:24:47.164 "is_configured": true, 00:24:47.164 "data_offset": 2048, 00:24:47.164 "data_size": 63488 00:24:47.164 }, 00:24:47.164 { 00:24:47.164 "name": "BaseBdev4", 00:24:47.164 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:24:47.164 "is_configured": true, 00:24:47.164 "data_offset": 2048, 00:24:47.164 "data_size": 63488 00:24:47.164 } 00:24:47.164 ] 00:24:47.164 }' 00:24:47.164 16:03:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:47.164 16:03:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:47.164 16:03:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:47.423 16:03:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:47.423 16:03:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:24:47.423 16:03:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:24:47.423 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:24:47.423 16:03:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=4 00:24:47.423 16:03:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:24:47.423 16:03:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@692 -- # '[' 4 -gt 2 ']' 00:24:47.423 16:03:52 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@694 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev2 00:24:47.423 [2024-06-10 16:03:52.905441] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:47.423 [2024-06-10 16:03:52.905747] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 20480 offset_begin: 18432 offset_end: 24576 00:24:47.682 [2024-06-10 16:03:52.940061] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:24:47.682 [2024-06-10 16:03:53.172917] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1270d40 00:24:47.682 [2024-06-10 16:03:53.172949] bdev_raid.c:1919:raid_bdev_channel_remove_base_bdev: *DEBUG*: slot: 1 raid_ch: 0x1416ce0 00:24:47.682 [2024-06-10 16:03:53.174218] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 22528 offset_begin: 18432 offset_end: 24576 00:24:47.941 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@697 -- # base_bdevs[1]= 00:24:47.941 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@698 -- # (( num_base_bdevs_operational-- )) 00:24:47.941 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@701 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:47.941 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:47.941 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:47.941 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:47.941 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:47.941 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:24:47.941 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:47.941 [2024-06-10 16:03:53.428948] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 26624 offset_begin: 24576 offset_end: 30720 00:24:48.200 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:48.200 "name": "raid_bdev1", 00:24:48.200 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:24:48.200 "strip_size_kb": 0, 00:24:48.200 "state": "online", 00:24:48.200 "raid_level": "raid1", 00:24:48.200 "superblock": true, 00:24:48.200 "num_base_bdevs": 4, 00:24:48.200 "num_base_bdevs_discovered": 3, 00:24:48.200 "num_base_bdevs_operational": 3, 00:24:48.200 "process": { 00:24:48.200 "type": "rebuild", 00:24:48.200 "target": "spare", 00:24:48.200 "progress": { 00:24:48.200 "blocks": 26624, 00:24:48.200 "percent": 41 00:24:48.200 } 00:24:48.200 }, 00:24:48.200 "base_bdevs_list": [ 00:24:48.200 { 00:24:48.200 "name": "spare", 00:24:48.200 "uuid": "c94562cb-e3fa-5c12-831d-324f0c4a37be", 00:24:48.200 "is_configured": true, 00:24:48.200 "data_offset": 2048, 00:24:48.200 "data_size": 63488 00:24:48.200 }, 00:24:48.200 { 00:24:48.200 "name": null, 00:24:48.200 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:48.200 "is_configured": false, 00:24:48.200 "data_offset": 2048, 00:24:48.200 "data_size": 63488 00:24:48.200 }, 00:24:48.200 { 00:24:48.200 "name": "BaseBdev3", 00:24:48.200 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:24:48.200 "is_configured": true, 00:24:48.200 "data_offset": 2048, 00:24:48.200 "data_size": 63488 00:24:48.200 }, 00:24:48.200 { 00:24:48.200 "name": "BaseBdev4", 00:24:48.200 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:24:48.200 "is_configured": true, 00:24:48.200 "data_offset": 2048, 00:24:48.200 "data_size": 63488 00:24:48.200 } 00:24:48.200 ] 00:24:48.200 }' 00:24:48.200 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:48.200 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:48.200 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:48.200 [2024-06-10 16:03:53.570232] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 28672 offset_begin: 24576 offset_end: 30720 00:24:48.200 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:48.200 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@705 -- # local timeout=955 00:24:48.200 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:48.200 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:48.200 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:48.200 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:48.200 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:48.200 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:48.200 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:48.200 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:48.460 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:48.460 "name": "raid_bdev1", 00:24:48.460 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:24:48.460 "strip_size_kb": 0, 00:24:48.460 "state": "online", 00:24:48.460 "raid_level": "raid1", 
00:24:48.460 "superblock": true, 00:24:48.460 "num_base_bdevs": 4, 00:24:48.460 "num_base_bdevs_discovered": 3, 00:24:48.460 "num_base_bdevs_operational": 3, 00:24:48.460 "process": { 00:24:48.460 "type": "rebuild", 00:24:48.460 "target": "spare", 00:24:48.460 "progress": { 00:24:48.460 "blocks": 30720, 00:24:48.460 "percent": 48 00:24:48.460 } 00:24:48.460 }, 00:24:48.460 "base_bdevs_list": [ 00:24:48.460 { 00:24:48.460 "name": "spare", 00:24:48.460 "uuid": "c94562cb-e3fa-5c12-831d-324f0c4a37be", 00:24:48.460 "is_configured": true, 00:24:48.460 "data_offset": 2048, 00:24:48.460 "data_size": 63488 00:24:48.460 }, 00:24:48.460 { 00:24:48.460 "name": null, 00:24:48.460 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:48.460 "is_configured": false, 00:24:48.460 "data_offset": 2048, 00:24:48.460 "data_size": 63488 00:24:48.460 }, 00:24:48.460 { 00:24:48.460 "name": "BaseBdev3", 00:24:48.460 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:24:48.460 "is_configured": true, 00:24:48.460 "data_offset": 2048, 00:24:48.460 "data_size": 63488 00:24:48.460 }, 00:24:48.460 { 00:24:48.460 "name": "BaseBdev4", 00:24:48.460 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:24:48.460 "is_configured": true, 00:24:48.460 "data_offset": 2048, 00:24:48.460 "data_size": 63488 00:24:48.460 } 00:24:48.460 ] 00:24:48.460 }' 00:24:48.460 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:48.460 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:48.460 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:48.460 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:48.460 16:03:53 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:48.719 [2024-06-10 16:03:54.039605] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: 
process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:24:48.719 [2024-06-10 16:03:54.040025] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 34816 offset_begin: 30720 offset_end: 36864 00:24:48.978 [2024-06-10 16:03:54.367571] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 38912 offset_begin: 36864 offset_end: 43008 00:24:49.237 [2024-06-10 16:03:54.713974] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 45056 offset_begin: 43008 offset_end: 49152 00:24:49.496 [2024-06-10 16:03:54.825286] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:24:49.496 [2024-06-10 16:03:54.825652] bdev_raid.c: 839:raid_bdev_submit_rw_request: *DEBUG*: split: process_offset: 47104 offset_begin: 43008 offset_end: 49152 00:24:49.496 16:03:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:49.496 16:03:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:49.496 16:03:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:49.496 16:03:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:49.496 16:03:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:49.496 16:03:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:49.496 16:03:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:49.496 16:03:54 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:49.755 16:03:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
raid_bdev_info='{ 00:24:49.755 "name": "raid_bdev1", 00:24:49.755 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:24:49.755 "strip_size_kb": 0, 00:24:49.755 "state": "online", 00:24:49.755 "raid_level": "raid1", 00:24:49.755 "superblock": true, 00:24:49.755 "num_base_bdevs": 4, 00:24:49.755 "num_base_bdevs_discovered": 3, 00:24:49.755 "num_base_bdevs_operational": 3, 00:24:49.755 "process": { 00:24:49.755 "type": "rebuild", 00:24:49.755 "target": "spare", 00:24:49.755 "progress": { 00:24:49.755 "blocks": 51200, 00:24:49.755 "percent": 80 00:24:49.755 } 00:24:49.755 }, 00:24:49.755 "base_bdevs_list": [ 00:24:49.755 { 00:24:49.755 "name": "spare", 00:24:49.755 "uuid": "c94562cb-e3fa-5c12-831d-324f0c4a37be", 00:24:49.755 "is_configured": true, 00:24:49.755 "data_offset": 2048, 00:24:49.755 "data_size": 63488 00:24:49.755 }, 00:24:49.755 { 00:24:49.755 "name": null, 00:24:49.755 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:49.755 "is_configured": false, 00:24:49.755 "data_offset": 2048, 00:24:49.755 "data_size": 63488 00:24:49.755 }, 00:24:49.755 { 00:24:49.755 "name": "BaseBdev3", 00:24:49.755 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:24:49.755 "is_configured": true, 00:24:49.755 "data_offset": 2048, 00:24:49.755 "data_size": 63488 00:24:49.755 }, 00:24:49.755 { 00:24:49.755 "name": "BaseBdev4", 00:24:49.755 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:24:49.755 "is_configured": true, 00:24:49.755 "data_offset": 2048, 00:24:49.755 "data_size": 63488 00:24:49.755 } 00:24:49.755 ] 00:24:49.755 }' 00:24:49.755 16:03:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:49.755 16:03:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:49.755 16:03:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:50.013 16:03:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == 
\s\p\a\r\e ]] 00:24:50.013 16:03:55 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@710 -- # sleep 1 00:24:50.271 [2024-06-10 16:03:55.776621] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:24:50.530 [2024-06-10 16:03:55.864689] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:24:50.530 [2024-06-10 16:03:55.868155] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:50.789 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:24:50.789 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:50.789 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:50.789 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:50.789 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:50.789 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:50.789 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:50.789 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:51.048 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:51.048 "name": "raid_bdev1", 00:24:51.048 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:24:51.048 "strip_size_kb": 0, 00:24:51.048 "state": "online", 00:24:51.048 "raid_level": "raid1", 00:24:51.048 "superblock": true, 00:24:51.048 "num_base_bdevs": 4, 00:24:51.048 "num_base_bdevs_discovered": 3, 00:24:51.048 "num_base_bdevs_operational": 3, 00:24:51.048 "base_bdevs_list": 
[ 00:24:51.048 { 00:24:51.048 "name": "spare", 00:24:51.048 "uuid": "c94562cb-e3fa-5c12-831d-324f0c4a37be", 00:24:51.048 "is_configured": true, 00:24:51.048 "data_offset": 2048, 00:24:51.048 "data_size": 63488 00:24:51.048 }, 00:24:51.048 { 00:24:51.048 "name": null, 00:24:51.048 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:51.048 "is_configured": false, 00:24:51.048 "data_offset": 2048, 00:24:51.048 "data_size": 63488 00:24:51.048 }, 00:24:51.048 { 00:24:51.048 "name": "BaseBdev3", 00:24:51.048 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:24:51.048 "is_configured": true, 00:24:51.048 "data_offset": 2048, 00:24:51.048 "data_size": 63488 00:24:51.048 }, 00:24:51.048 { 00:24:51.048 "name": "BaseBdev4", 00:24:51.048 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:24:51.048 "is_configured": true, 00:24:51.048 "data_offset": 2048, 00:24:51.048 "data_size": 63488 00:24:51.048 } 00:24:51.048 ] 00:24:51.048 }' 00:24:51.048 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:51.307 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:24:51.307 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:51.307 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:24:51.307 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@708 -- # break 00:24:51.307 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:51.307 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:51.307 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:51.307 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:51.307 16:03:56 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:51.307 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.307 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:51.566 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:51.566 "name": "raid_bdev1", 00:24:51.566 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:24:51.566 "strip_size_kb": 0, 00:24:51.566 "state": "online", 00:24:51.566 "raid_level": "raid1", 00:24:51.566 "superblock": true, 00:24:51.566 "num_base_bdevs": 4, 00:24:51.566 "num_base_bdevs_discovered": 3, 00:24:51.567 "num_base_bdevs_operational": 3, 00:24:51.567 "base_bdevs_list": [ 00:24:51.567 { 00:24:51.567 "name": "spare", 00:24:51.567 "uuid": "c94562cb-e3fa-5c12-831d-324f0c4a37be", 00:24:51.567 "is_configured": true, 00:24:51.567 "data_offset": 2048, 00:24:51.567 "data_size": 63488 00:24:51.567 }, 00:24:51.567 { 00:24:51.567 "name": null, 00:24:51.567 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:51.567 "is_configured": false, 00:24:51.567 "data_offset": 2048, 00:24:51.567 "data_size": 63488 00:24:51.567 }, 00:24:51.567 { 00:24:51.567 "name": "BaseBdev3", 00:24:51.567 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:24:51.567 "is_configured": true, 00:24:51.567 "data_offset": 2048, 00:24:51.567 "data_size": 63488 00:24:51.567 }, 00:24:51.567 { 00:24:51.567 "name": "BaseBdev4", 00:24:51.567 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:24:51.567 "is_configured": true, 00:24:51.567 "data_offset": 2048, 00:24:51.567 "data_size": 63488 00:24:51.567 } 00:24:51.567 ] 00:24:51.567 }' 00:24:51.567 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:51.567 16:03:56 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:51.567 16:03:56 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:51.567 16:03:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:51.567 16:03:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:51.567 16:03:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:51.567 16:03:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:51.567 16:03:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:51.567 16:03:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:51.567 16:03:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:51.567 16:03:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:51.567 16:03:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:51.567 16:03:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:51.567 16:03:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:51.567 16:03:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:51.567 16:03:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:51.826 16:03:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:51.826 "name": "raid_bdev1", 00:24:51.826 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:24:51.826 
"strip_size_kb": 0, 00:24:51.826 "state": "online", 00:24:51.826 "raid_level": "raid1", 00:24:51.826 "superblock": true, 00:24:51.826 "num_base_bdevs": 4, 00:24:51.826 "num_base_bdevs_discovered": 3, 00:24:51.826 "num_base_bdevs_operational": 3, 00:24:51.826 "base_bdevs_list": [ 00:24:51.826 { 00:24:51.826 "name": "spare", 00:24:51.826 "uuid": "c94562cb-e3fa-5c12-831d-324f0c4a37be", 00:24:51.826 "is_configured": true, 00:24:51.826 "data_offset": 2048, 00:24:51.826 "data_size": 63488 00:24:51.826 }, 00:24:51.826 { 00:24:51.826 "name": null, 00:24:51.826 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:51.826 "is_configured": false, 00:24:51.826 "data_offset": 2048, 00:24:51.826 "data_size": 63488 00:24:51.826 }, 00:24:51.826 { 00:24:51.826 "name": "BaseBdev3", 00:24:51.826 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:24:51.826 "is_configured": true, 00:24:51.826 "data_offset": 2048, 00:24:51.826 "data_size": 63488 00:24:51.826 }, 00:24:51.826 { 00:24:51.826 "name": "BaseBdev4", 00:24:51.826 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:24:51.826 "is_configured": true, 00:24:51.826 "data_offset": 2048, 00:24:51.826 "data_size": 63488 00:24:51.826 } 00:24:51.826 ] 00:24:51.826 }' 00:24:51.826 16:03:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:51.826 16:03:57 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:52.394 16:03:57 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:24:52.653 [2024-06-10 16:03:58.030732] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:24:52.653 [2024-06-10 16:03:58.030765] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:24:52.653 00:24:52.653 Latency(us) 00:24:52.653 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 
00:24:52.653 Job: raid_bdev1 (Core Mask 0x1, workload: randrw, percentage: 50, depth: 2, IO size: 3145728) 00:24:52.653 raid_bdev1 : 11.48 91.52 274.56 0.00 0.00 13898.78 292.57 116841.33 00:24:52.653 =================================================================================================================== 00:24:52.653 Total : 91.52 274.56 0.00 0.00 13898.78 292.57 116841.33 00:24:52.653 [2024-06-10 16:03:58.111107] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:52.653 [2024-06-10 16:03:58.111137] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:24:52.653 [2024-06-10 16:03:58.111237] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:24:52.653 [2024-06-10 16:03:58.111252] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x13a3260 name raid_bdev1, state offline 00:24:52.653 0 00:24:52.653 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:52.653 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # jq length 00:24:52.912 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:24:52.912 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:24:52.912 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@722 -- # '[' true = true ']' 00:24:52.912 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@724 -- # nbd_start_disks /var/tmp/spdk-raid.sock spare /dev/nbd0 00:24:52.912 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:52.912 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('spare') 00:24:52.912 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # 
local bdev_list 00:24:52.912 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:24:52.912 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:52.912 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:24:52.912 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:52.912 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:52.912 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd0 00:24:53.171 /dev/nbd0 00:24:53.171 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:24:53.171 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:24:53.171 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:24:53.171 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:24:53.171 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:24:53.171 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:24:53.171 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:24:53.171 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:24:53.171 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:24:53.171 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:24:53.171 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 
of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:53.171 1+0 records in 00:24:53.171 1+0 records out 00:24:53.171 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000240837 s, 17.0 MB/s 00:24:53.171 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:53.171 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:24:53.171 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z '' ']' 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@727 -- # continue 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev3 ']' 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev3 /dev/nbd1 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # 
bdev_list=('BaseBdev3') 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev3 /dev/nbd1 00:24:53.431 /dev/nbd1 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:53.431 1+0 records in 00:24:53.431 1+0 records out 00:24:53.431 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000218419 s, 18.8 MB/s 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:53.431 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:53.690 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:53.690 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:53.690 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:53.690 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:53.690 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:53.690 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:53.690 16:03:58 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@725 -- # for bdev in "${base_bdevs[@]:1}" 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@726 -- # '[' -z BaseBdev4 ']' 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@729 -- # nbd_start_disks /var/tmp/spdk-raid.sock BaseBdev4 /dev/nbd1 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev4') 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@10 -- # local bdev_list 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd1') 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@11 -- # local nbd_list 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@12 -- # local i 00:24:53.690 16:03:59 
bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:53.690 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev4 /dev/nbd1 00:24:53.949 /dev/nbd1 00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@868 -- # local i 00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@872 -- # break 00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:24:53.949 1+0 records in 00:24:53.949 1+0 records out 00:24:53.949 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000213191 s, 19.2 MB/s 00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 
00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@885 -- # size=4096 00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@888 -- # return 0 00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:24:53.949 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@730 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@731 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd1 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd1') 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 
00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@733 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@50 -- # local nbd_list 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@51 -- # local i 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:24:54.208 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:24:54.467 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:24:54.467 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:24:54.467 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:24:54.467 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:24:54.467 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:24:54.467 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 
/proc/partitions 00:24:54.467 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@41 -- # break 00:24:54.467 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/nbd_common.sh@45 -- # return 0 00:24:54.467 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:24:54.467 16:03:59 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:54.726 16:04:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:24:54.984 [2024-06-10 16:04:00.435215] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:24:54.984 [2024-06-10 16:04:00.435260] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:24:54.984 [2024-06-10 16:04:00.435277] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1271420 00:24:54.984 [2024-06-10 16:04:00.435287] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:24:54.984 [2024-06-10 16:04:00.437044] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:24:54.984 [2024-06-10 16:04:00.437075] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:24:54.984 [2024-06-10 16:04:00.437155] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:24:54.984 [2024-06-10 16:04:00.437181] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:54.984 [2024-06-10 16:04:00.437290] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev3 is claimed 00:24:54.984 [2024-06-10 16:04:00.437368] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev4 is claimed 00:24:54.984 spare 00:24:54.984 16:04:00 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 3 00:24:54.984 16:04:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:54.984 16:04:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:54.984 16:04:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:54.984 16:04:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:54.984 16:04:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=3 00:24:54.984 16:04:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:54.984 16:04:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:54.984 16:04:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:54.984 16:04:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:54.984 16:04:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:54.984 16:04:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:55.243 [2024-06-10 16:04:00.537688] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x141c780 00:24:55.243 [2024-06-10 16:04:00.537704] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 63488, blocklen 512 00:24:55.243 [2024-06-10 16:04:00.537910] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x141c0d0 00:24:55.243 [2024-06-10 16:04:00.538077] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x141c780 00:24:55.243 [2024-06-10 16:04:00.538086] 
bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x141c780 00:24:55.243 [2024-06-10 16:04:00.538202] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:55.243 16:04:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:55.243 "name": "raid_bdev1", 00:24:55.243 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:24:55.243 "strip_size_kb": 0, 00:24:55.243 "state": "online", 00:24:55.243 "raid_level": "raid1", 00:24:55.243 "superblock": true, 00:24:55.243 "num_base_bdevs": 4, 00:24:55.243 "num_base_bdevs_discovered": 3, 00:24:55.243 "num_base_bdevs_operational": 3, 00:24:55.243 "base_bdevs_list": [ 00:24:55.243 { 00:24:55.243 "name": "spare", 00:24:55.243 "uuid": "c94562cb-e3fa-5c12-831d-324f0c4a37be", 00:24:55.243 "is_configured": true, 00:24:55.243 "data_offset": 2048, 00:24:55.243 "data_size": 63488 00:24:55.243 }, 00:24:55.243 { 00:24:55.243 "name": null, 00:24:55.243 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:55.243 "is_configured": false, 00:24:55.243 "data_offset": 2048, 00:24:55.243 "data_size": 63488 00:24:55.243 }, 00:24:55.243 { 00:24:55.243 "name": "BaseBdev3", 00:24:55.243 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:24:55.243 "is_configured": true, 00:24:55.243 "data_offset": 2048, 00:24:55.243 "data_size": 63488 00:24:55.243 }, 00:24:55.243 { 00:24:55.243 "name": "BaseBdev4", 00:24:55.243 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:24:55.243 "is_configured": true, 00:24:55.243 "data_offset": 2048, 00:24:55.243 "data_size": 63488 00:24:55.243 } 00:24:55.243 ] 00:24:55.243 }' 00:24:55.243 16:04:00 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:55.243 16:04:00 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:55.809 16:04:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:24:55.809 
16:04:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:55.809 16:04:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:24:55.809 16:04:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:24:55.809 16:04:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:55.809 16:04:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:55.809 16:04:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:56.067 16:04:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:56.067 "name": "raid_bdev1", 00:24:56.067 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:24:56.067 "strip_size_kb": 0, 00:24:56.067 "state": "online", 00:24:56.067 "raid_level": "raid1", 00:24:56.067 "superblock": true, 00:24:56.067 "num_base_bdevs": 4, 00:24:56.067 "num_base_bdevs_discovered": 3, 00:24:56.068 "num_base_bdevs_operational": 3, 00:24:56.068 "base_bdevs_list": [ 00:24:56.068 { 00:24:56.068 "name": "spare", 00:24:56.068 "uuid": "c94562cb-e3fa-5c12-831d-324f0c4a37be", 00:24:56.068 "is_configured": true, 00:24:56.068 "data_offset": 2048, 00:24:56.068 "data_size": 63488 00:24:56.068 }, 00:24:56.068 { 00:24:56.068 "name": null, 00:24:56.068 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:56.068 "is_configured": false, 00:24:56.068 "data_offset": 2048, 00:24:56.068 "data_size": 63488 00:24:56.068 }, 00:24:56.068 { 00:24:56.068 "name": "BaseBdev3", 00:24:56.068 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:24:56.068 "is_configured": true, 00:24:56.068 "data_offset": 2048, 00:24:56.068 "data_size": 63488 00:24:56.068 }, 00:24:56.068 { 00:24:56.068 "name": "BaseBdev4", 00:24:56.068 "uuid": 
"06aa641b-759e-541c-b5ff-28db99644d16", 00:24:56.068 "is_configured": true, 00:24:56.068 "data_offset": 2048, 00:24:56.068 "data_size": 63488 00:24:56.068 } 00:24:56.068 ] 00:24:56.068 }' 00:24:56.068 16:04:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:56.068 16:04:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:24:56.068 16:04:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:56.326 16:04:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:24:56.326 16:04:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:56.326 16:04:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:24:56.326 16:04:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:24:56.326 16:04:01 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:24:56.586 [2024-06-10 16:04:02.068072] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:56.586 16:04:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:56.586 16:04:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:56.586 16:04:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:56.586 16:04:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:56.586 16:04:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:56.586 16:04:02 
bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:56.586 16:04:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:56.586 16:04:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:56.586 16:04:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:56.586 16:04:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:56.586 16:04:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:56.586 16:04:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:56.848 16:04:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:56.849 "name": "raid_bdev1", 00:24:56.849 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:24:56.849 "strip_size_kb": 0, 00:24:56.849 "state": "online", 00:24:56.849 "raid_level": "raid1", 00:24:56.849 "superblock": true, 00:24:56.849 "num_base_bdevs": 4, 00:24:56.849 "num_base_bdevs_discovered": 2, 00:24:56.849 "num_base_bdevs_operational": 2, 00:24:56.849 "base_bdevs_list": [ 00:24:56.849 { 00:24:56.849 "name": null, 00:24:56.849 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:56.849 "is_configured": false, 00:24:56.849 "data_offset": 2048, 00:24:56.849 "data_size": 63488 00:24:56.849 }, 00:24:56.849 { 00:24:56.849 "name": null, 00:24:56.849 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:56.849 "is_configured": false, 00:24:56.849 "data_offset": 2048, 00:24:56.849 "data_size": 63488 00:24:56.849 }, 00:24:56.849 { 00:24:56.849 "name": "BaseBdev3", 00:24:56.849 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:24:56.849 "is_configured": true, 00:24:56.849 "data_offset": 2048, 00:24:56.849 "data_size": 
63488 00:24:56.849 }, 00:24:56.849 { 00:24:56.849 "name": "BaseBdev4", 00:24:56.849 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:24:56.849 "is_configured": true, 00:24:56.849 "data_offset": 2048, 00:24:56.849 "data_size": 63488 00:24:56.849 } 00:24:56.849 ] 00:24:56.849 }' 00:24:56.849 16:04:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:56.849 16:04:02 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:24:57.419 16:04:02 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:24:57.677 [2024-06-10 16:04:03.143127] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:57.677 [2024-06-10 16:04:03.143287] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:24:57.677 [2024-06-10 16:04:03.143301] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:24:57.677 [2024-06-10 16:04:03.143328] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:24:57.677 [2024-06-10 16:04:03.147585] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x13a8670 00:24:57.677 [2024-06-10 16:04:03.149737] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:24:57.677 16:04:03 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@755 -- # sleep 1 00:24:59.055 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:24:59.055 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:24:59.055 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:24:59.055 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:24:59.055 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:24:59.055 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.055 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:59.055 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:24:59.055 "name": "raid_bdev1", 00:24:59.055 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:24:59.055 "strip_size_kb": 0, 00:24:59.055 "state": "online", 00:24:59.055 "raid_level": "raid1", 00:24:59.055 "superblock": true, 00:24:59.055 "num_base_bdevs": 4, 00:24:59.055 "num_base_bdevs_discovered": 3, 00:24:59.055 "num_base_bdevs_operational": 3, 00:24:59.055 "process": { 00:24:59.055 "type": "rebuild", 00:24:59.055 "target": "spare", 00:24:59.055 "progress": { 00:24:59.055 "blocks": 24576, 
00:24:59.055 "percent": 38 00:24:59.055 } 00:24:59.055 }, 00:24:59.055 "base_bdevs_list": [ 00:24:59.055 { 00:24:59.055 "name": "spare", 00:24:59.055 "uuid": "c94562cb-e3fa-5c12-831d-324f0c4a37be", 00:24:59.055 "is_configured": true, 00:24:59.055 "data_offset": 2048, 00:24:59.055 "data_size": 63488 00:24:59.055 }, 00:24:59.055 { 00:24:59.055 "name": null, 00:24:59.055 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:59.055 "is_configured": false, 00:24:59.055 "data_offset": 2048, 00:24:59.055 "data_size": 63488 00:24:59.055 }, 00:24:59.055 { 00:24:59.055 "name": "BaseBdev3", 00:24:59.055 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:24:59.055 "is_configured": true, 00:24:59.055 "data_offset": 2048, 00:24:59.055 "data_size": 63488 00:24:59.055 }, 00:24:59.055 { 00:24:59.055 "name": "BaseBdev4", 00:24:59.055 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:24:59.055 "is_configured": true, 00:24:59.055 "data_offset": 2048, 00:24:59.055 "data_size": 63488 00:24:59.055 } 00:24:59.055 ] 00:24:59.055 }' 00:24:59.055 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:24:59.055 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:24:59.055 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:24:59.055 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:24:59.055 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:24:59.315 [2024-06-10 16:04:04.738658] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:59.315 [2024-06-10 16:04:04.762187] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:24:59.315 [2024-06-10 16:04:04.762230] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:24:59.315 [2024-06-10 16:04:04.762245] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:24:59.315 [2024-06-10 16:04:04.762251] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:24:59.315 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:24:59.315 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:24:59.315 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:24:59.315 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:24:59.315 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:24:59.315 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:24:59.315 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:24:59.315 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:24:59.315 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:24:59.315 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:24:59.315 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:24:59.315 16:04:04 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:24:59.573 16:04:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:24:59.573 "name": "raid_bdev1", 00:24:59.573 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 
00:24:59.573 "strip_size_kb": 0, 00:24:59.573 "state": "online", 00:24:59.573 "raid_level": "raid1", 00:24:59.573 "superblock": true, 00:24:59.574 "num_base_bdevs": 4, 00:24:59.574 "num_base_bdevs_discovered": 2, 00:24:59.574 "num_base_bdevs_operational": 2, 00:24:59.574 "base_bdevs_list": [ 00:24:59.574 { 00:24:59.574 "name": null, 00:24:59.574 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:59.574 "is_configured": false, 00:24:59.574 "data_offset": 2048, 00:24:59.574 "data_size": 63488 00:24:59.574 }, 00:24:59.574 { 00:24:59.574 "name": null, 00:24:59.574 "uuid": "00000000-0000-0000-0000-000000000000", 00:24:59.574 "is_configured": false, 00:24:59.574 "data_offset": 2048, 00:24:59.574 "data_size": 63488 00:24:59.574 }, 00:24:59.574 { 00:24:59.574 "name": "BaseBdev3", 00:24:59.574 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:24:59.574 "is_configured": true, 00:24:59.574 "data_offset": 2048, 00:24:59.574 "data_size": 63488 00:24:59.574 }, 00:24:59.574 { 00:24:59.574 "name": "BaseBdev4", 00:24:59.574 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:24:59.574 "is_configured": true, 00:24:59.574 "data_offset": 2048, 00:24:59.574 "data_size": 63488 00:24:59.574 } 00:24:59.574 ] 00:24:59.574 }' 00:24:59.574 16:04:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:24:59.574 16:04:05 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:00.509 16:04:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:00.509 [2024-06-10 16:04:05.909563] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:00.509 [2024-06-10 16:04:05.909611] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:00.509 [2024-06-10 16:04:05.909630] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created 
at: 0x0x13a58d0 00:25:00.509 [2024-06-10 16:04:05.909639] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:00.509 [2024-06-10 16:04:05.910040] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:00.509 [2024-06-10 16:04:05.910058] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:00.509 [2024-06-10 16:04:05.910143] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:00.509 [2024-06-10 16:04:05.910153] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (5) smaller than existing raid bdev raid_bdev1 (6) 00:25:00.509 [2024-06-10 16:04:05.910161] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:25:00.509 [2024-06-10 16:04:05.910176] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:00.509 [2024-06-10 16:04:05.914470] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1420b20 00:25:00.509 spare 00:25:00.509 [2024-06-10 16:04:05.915986] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:00.509 16:04:05 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@762 -- # sleep 1 00:25:01.445 16:04:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:01.445 16:04:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:01.445 16:04:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:01.445 16:04:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:01.445 16:04:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:01.445 16:04:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:01.445 16:04:06 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:01.735 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:01.735 "name": "raid_bdev1", 00:25:01.735 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:25:01.735 "strip_size_kb": 0, 00:25:01.735 "state": "online", 00:25:01.735 "raid_level": "raid1", 00:25:01.735 "superblock": true, 00:25:01.735 "num_base_bdevs": 4, 00:25:01.735 "num_base_bdevs_discovered": 3, 00:25:01.735 "num_base_bdevs_operational": 3, 00:25:01.735 "process": { 00:25:01.735 "type": "rebuild", 00:25:01.735 "target": "spare", 00:25:01.735 "progress": { 00:25:01.735 "blocks": 24576, 00:25:01.735 "percent": 38 00:25:01.735 } 00:25:01.735 }, 00:25:01.735 "base_bdevs_list": [ 00:25:01.735 { 00:25:01.735 "name": "spare", 00:25:01.735 "uuid": "c94562cb-e3fa-5c12-831d-324f0c4a37be", 00:25:01.735 "is_configured": true, 00:25:01.735 "data_offset": 2048, 00:25:01.735 "data_size": 63488 00:25:01.735 }, 00:25:01.735 { 00:25:01.735 "name": null, 00:25:01.735 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:01.735 "is_configured": false, 00:25:01.735 "data_offset": 2048, 00:25:01.735 "data_size": 63488 00:25:01.735 }, 00:25:01.735 { 00:25:01.735 "name": "BaseBdev3", 00:25:01.735 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:25:01.735 "is_configured": true, 00:25:01.735 "data_offset": 2048, 00:25:01.735 "data_size": 63488 00:25:01.735 }, 00:25:01.735 { 00:25:01.735 "name": "BaseBdev4", 00:25:01.735 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:25:01.735 "is_configured": true, 00:25:01.735 "data_offset": 2048, 00:25:01.735 "data_size": 63488 00:25:01.735 } 00:25:01.735 ] 00:25:01.735 }' 00:25:01.735 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 
00:25:01.994 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:01.994 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:01.994 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:01.994 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:02.252 [2024-06-10 16:04:07.520437] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:02.252 [2024-06-10 16:04:07.528399] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:02.252 [2024-06-10 16:04:07.528439] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:02.252 [2024-06-10 16:04:07.528454] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:02.252 [2024-06-10 16:04:07.528460] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:02.252 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:02.252 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:02.252 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:02.252 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:02.252 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:02.252 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:02.252 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 
00:25:02.252 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:02.252 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:02.252 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:02.252 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:02.252 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:02.510 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:02.510 "name": "raid_bdev1", 00:25:02.510 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:25:02.510 "strip_size_kb": 0, 00:25:02.510 "state": "online", 00:25:02.510 "raid_level": "raid1", 00:25:02.510 "superblock": true, 00:25:02.510 "num_base_bdevs": 4, 00:25:02.510 "num_base_bdevs_discovered": 2, 00:25:02.510 "num_base_bdevs_operational": 2, 00:25:02.510 "base_bdevs_list": [ 00:25:02.510 { 00:25:02.510 "name": null, 00:25:02.510 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:02.510 "is_configured": false, 00:25:02.510 "data_offset": 2048, 00:25:02.510 "data_size": 63488 00:25:02.510 }, 00:25:02.510 { 00:25:02.510 "name": null, 00:25:02.510 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:02.510 "is_configured": false, 00:25:02.510 "data_offset": 2048, 00:25:02.510 "data_size": 63488 00:25:02.510 }, 00:25:02.510 { 00:25:02.510 "name": "BaseBdev3", 00:25:02.510 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:25:02.510 "is_configured": true, 00:25:02.510 "data_offset": 2048, 00:25:02.510 "data_size": 63488 00:25:02.510 }, 00:25:02.510 { 00:25:02.510 "name": "BaseBdev4", 00:25:02.510 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:25:02.510 "is_configured": true, 00:25:02.510 "data_offset": 2048, 
00:25:02.510 "data_size": 63488 00:25:02.510 } 00:25:02.510 ] 00:25:02.510 }' 00:25:02.510 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:02.510 16:04:07 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:03.076 16:04:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:03.076 16:04:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:03.076 16:04:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:03.076 16:04:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:03.076 16:04:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:03.076 16:04:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:03.076 16:04:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:03.335 16:04:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:03.335 "name": "raid_bdev1", 00:25:03.335 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:25:03.335 "strip_size_kb": 0, 00:25:03.335 "state": "online", 00:25:03.335 "raid_level": "raid1", 00:25:03.335 "superblock": true, 00:25:03.335 "num_base_bdevs": 4, 00:25:03.335 "num_base_bdevs_discovered": 2, 00:25:03.335 "num_base_bdevs_operational": 2, 00:25:03.335 "base_bdevs_list": [ 00:25:03.335 { 00:25:03.335 "name": null, 00:25:03.335 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:03.335 "is_configured": false, 00:25:03.335 "data_offset": 2048, 00:25:03.335 "data_size": 63488 00:25:03.335 }, 00:25:03.335 { 00:25:03.335 "name": null, 00:25:03.335 "uuid": "00000000-0000-0000-0000-000000000000", 
00:25:03.335 "is_configured": false, 00:25:03.335 "data_offset": 2048, 00:25:03.335 "data_size": 63488 00:25:03.335 }, 00:25:03.335 { 00:25:03.335 "name": "BaseBdev3", 00:25:03.335 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:25:03.335 "is_configured": true, 00:25:03.335 "data_offset": 2048, 00:25:03.335 "data_size": 63488 00:25:03.335 }, 00:25:03.335 { 00:25:03.335 "name": "BaseBdev4", 00:25:03.335 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:25:03.335 "is_configured": true, 00:25:03.335 "data_offset": 2048, 00:25:03.335 "data_size": 63488 00:25:03.335 } 00:25:03.335 ] 00:25:03.335 }' 00:25:03.335 16:04:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:03.335 16:04:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:03.335 16:04:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:03.335 16:04:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:03.335 16:04:08 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:25:03.594 16:04:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:03.852 [2024-06-10 16:04:09.261412] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:03.852 [2024-06-10 16:04:09.261457] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:03.852 [2024-06-10 16:04:09.261473] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x141bc50 00:25:03.852 [2024-06-10 16:04:09.261482] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:03.852 
[2024-06-10 16:04:09.261844] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:03.852 [2024-06-10 16:04:09.261861] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:03.852 [2024-06-10 16:04:09.261925] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:25:03.852 [2024-06-10 16:04:09.261936] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:03.852 [2024-06-10 16:04:09.261944] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:03.852 BaseBdev1 00:25:03.852 16:04:09 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@773 -- # sleep 1 00:25:04.786 16:04:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:04.786 16:04:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:04.786 16:04:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:04.786 16:04:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:04.786 16:04:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:04.786 16:04:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:04.786 16:04:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:04.786 16:04:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:04.786 16:04:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:04.786 16:04:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:04.786 16:04:10 bdev_raid.raid_rebuild_test_sb_io 
-- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:04.786 16:04:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:05.045 16:04:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:05.045 "name": "raid_bdev1", 00:25:05.045 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:25:05.045 "strip_size_kb": 0, 00:25:05.045 "state": "online", 00:25:05.045 "raid_level": "raid1", 00:25:05.045 "superblock": true, 00:25:05.045 "num_base_bdevs": 4, 00:25:05.045 "num_base_bdevs_discovered": 2, 00:25:05.045 "num_base_bdevs_operational": 2, 00:25:05.045 "base_bdevs_list": [ 00:25:05.045 { 00:25:05.045 "name": null, 00:25:05.045 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:05.045 "is_configured": false, 00:25:05.045 "data_offset": 2048, 00:25:05.045 "data_size": 63488 00:25:05.045 }, 00:25:05.045 { 00:25:05.045 "name": null, 00:25:05.045 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:05.045 "is_configured": false, 00:25:05.045 "data_offset": 2048, 00:25:05.045 "data_size": 63488 00:25:05.045 }, 00:25:05.045 { 00:25:05.045 "name": "BaseBdev3", 00:25:05.045 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:25:05.045 "is_configured": true, 00:25:05.045 "data_offset": 2048, 00:25:05.045 "data_size": 63488 00:25:05.045 }, 00:25:05.045 { 00:25:05.045 "name": "BaseBdev4", 00:25:05.045 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:25:05.045 "is_configured": true, 00:25:05.045 "data_offset": 2048, 00:25:05.045 "data_size": 63488 00:25:05.045 } 00:25:05.045 ] 00:25:05.045 }' 00:25:05.045 16:04:10 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:05.045 16:04:10 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:05.981 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@775 -- # 
verify_raid_bdev_process raid_bdev1 none none 00:25:05.981 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:05.981 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:05.981 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:05.982 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:05.982 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:05.982 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:05.982 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:05.982 "name": "raid_bdev1", 00:25:05.982 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:25:05.982 "strip_size_kb": 0, 00:25:05.982 "state": "online", 00:25:05.982 "raid_level": "raid1", 00:25:05.982 "superblock": true, 00:25:05.982 "num_base_bdevs": 4, 00:25:05.982 "num_base_bdevs_discovered": 2, 00:25:05.982 "num_base_bdevs_operational": 2, 00:25:05.982 "base_bdevs_list": [ 00:25:05.982 { 00:25:05.982 "name": null, 00:25:05.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:05.982 "is_configured": false, 00:25:05.982 "data_offset": 2048, 00:25:05.982 "data_size": 63488 00:25:05.982 }, 00:25:05.982 { 00:25:05.982 "name": null, 00:25:05.982 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:05.982 "is_configured": false, 00:25:05.982 "data_offset": 2048, 00:25:05.982 "data_size": 63488 00:25:05.982 }, 00:25:05.982 { 00:25:05.982 "name": "BaseBdev3", 00:25:05.982 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:25:05.982 "is_configured": true, 00:25:05.982 "data_offset": 2048, 00:25:05.982 "data_size": 63488 00:25:05.982 }, 00:25:05.982 { 
00:25:05.982 "name": "BaseBdev4", 00:25:05.982 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:25:05.982 "is_configured": true, 00:25:05.982 "data_offset": 2048, 00:25:05.982 "data_size": 63488 00:25:05.982 } 00:25:05.982 ] 00:25:05.982 }' 00:25:05.982 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:05.982 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:05.982 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:06.241 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:06.241 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:06.241 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@649 -- # local es=0 00:25:06.241 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:06.241 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:06.241 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:06.241 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:06.241 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:06.241 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # type -P 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:06.241 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:06.241 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:06.241 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:06.241 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:25:06.241 [2024-06-10 16:04:11.740431] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:06.241 [2024-06-10 16:04:11.740560] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (6) 00:25:06.241 [2024-06-10 16:04:11.740574] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:25:06.241 request: 00:25:06.241 { 00:25:06.241 "raid_bdev": "raid_bdev1", 00:25:06.241 "base_bdev": "BaseBdev1", 00:25:06.241 "method": "bdev_raid_add_base_bdev", 00:25:06.241 "req_id": 1 00:25:06.241 } 00:25:06.241 Got JSON-RPC error response 00:25:06.241 response: 00:25:06.241 { 00:25:06.241 "code": -22, 00:25:06.241 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:25:06.241 } 00:25:06.499 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@652 -- # es=1 00:25:06.499 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:25:06.499 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:25:06.499 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- 
common/autotest_common.sh@676 -- # (( !es == 0 )) 00:25:06.499 16:04:11 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@777 -- # sleep 1 00:25:07.435 16:04:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:07.435 16:04:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:07.435 16:04:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:07.435 16:04:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:07.435 16:04:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:07.435 16:04:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:07.435 16:04:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:07.435 16:04:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:07.435 16:04:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:07.435 16:04:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:07.435 16:04:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:07.435 16:04:12 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:07.693 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:07.693 "name": "raid_bdev1", 00:25:07.693 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:25:07.693 "strip_size_kb": 0, 00:25:07.693 "state": "online", 00:25:07.693 "raid_level": "raid1", 00:25:07.693 "superblock": true, 00:25:07.693 "num_base_bdevs": 4, 00:25:07.693 
"num_base_bdevs_discovered": 2, 00:25:07.693 "num_base_bdevs_operational": 2, 00:25:07.693 "base_bdevs_list": [ 00:25:07.693 { 00:25:07.693 "name": null, 00:25:07.693 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.693 "is_configured": false, 00:25:07.693 "data_offset": 2048, 00:25:07.693 "data_size": 63488 00:25:07.693 }, 00:25:07.693 { 00:25:07.693 "name": null, 00:25:07.693 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:07.693 "is_configured": false, 00:25:07.693 "data_offset": 2048, 00:25:07.693 "data_size": 63488 00:25:07.693 }, 00:25:07.693 { 00:25:07.693 "name": "BaseBdev3", 00:25:07.693 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:25:07.693 "is_configured": true, 00:25:07.693 "data_offset": 2048, 00:25:07.694 "data_size": 63488 00:25:07.694 }, 00:25:07.694 { 00:25:07.694 "name": "BaseBdev4", 00:25:07.694 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:25:07.694 "is_configured": true, 00:25:07.694 "data_offset": 2048, 00:25:07.694 "data_size": 63488 00:25:07.694 } 00:25:07.694 ] 00:25:07.694 }' 00:25:07.694 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:07.694 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:08.260 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:08.260 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:08.260 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:08.260 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:08.260 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:08.260 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:25:08.260 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:08.518 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:08.518 "name": "raid_bdev1", 00:25:08.518 "uuid": "a982d62c-d46b-4687-b777-f3a6da832584", 00:25:08.518 "strip_size_kb": 0, 00:25:08.518 "state": "online", 00:25:08.518 "raid_level": "raid1", 00:25:08.518 "superblock": true, 00:25:08.518 "num_base_bdevs": 4, 00:25:08.518 "num_base_bdevs_discovered": 2, 00:25:08.518 "num_base_bdevs_operational": 2, 00:25:08.518 "base_bdevs_list": [ 00:25:08.519 { 00:25:08.519 "name": null, 00:25:08.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:08.519 "is_configured": false, 00:25:08.519 "data_offset": 2048, 00:25:08.519 "data_size": 63488 00:25:08.519 }, 00:25:08.519 { 00:25:08.519 "name": null, 00:25:08.519 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:08.519 "is_configured": false, 00:25:08.519 "data_offset": 2048, 00:25:08.519 "data_size": 63488 00:25:08.519 }, 00:25:08.519 { 00:25:08.519 "name": "BaseBdev3", 00:25:08.519 "uuid": "40dd4399-940d-5fdb-b3bb-40341b755478", 00:25:08.519 "is_configured": true, 00:25:08.519 "data_offset": 2048, 00:25:08.519 "data_size": 63488 00:25:08.519 }, 00:25:08.519 { 00:25:08.519 "name": "BaseBdev4", 00:25:08.519 "uuid": "06aa641b-759e-541c-b5ff-28db99644d16", 00:25:08.519 "is_configured": true, 00:25:08.519 "data_offset": 2048, 00:25:08.519 "data_size": 63488 00:25:08.519 } 00:25:08.519 ] 00:25:08.519 }' 00:25:08.519 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:08.519 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:08.519 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:08.519 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- 
bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:08.519 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@782 -- # killprocess 2796429 00:25:08.519 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@949 -- # '[' -z 2796429 ']' 00:25:08.519 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@953 -- # kill -0 2796429 00:25:08.519 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # uname 00:25:08.519 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:08.519 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2796429 00:25:08.519 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:25:08.519 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:25:08.519 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2796429' 00:25:08.519 killing process with pid 2796429 00:25:08.519 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@968 -- # kill 2796429 00:25:08.519 Received shutdown signal, test time was about 27.290803 seconds 00:25:08.519 00:25:08.519 Latency(us) 00:25:08.519 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:25:08.519 =================================================================================================================== 00:25:08.519 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:25:08.519 [2024-06-10 16:04:13.952661] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:08.519 [2024-06-10 16:04:13.952765] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:08.519 16:04:13 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@973 -- # wait 2796429 00:25:08.519 [2024-06-10 16:04:13.952830] 
bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:08.519 [2024-06-10 16:04:13.952840] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x141c780 name raid_bdev1, state offline 00:25:08.519 [2024-06-10 16:04:13.989738] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:08.778 16:04:14 bdev_raid.raid_rebuild_test_sb_io -- bdev/bdev_raid.sh@784 -- # return 0 00:25:08.778 00:25:08.778 real 0m32.201s 00:25:08.778 user 0m51.872s 00:25:08.778 sys 0m3.951s 00:25:08.778 16:04:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:08.778 16:04:14 bdev_raid.raid_rebuild_test_sb_io -- common/autotest_common.sh@10 -- # set +x 00:25:08.778 ************************************ 00:25:08.778 END TEST raid_rebuild_test_sb_io 00:25:08.778 ************************************ 00:25:08.778 16:04:14 bdev_raid -- bdev/bdev_raid.sh@884 -- # '[' n == y ']' 00:25:08.778 16:04:14 bdev_raid -- bdev/bdev_raid.sh@896 -- # base_blocklen=4096 00:25:08.778 16:04:14 bdev_raid -- bdev/bdev_raid.sh@898 -- # run_test raid_state_function_test_sb_4k raid_state_function_test raid1 2 true 00:25:08.778 16:04:14 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:25:08.778 16:04:14 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:08.778 16:04:14 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:08.778 ************************************ 00:25:08.778 START TEST raid_state_function_test_sb_4k 00:25:08.778 ************************************ 00:25:08.778 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:25:08.778 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:25:08.778 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:25:08.778 
16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:25:08.778 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:25:08.778 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:25:08.778 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@226 -- # local strip_size 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- 
bdev/bdev_raid.sh@234 -- # strip_size=0 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@244 -- # raid_pid=2802109 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2802109' 00:25:08.779 Process raid pid: 2802109 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@246 -- # waitforlisten 2802109 /var/tmp/spdk-raid.sock 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@830 -- # '[' -z 2802109 ']' 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:08.779 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:08.779 16:04:14 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:09.037 [2024-06-10 16:04:14.326154] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:25:09.037 [2024-06-10 16:04:14.326209] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:25:09.037 [2024-06-10 16:04:14.423638] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:09.037 [2024-06-10 16:04:14.518303] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:09.296 [2024-06-10 16:04:14.576827] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:09.296 [2024-06-10 16:04:14.576857] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:09.864 16:04:15 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:09.864 16:04:15 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@863 -- # return 0 00:25:09.864 16:04:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:10.123 [2024-06-10 16:04:15.503715] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:10.123 [2024-06-10 16:04:15.503754] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:10.123 [2024-06-10 16:04:15.503763] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:10.123 [2024-06-10 16:04:15.503772] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:10.123 16:04:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:10.123 16:04:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:10.123 16:04:15 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:10.123 16:04:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:10.123 16:04:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:10.123 16:04:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:10.123 16:04:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:10.123 16:04:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:10.123 16:04:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:10.123 16:04:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:10.123 16:04:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:10.123 16:04:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:10.382 16:04:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:10.382 "name": "Existed_Raid", 00:25:10.382 "uuid": "185d58e8-f96f-40a9-829f-361583f5601c", 00:25:10.382 "strip_size_kb": 0, 00:25:10.382 "state": "configuring", 00:25:10.382 "raid_level": "raid1", 00:25:10.382 "superblock": true, 00:25:10.382 "num_base_bdevs": 2, 00:25:10.382 "num_base_bdevs_discovered": 0, 00:25:10.382 "num_base_bdevs_operational": 2, 00:25:10.382 "base_bdevs_list": [ 00:25:10.382 { 00:25:10.382 "name": "BaseBdev1", 00:25:10.382 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:10.382 "is_configured": false, 00:25:10.382 "data_offset": 0, 00:25:10.382 "data_size": 0 00:25:10.382 }, 
00:25:10.382 { 00:25:10.382 "name": "BaseBdev2", 00:25:10.382 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:10.382 "is_configured": false, 00:25:10.382 "data_offset": 0, 00:25:10.382 "data_size": 0 00:25:10.382 } 00:25:10.382 ] 00:25:10.382 }' 00:25:10.382 16:04:15 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:10.382 16:04:15 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:10.949 16:04:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:11.208 [2024-06-10 16:04:16.638601] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:11.208 [2024-06-10 16:04:16.638628] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x234c120 name Existed_Raid, state configuring 00:25:11.208 16:04:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:11.466 [2024-06-10 16:04:16.899305] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:25:11.466 [2024-06-10 16:04:16.899332] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:25:11.466 [2024-06-10 16:04:16.899340] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:11.466 [2024-06-10 16:04:16.899349] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:11.466 16:04:16 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1 00:25:11.724 [2024-06-10 16:04:17.165314] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:11.724 BaseBdev1 00:25:11.724 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:25:11.724 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:25:11.724 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:25:11.724 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local i 00:25:11.724 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:25:11.724 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:25:11.724 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:11.982 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:25:12.241 [ 00:25:12.241 { 00:25:12.242 "name": "BaseBdev1", 00:25:12.242 "aliases": [ 00:25:12.242 "7a34d41f-c642-4901-93e6-d984d1d133dd" 00:25:12.242 ], 00:25:12.242 "product_name": "Malloc disk", 00:25:12.242 "block_size": 4096, 00:25:12.242 "num_blocks": 8192, 00:25:12.242 "uuid": "7a34d41f-c642-4901-93e6-d984d1d133dd", 00:25:12.242 "assigned_rate_limits": { 00:25:12.242 "rw_ios_per_sec": 0, 00:25:12.242 "rw_mbytes_per_sec": 0, 00:25:12.242 "r_mbytes_per_sec": 0, 00:25:12.242 "w_mbytes_per_sec": 0 00:25:12.242 }, 00:25:12.242 "claimed": true, 00:25:12.242 "claim_type": "exclusive_write", 00:25:12.242 "zoned": false, 00:25:12.242 "supported_io_types": { 00:25:12.242 "read": true, 00:25:12.242 "write": true, 00:25:12.242 "unmap": 
true, 00:25:12.242 "write_zeroes": true, 00:25:12.242 "flush": true, 00:25:12.242 "reset": true, 00:25:12.242 "compare": false, 00:25:12.242 "compare_and_write": false, 00:25:12.242 "abort": true, 00:25:12.242 "nvme_admin": false, 00:25:12.242 "nvme_io": false 00:25:12.242 }, 00:25:12.242 "memory_domains": [ 00:25:12.242 { 00:25:12.242 "dma_device_id": "system", 00:25:12.242 "dma_device_type": 1 00:25:12.242 }, 00:25:12.242 { 00:25:12.242 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:12.242 "dma_device_type": 2 00:25:12.242 } 00:25:12.242 ], 00:25:12.242 "driver_specific": {} 00:25:12.242 } 00:25:12.242 ] 00:25:12.242 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # return 0 00:25:12.242 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:12.242 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:12.242 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:12.242 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:12.242 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:12.242 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:12.242 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:12.242 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:12.242 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:12.242 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:12.242 16:04:17 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:12.242 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:12.501 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:12.501 "name": "Existed_Raid", 00:25:12.501 "uuid": "b19364a7-d9d7-4064-951b-50532ba2b276", 00:25:12.501 "strip_size_kb": 0, 00:25:12.501 "state": "configuring", 00:25:12.501 "raid_level": "raid1", 00:25:12.501 "superblock": true, 00:25:12.501 "num_base_bdevs": 2, 00:25:12.501 "num_base_bdevs_discovered": 1, 00:25:12.501 "num_base_bdevs_operational": 2, 00:25:12.501 "base_bdevs_list": [ 00:25:12.501 { 00:25:12.501 "name": "BaseBdev1", 00:25:12.501 "uuid": "7a34d41f-c642-4901-93e6-d984d1d133dd", 00:25:12.501 "is_configured": true, 00:25:12.501 "data_offset": 256, 00:25:12.501 "data_size": 7936 00:25:12.501 }, 00:25:12.501 { 00:25:12.501 "name": "BaseBdev2", 00:25:12.501 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:12.501 "is_configured": false, 00:25:12.501 "data_offset": 0, 00:25:12.501 "data_size": 0 00:25:12.501 } 00:25:12.501 ] 00:25:12.501 }' 00:25:12.501 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:12.501 16:04:17 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:13.068 16:04:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:25:13.327 [2024-06-10 16:04:18.625205] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:25:13.327 [2024-06-10 16:04:18.625239] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x234b9f0 name Existed_Raid, state 
configuring 00:25:13.327 16:04:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:25:13.586 [2024-06-10 16:04:18.881924] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:13.586 [2024-06-10 16:04:18.883536] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:25:13.586 [2024-06-10 16:04:18.883565] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:25:13.586 16:04:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:25:13.586 16:04:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:13.586 16:04:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:25:13.586 16:04:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:13.586 16:04:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:13.586 16:04:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:13.586 16:04:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:13.586 16:04:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:13.586 16:04:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:13.586 16:04:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:13.586 16:04:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:25:13.586 16:04:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:13.586 16:04:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:13.586 16:04:18 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:13.586 16:04:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:13.586 "name": "Existed_Raid", 00:25:13.586 "uuid": "dc8c2b41-2080-42b4-aa20-e5df5c0acb4d", 00:25:13.586 "strip_size_kb": 0, 00:25:13.586 "state": "configuring", 00:25:13.586 "raid_level": "raid1", 00:25:13.586 "superblock": true, 00:25:13.586 "num_base_bdevs": 2, 00:25:13.586 "num_base_bdevs_discovered": 1, 00:25:13.586 "num_base_bdevs_operational": 2, 00:25:13.586 "base_bdevs_list": [ 00:25:13.586 { 00:25:13.586 "name": "BaseBdev1", 00:25:13.586 "uuid": "7a34d41f-c642-4901-93e6-d984d1d133dd", 00:25:13.586 "is_configured": true, 00:25:13.586 "data_offset": 256, 00:25:13.586 "data_size": 7936 00:25:13.586 }, 00:25:13.586 { 00:25:13.586 "name": "BaseBdev2", 00:25:13.586 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:13.586 "is_configured": false, 00:25:13.586 "data_offset": 0, 00:25:13.586 "data_size": 0 00:25:13.586 } 00:25:13.586 ] 00:25:13.586 }' 00:25:13.586 16:04:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:13.586 16:04:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:14.522 16:04:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2 00:25:14.522 [2024-06-10 16:04:19.935878] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 
00:25:14.522 [2024-06-10 16:04:19.936038] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x234c770 00:25:14.522 [2024-06-10 16:04:19.936051] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:14.522 [2024-06-10 16:04:19.936233] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x234deb0 00:25:14.522 [2024-06-10 16:04:19.936360] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x234c770 00:25:14.522 [2024-06-10 16:04:19.936369] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x234c770 00:25:14.522 [2024-06-10 16:04:19.936463] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:14.522 BaseBdev2 00:25:14.522 16:04:19 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:25:14.522 16:04:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:25:14.522 16:04:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:25:14.522 16:04:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@900 -- # local i 00:25:14.522 16:04:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:25:14.522 16:04:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:25:14.522 16:04:19 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:25:14.781 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:25:15.041 [ 00:25:15.041 { 00:25:15.041 "name": "BaseBdev2", 
00:25:15.041 "aliases": [ 00:25:15.041 "07c63289-61fc-47c4-91e5-dd25f2a65ed7" 00:25:15.041 ], 00:25:15.041 "product_name": "Malloc disk", 00:25:15.041 "block_size": 4096, 00:25:15.041 "num_blocks": 8192, 00:25:15.041 "uuid": "07c63289-61fc-47c4-91e5-dd25f2a65ed7", 00:25:15.041 "assigned_rate_limits": { 00:25:15.041 "rw_ios_per_sec": 0, 00:25:15.041 "rw_mbytes_per_sec": 0, 00:25:15.041 "r_mbytes_per_sec": 0, 00:25:15.041 "w_mbytes_per_sec": 0 00:25:15.041 }, 00:25:15.041 "claimed": true, 00:25:15.041 "claim_type": "exclusive_write", 00:25:15.041 "zoned": false, 00:25:15.041 "supported_io_types": { 00:25:15.041 "read": true, 00:25:15.041 "write": true, 00:25:15.041 "unmap": true, 00:25:15.041 "write_zeroes": true, 00:25:15.041 "flush": true, 00:25:15.041 "reset": true, 00:25:15.041 "compare": false, 00:25:15.041 "compare_and_write": false, 00:25:15.041 "abort": true, 00:25:15.041 "nvme_admin": false, 00:25:15.041 "nvme_io": false 00:25:15.041 }, 00:25:15.041 "memory_domains": [ 00:25:15.041 { 00:25:15.041 "dma_device_id": "system", 00:25:15.041 "dma_device_type": 1 00:25:15.041 }, 00:25:15.041 { 00:25:15.041 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:15.041 "dma_device_type": 2 00:25:15.041 } 00:25:15.041 ], 00:25:15.041 "driver_specific": {} 00:25:15.041 } 00:25:15.041 ] 00:25:15.041 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@906 -- # return 0 00:25:15.041 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:25:15.041 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:25:15.041 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:25:15.041 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:15.041 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:25:15.041 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:15.041 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:15.041 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:15.041 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:15.041 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:15.041 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:15.041 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:15.041 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:15.041 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:15.300 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:15.300 "name": "Existed_Raid", 00:25:15.300 "uuid": "dc8c2b41-2080-42b4-aa20-e5df5c0acb4d", 00:25:15.300 "strip_size_kb": 0, 00:25:15.300 "state": "online", 00:25:15.300 "raid_level": "raid1", 00:25:15.300 "superblock": true, 00:25:15.300 "num_base_bdevs": 2, 00:25:15.300 "num_base_bdevs_discovered": 2, 00:25:15.300 "num_base_bdevs_operational": 2, 00:25:15.300 "base_bdevs_list": [ 00:25:15.300 { 00:25:15.300 "name": "BaseBdev1", 00:25:15.300 "uuid": "7a34d41f-c642-4901-93e6-d984d1d133dd", 00:25:15.300 "is_configured": true, 00:25:15.300 "data_offset": 256, 00:25:15.300 "data_size": 7936 00:25:15.300 }, 00:25:15.300 { 00:25:15.300 "name": "BaseBdev2", 00:25:15.300 "uuid": 
"07c63289-61fc-47c4-91e5-dd25f2a65ed7", 00:25:15.300 "is_configured": true, 00:25:15.300 "data_offset": 256, 00:25:15.300 "data_size": 7936 00:25:15.300 } 00:25:15.300 ] 00:25:15.300 }' 00:25:15.300 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:15.300 16:04:20 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:15.868 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:25:15.868 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:25:15.868 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:15.868 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:15.868 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:15.868 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@198 -- # local name 00:25:15.868 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:25:15.868 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:16.127 [2024-06-10 16:04:21.524390] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:16.127 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:16.127 "name": "Existed_Raid", 00:25:16.127 "aliases": [ 00:25:16.127 "dc8c2b41-2080-42b4-aa20-e5df5c0acb4d" 00:25:16.127 ], 00:25:16.127 "product_name": "Raid Volume", 00:25:16.127 "block_size": 4096, 00:25:16.127 "num_blocks": 7936, 00:25:16.127 "uuid": "dc8c2b41-2080-42b4-aa20-e5df5c0acb4d", 00:25:16.127 "assigned_rate_limits": { 
00:25:16.127 "rw_ios_per_sec": 0, 00:25:16.127 "rw_mbytes_per_sec": 0, 00:25:16.127 "r_mbytes_per_sec": 0, 00:25:16.127 "w_mbytes_per_sec": 0 00:25:16.127 }, 00:25:16.127 "claimed": false, 00:25:16.127 "zoned": false, 00:25:16.127 "supported_io_types": { 00:25:16.127 "read": true, 00:25:16.127 "write": true, 00:25:16.127 "unmap": false, 00:25:16.127 "write_zeroes": true, 00:25:16.127 "flush": false, 00:25:16.127 "reset": true, 00:25:16.127 "compare": false, 00:25:16.127 "compare_and_write": false, 00:25:16.127 "abort": false, 00:25:16.127 "nvme_admin": false, 00:25:16.127 "nvme_io": false 00:25:16.127 }, 00:25:16.127 "memory_domains": [ 00:25:16.127 { 00:25:16.127 "dma_device_id": "system", 00:25:16.127 "dma_device_type": 1 00:25:16.127 }, 00:25:16.127 { 00:25:16.127 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:16.127 "dma_device_type": 2 00:25:16.127 }, 00:25:16.127 { 00:25:16.127 "dma_device_id": "system", 00:25:16.127 "dma_device_type": 1 00:25:16.127 }, 00:25:16.127 { 00:25:16.127 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:16.127 "dma_device_type": 2 00:25:16.127 } 00:25:16.127 ], 00:25:16.127 "driver_specific": { 00:25:16.127 "raid": { 00:25:16.127 "uuid": "dc8c2b41-2080-42b4-aa20-e5df5c0acb4d", 00:25:16.127 "strip_size_kb": 0, 00:25:16.127 "state": "online", 00:25:16.127 "raid_level": "raid1", 00:25:16.127 "superblock": true, 00:25:16.127 "num_base_bdevs": 2, 00:25:16.127 "num_base_bdevs_discovered": 2, 00:25:16.127 "num_base_bdevs_operational": 2, 00:25:16.127 "base_bdevs_list": [ 00:25:16.127 { 00:25:16.127 "name": "BaseBdev1", 00:25:16.127 "uuid": "7a34d41f-c642-4901-93e6-d984d1d133dd", 00:25:16.127 "is_configured": true, 00:25:16.127 "data_offset": 256, 00:25:16.127 "data_size": 7936 00:25:16.127 }, 00:25:16.127 { 00:25:16.127 "name": "BaseBdev2", 00:25:16.127 "uuid": "07c63289-61fc-47c4-91e5-dd25f2a65ed7", 00:25:16.127 "is_configured": true, 00:25:16.127 "data_offset": 256, 00:25:16.127 "data_size": 7936 00:25:16.127 } 00:25:16.127 ] 00:25:16.127 
} 00:25:16.127 } 00:25:16.127 }' 00:25:16.127 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:16.127 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:25:16.127 BaseBdev2' 00:25:16.127 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:16.127 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:25:16.127 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:16.388 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:16.388 "name": "BaseBdev1", 00:25:16.388 "aliases": [ 00:25:16.388 "7a34d41f-c642-4901-93e6-d984d1d133dd" 00:25:16.388 ], 00:25:16.388 "product_name": "Malloc disk", 00:25:16.388 "block_size": 4096, 00:25:16.388 "num_blocks": 8192, 00:25:16.388 "uuid": "7a34d41f-c642-4901-93e6-d984d1d133dd", 00:25:16.388 "assigned_rate_limits": { 00:25:16.388 "rw_ios_per_sec": 0, 00:25:16.388 "rw_mbytes_per_sec": 0, 00:25:16.388 "r_mbytes_per_sec": 0, 00:25:16.388 "w_mbytes_per_sec": 0 00:25:16.388 }, 00:25:16.388 "claimed": true, 00:25:16.388 "claim_type": "exclusive_write", 00:25:16.388 "zoned": false, 00:25:16.388 "supported_io_types": { 00:25:16.388 "read": true, 00:25:16.388 "write": true, 00:25:16.388 "unmap": true, 00:25:16.388 "write_zeroes": true, 00:25:16.388 "flush": true, 00:25:16.388 "reset": true, 00:25:16.388 "compare": false, 00:25:16.388 "compare_and_write": false, 00:25:16.388 "abort": true, 00:25:16.388 "nvme_admin": false, 00:25:16.388 "nvme_io": false 00:25:16.388 }, 00:25:16.388 "memory_domains": [ 00:25:16.388 { 00:25:16.388 "dma_device_id": "system", 00:25:16.388 
"dma_device_type": 1 00:25:16.388 }, 00:25:16.388 { 00:25:16.388 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:16.388 "dma_device_type": 2 00:25:16.388 } 00:25:16.388 ], 00:25:16.388 "driver_specific": {} 00:25:16.388 }' 00:25:16.388 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:16.388 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:16.647 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:16.647 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:16.647 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:16.647 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:16.648 16:04:21 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:16.648 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:16.648 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:16.648 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:16.648 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:16.648 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:16.648 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:16.648 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:25:16.648 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:16.906 16:04:22 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:16.906 "name": "BaseBdev2", 00:25:16.906 "aliases": [ 00:25:16.906 "07c63289-61fc-47c4-91e5-dd25f2a65ed7" 00:25:16.906 ], 00:25:16.906 "product_name": "Malloc disk", 00:25:16.906 "block_size": 4096, 00:25:16.906 "num_blocks": 8192, 00:25:16.906 "uuid": "07c63289-61fc-47c4-91e5-dd25f2a65ed7", 00:25:16.906 "assigned_rate_limits": { 00:25:16.906 "rw_ios_per_sec": 0, 00:25:16.906 "rw_mbytes_per_sec": 0, 00:25:16.906 "r_mbytes_per_sec": 0, 00:25:16.906 "w_mbytes_per_sec": 0 00:25:16.906 }, 00:25:16.906 "claimed": true, 00:25:16.906 "claim_type": "exclusive_write", 00:25:16.906 "zoned": false, 00:25:16.906 "supported_io_types": { 00:25:16.906 "read": true, 00:25:16.906 "write": true, 00:25:16.906 "unmap": true, 00:25:16.906 "write_zeroes": true, 00:25:16.906 "flush": true, 00:25:16.906 "reset": true, 00:25:16.906 "compare": false, 00:25:16.906 "compare_and_write": false, 00:25:16.906 "abort": true, 00:25:16.906 "nvme_admin": false, 00:25:16.906 "nvme_io": false 00:25:16.906 }, 00:25:16.906 "memory_domains": [ 00:25:16.906 { 00:25:16.906 "dma_device_id": "system", 00:25:16.906 "dma_device_type": 1 00:25:16.906 }, 00:25:16.906 { 00:25:16.906 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:16.906 "dma_device_type": 2 00:25:16.906 } 00:25:16.906 ], 00:25:16.906 "driver_specific": {} 00:25:16.906 }' 00:25:16.906 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:17.165 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:17.165 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:17.165 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:17.165 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:17.165 16:04:22 
bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:17.165 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:17.165 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:17.165 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:17.165 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:17.424 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:17.424 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:17.424 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:25:17.684 [2024-06-10 16:04:22.968039] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:17.684 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@275 -- # local expected_state 00:25:17.684 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:25:17.684 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:17.684 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:25:17.684 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:25:17.684 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:25:17.684 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:25:17.684 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local 
expected_state=online 00:25:17.684 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:17.684 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:17.684 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:17.684 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:17.684 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:17.684 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:17.684 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:17.684 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:17.684 16:04:22 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:25:17.943 16:04:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:17.943 "name": "Existed_Raid", 00:25:17.943 "uuid": "dc8c2b41-2080-42b4-aa20-e5df5c0acb4d", 00:25:17.943 "strip_size_kb": 0, 00:25:17.943 "state": "online", 00:25:17.943 "raid_level": "raid1", 00:25:17.943 "superblock": true, 00:25:17.943 "num_base_bdevs": 2, 00:25:17.943 "num_base_bdevs_discovered": 1, 00:25:17.943 "num_base_bdevs_operational": 1, 00:25:17.943 "base_bdevs_list": [ 00:25:17.943 { 00:25:17.943 "name": null, 00:25:17.943 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:17.943 "is_configured": false, 00:25:17.943 "data_offset": 256, 00:25:17.943 "data_size": 7936 00:25:17.943 }, 00:25:17.943 { 00:25:17.943 "name": "BaseBdev2", 00:25:17.943 "uuid": 
"07c63289-61fc-47c4-91e5-dd25f2a65ed7", 00:25:17.943 "is_configured": true, 00:25:17.943 "data_offset": 256, 00:25:17.943 "data_size": 7936 00:25:17.943 } 00:25:17.943 ] 00:25:17.943 }' 00:25:17.943 16:04:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:17.943 16:04:23 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:18.512 16:04:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:25:18.512 16:04:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:25:18.512 16:04:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:18.512 16:04:23 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:25:18.771 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:25:18.771 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:25:18.771 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:25:19.031 [2024-06-10 16:04:24.376930] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:25:19.031 [2024-06-10 16:04:24.377017] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:19.031 [2024-06-10 16:04:24.387839] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:19.031 [2024-06-10 16:04:24.387871] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:19.031 [2024-06-10 16:04:24.387880] bdev_raid.c: 366:raid_bdev_cleanup: 
*DEBUG*: raid_bdev_cleanup, 0x234c770 name Existed_Raid, state offline 00:25:19.031 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:25:19.031 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:25:19.031 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:19.031 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:25:19.318 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:25:19.318 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:25:19.318 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:25:19.318 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@341 -- # killprocess 2802109 00:25:19.318 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@949 -- # '[' -z 2802109 ']' 00:25:19.318 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@953 -- # kill -0 2802109 00:25:19.318 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # uname 00:25:19.318 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:19.318 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2802109 00:25:19.318 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:25:19.318 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:25:19.318 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- 
common/autotest_common.sh@967 -- # echo 'killing process with pid 2802109' 00:25:19.318 killing process with pid 2802109 00:25:19.318 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@968 -- # kill 2802109 00:25:19.318 [2024-06-10 16:04:24.711312] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:19.318 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@973 -- # wait 2802109 00:25:19.318 [2024-06-10 16:04:24.712159] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:19.577 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- bdev/bdev_raid.sh@343 -- # return 0 00:25:19.577 00:25:19.577 real 0m10.647s 00:25:19.577 user 0m19.287s 00:25:19.577 sys 0m1.633s 00:25:19.577 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@1125 -- # xtrace_disable 00:25:19.577 16:04:24 bdev_raid.raid_state_function_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:19.577 ************************************ 00:25:19.577 END TEST raid_state_function_test_sb_4k 00:25:19.577 ************************************ 00:25:19.577 16:04:24 bdev_raid -- bdev/bdev_raid.sh@899 -- # run_test raid_superblock_test_4k raid_superblock_test raid1 2 00:25:19.577 16:04:24 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:25:19.577 16:04:24 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:19.577 16:04:24 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:19.577 ************************************ 00:25:19.577 START TEST raid_superblock_test_4k 00:25:19.577 ************************************ 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 
00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@398 -- # local strip_size 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@411 -- # raid_pid=2803925 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@412 -- # waitforlisten 2803925 /var/tmp/spdk-raid.sock 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@830 -- # '[' -z 2803925 ']' 00:25:19.577 16:04:24 
bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:19.577 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:19.577 16:04:24 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:19.577 [2024-06-10 16:04:25.040198] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:25:19.577 [2024-06-10 16:04:25.040250] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2803925 ] 00:25:19.836 [2024-06-10 16:04:25.140421] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:19.836 [2024-06-10 16:04:25.237596] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:19.836 [2024-06-10 16:04:25.299414] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:19.836 [2024-06-10 16:04:25.299453] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:20.772 16:04:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:20.772 16:04:25 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@863 -- # return 0 00:25:20.772 16:04:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:25:20.772 16:04:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 
00:25:20.772 16:04:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:25:20.772 16:04:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:25:20.772 16:04:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:25:20.772 16:04:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:20.772 16:04:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:20.772 16:04:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:20.772 16:04:25 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc1 00:25:20.772 malloc1 00:25:20.772 16:04:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:21.032 [2024-06-10 16:04:26.492759] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:21.032 [2024-06-10 16:04:26.492804] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:21.032 [2024-06-10 16:04:26.492823] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1ace0f0 00:25:21.032 [2024-06-10 16:04:26.492832] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:21.032 [2024-06-10 16:04:26.494549] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:21.032 [2024-06-10 16:04:26.494576] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:21.032 pt1 00:25:21.032 16:04:26 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@415 -- # (( i++ )) 00:25:21.032 16:04:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:21.032 16:04:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:25:21.032 16:04:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:25:21.032 16:04:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:25:21.032 16:04:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:25:21.032 16:04:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:25:21.032 16:04:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:25:21.032 16:04:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b malloc2 00:25:21.291 malloc2 00:25:21.291 16:04:26 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:21.550 [2024-06-10 16:04:27.010953] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:21.550 [2024-06-10 16:04:27.011000] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:21.550 [2024-06-10 16:04:27.011015] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1acf400 00:25:21.550 [2024-06-10 16:04:27.011025] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:21.550 [2024-06-10 16:04:27.012577] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:21.550 [2024-06-10 16:04:27.012604] 
vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:21.550 pt2 00:25:21.550 16:04:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:25:21.550 16:04:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:25:21.550 16:04:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:25:21.810 [2024-06-10 16:04:27.263625] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:21.810 [2024-06-10 16:04:27.264970] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:21.810 [2024-06-10 16:04:27.265121] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c7ae60 00:25:21.810 [2024-06-10 16:04:27.265133] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:21.810 [2024-06-10 16:04:27.265324] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1ae4fe0 00:25:21.810 [2024-06-10 16:04:27.265475] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c7ae60 00:25:21.810 [2024-06-10 16:04:27.265483] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c7ae60 00:25:21.810 [2024-06-10 16:04:27.265583] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:21.810 16:04:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:21.810 16:04:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:21.810 16:04:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:21.810 16:04:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local 
raid_level=raid1 00:25:21.810 16:04:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:21.810 16:04:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:21.810 16:04:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:21.810 16:04:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:21.810 16:04:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:21.810 16:04:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:21.810 16:04:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:21.810 16:04:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:22.069 16:04:27 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:22.069 "name": "raid_bdev1", 00:25:22.069 "uuid": "c12cab15-7cdd-408e-8cd6-4f1fe94c7cdd", 00:25:22.069 "strip_size_kb": 0, 00:25:22.069 "state": "online", 00:25:22.069 "raid_level": "raid1", 00:25:22.069 "superblock": true, 00:25:22.069 "num_base_bdevs": 2, 00:25:22.069 "num_base_bdevs_discovered": 2, 00:25:22.069 "num_base_bdevs_operational": 2, 00:25:22.069 "base_bdevs_list": [ 00:25:22.069 { 00:25:22.069 "name": "pt1", 00:25:22.069 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:22.069 "is_configured": true, 00:25:22.069 "data_offset": 256, 00:25:22.069 "data_size": 7936 00:25:22.069 }, 00:25:22.069 { 00:25:22.069 "name": "pt2", 00:25:22.069 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:22.069 "is_configured": true, 00:25:22.069 "data_offset": 256, 00:25:22.069 "data_size": 7936 00:25:22.069 } 00:25:22.069 ] 00:25:22.069 }' 00:25:22.069 16:04:27 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:22.069 16:04:27 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:23.006 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:25:23.006 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:25:23.006 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:23.006 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:23.006 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:23.006 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:25:23.006 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:23.006 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:23.006 [2024-06-10 16:04:28.406902] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:23.006 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:23.006 "name": "raid_bdev1", 00:25:23.006 "aliases": [ 00:25:23.006 "c12cab15-7cdd-408e-8cd6-4f1fe94c7cdd" 00:25:23.006 ], 00:25:23.006 "product_name": "Raid Volume", 00:25:23.006 "block_size": 4096, 00:25:23.006 "num_blocks": 7936, 00:25:23.006 "uuid": "c12cab15-7cdd-408e-8cd6-4f1fe94c7cdd", 00:25:23.006 "assigned_rate_limits": { 00:25:23.006 "rw_ios_per_sec": 0, 00:25:23.006 "rw_mbytes_per_sec": 0, 00:25:23.006 "r_mbytes_per_sec": 0, 00:25:23.006 "w_mbytes_per_sec": 0 00:25:23.006 }, 00:25:23.006 "claimed": false, 00:25:23.006 "zoned": false, 00:25:23.006 "supported_io_types": { 00:25:23.006 "read": true, 00:25:23.006 
"write": true, 00:25:23.006 "unmap": false, 00:25:23.006 "write_zeroes": true, 00:25:23.006 "flush": false, 00:25:23.006 "reset": true, 00:25:23.006 "compare": false, 00:25:23.006 "compare_and_write": false, 00:25:23.006 "abort": false, 00:25:23.006 "nvme_admin": false, 00:25:23.006 "nvme_io": false 00:25:23.006 }, 00:25:23.006 "memory_domains": [ 00:25:23.006 { 00:25:23.006 "dma_device_id": "system", 00:25:23.006 "dma_device_type": 1 00:25:23.006 }, 00:25:23.006 { 00:25:23.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:23.006 "dma_device_type": 2 00:25:23.006 }, 00:25:23.006 { 00:25:23.006 "dma_device_id": "system", 00:25:23.006 "dma_device_type": 1 00:25:23.006 }, 00:25:23.006 { 00:25:23.006 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:23.006 "dma_device_type": 2 00:25:23.006 } 00:25:23.006 ], 00:25:23.006 "driver_specific": { 00:25:23.006 "raid": { 00:25:23.006 "uuid": "c12cab15-7cdd-408e-8cd6-4f1fe94c7cdd", 00:25:23.006 "strip_size_kb": 0, 00:25:23.006 "state": "online", 00:25:23.006 "raid_level": "raid1", 00:25:23.006 "superblock": true, 00:25:23.006 "num_base_bdevs": 2, 00:25:23.006 "num_base_bdevs_discovered": 2, 00:25:23.006 "num_base_bdevs_operational": 2, 00:25:23.006 "base_bdevs_list": [ 00:25:23.006 { 00:25:23.006 "name": "pt1", 00:25:23.006 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:23.006 "is_configured": true, 00:25:23.006 "data_offset": 256, 00:25:23.006 "data_size": 7936 00:25:23.006 }, 00:25:23.006 { 00:25:23.006 "name": "pt2", 00:25:23.006 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:23.006 "is_configured": true, 00:25:23.006 "data_offset": 256, 00:25:23.006 "data_size": 7936 00:25:23.006 } 00:25:23.006 ] 00:25:23.006 } 00:25:23.006 } 00:25:23.006 }' 00:25:23.007 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:23.007 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # 
base_bdev_names='pt1 00:25:23.007 pt2' 00:25:23.007 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:23.007 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:25:23.007 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:23.266 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:23.266 "name": "pt1", 00:25:23.266 "aliases": [ 00:25:23.266 "00000000-0000-0000-0000-000000000001" 00:25:23.266 ], 00:25:23.266 "product_name": "passthru", 00:25:23.266 "block_size": 4096, 00:25:23.266 "num_blocks": 8192, 00:25:23.266 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:23.266 "assigned_rate_limits": { 00:25:23.266 "rw_ios_per_sec": 0, 00:25:23.266 "rw_mbytes_per_sec": 0, 00:25:23.266 "r_mbytes_per_sec": 0, 00:25:23.266 "w_mbytes_per_sec": 0 00:25:23.266 }, 00:25:23.266 "claimed": true, 00:25:23.266 "claim_type": "exclusive_write", 00:25:23.266 "zoned": false, 00:25:23.266 "supported_io_types": { 00:25:23.266 "read": true, 00:25:23.266 "write": true, 00:25:23.266 "unmap": true, 00:25:23.266 "write_zeroes": true, 00:25:23.266 "flush": true, 00:25:23.266 "reset": true, 00:25:23.266 "compare": false, 00:25:23.266 "compare_and_write": false, 00:25:23.266 "abort": true, 00:25:23.266 "nvme_admin": false, 00:25:23.266 "nvme_io": false 00:25:23.266 }, 00:25:23.266 "memory_domains": [ 00:25:23.266 { 00:25:23.266 "dma_device_id": "system", 00:25:23.266 "dma_device_type": 1 00:25:23.266 }, 00:25:23.266 { 00:25:23.266 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:23.266 "dma_device_type": 2 00:25:23.266 } 00:25:23.266 ], 00:25:23.266 "driver_specific": { 00:25:23.266 "passthru": { 00:25:23.266 "name": "pt1", 00:25:23.266 "base_bdev_name": "malloc1" 00:25:23.266 } 00:25:23.266 } 00:25:23.266 }' 00:25:23.266 
16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:23.525 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:23.525 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:23.525 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:23.525 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:23.525 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:23.525 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:23.525 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:23.525 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:23.525 16:04:28 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:23.525 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:23.785 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:23.785 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:23.785 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:25:23.785 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:24.044 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:24.044 "name": "pt2", 00:25:24.044 "aliases": [ 00:25:24.044 "00000000-0000-0000-0000-000000000002" 00:25:24.044 ], 00:25:24.044 "product_name": "passthru", 00:25:24.044 "block_size": 4096, 00:25:24.044 "num_blocks": 8192, 00:25:24.044 "uuid": 
"00000000-0000-0000-0000-000000000002", 00:25:24.044 "assigned_rate_limits": { 00:25:24.044 "rw_ios_per_sec": 0, 00:25:24.044 "rw_mbytes_per_sec": 0, 00:25:24.044 "r_mbytes_per_sec": 0, 00:25:24.044 "w_mbytes_per_sec": 0 00:25:24.044 }, 00:25:24.044 "claimed": true, 00:25:24.044 "claim_type": "exclusive_write", 00:25:24.044 "zoned": false, 00:25:24.044 "supported_io_types": { 00:25:24.044 "read": true, 00:25:24.044 "write": true, 00:25:24.044 "unmap": true, 00:25:24.044 "write_zeroes": true, 00:25:24.044 "flush": true, 00:25:24.044 "reset": true, 00:25:24.044 "compare": false, 00:25:24.044 "compare_and_write": false, 00:25:24.044 "abort": true, 00:25:24.044 "nvme_admin": false, 00:25:24.044 "nvme_io": false 00:25:24.044 }, 00:25:24.044 "memory_domains": [ 00:25:24.044 { 00:25:24.044 "dma_device_id": "system", 00:25:24.044 "dma_device_type": 1 00:25:24.044 }, 00:25:24.044 { 00:25:24.044 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:24.044 "dma_device_type": 2 00:25:24.044 } 00:25:24.044 ], 00:25:24.044 "driver_specific": { 00:25:24.044 "passthru": { 00:25:24.044 "name": "pt2", 00:25:24.044 "base_bdev_name": "malloc2" 00:25:24.044 } 00:25:24.044 } 00:25:24.044 }' 00:25:24.044 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:24.044 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:24.044 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:24.044 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:24.044 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:24.044 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:24.044 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:24.304 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 
00:25:24.304 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:24.304 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:24.304 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:24.304 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:24.304 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:24.304 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:25:24.562 [2024-06-10 16:04:29.939002] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:24.562 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=c12cab15-7cdd-408e-8cd6-4f1fe94c7cdd 00:25:24.562 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@435 -- # '[' -z c12cab15-7cdd-408e-8cd6-4f1fe94c7cdd ']' 00:25:24.562 16:04:29 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:24.822 [2024-06-10 16:04:30.195466] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:24.822 [2024-06-10 16:04:30.195485] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:24.822 [2024-06-10 16:04:30.195536] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:24.822 [2024-06-10 16:04:30.195587] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:24.822 [2024-06-10 16:04:30.195596] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c7ae60 name raid_bdev1, state offline 00:25:24.822 
16:04:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:24.822 16:04:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:25:25.081 16:04:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:25:25.081 16:04:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:25:25.081 16:04:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:25:25.081 16:04:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:25:25.340 16:04:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:25:25.340 16:04:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:25.599 16:04:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:25:25.599 16:04:30 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:25:25.858 16:04:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:25:25.858 16:04:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:25.858 16:04:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@649 -- # local es=0 00:25:25.858 16:04:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@651 -- 
# valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:25.858 16:04:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:25.858 16:04:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:25.858 16:04:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:25.858 16:04:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:25.858 16:04:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:25.858 16:04:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:25:25.858 16:04:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:25:25.858 16:04:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:25:25.858 16:04:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:25:26.117 [2024-06-10 16:04:31.466797] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:25:26.117 [2024-06-10 16:04:31.468224] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:25:26.117 [2024-06-10 16:04:31.468280] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev 
found on bdev malloc1 00:25:26.117 [2024-06-10 16:04:31.468316] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:25:26.117 [2024-06-10 16:04:31.468332] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:26.117 [2024-06-10 16:04:31.468340] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c78520 name raid_bdev1, state configuring 00:25:26.117 request: 00:25:26.117 { 00:25:26.117 "name": "raid_bdev1", 00:25:26.117 "raid_level": "raid1", 00:25:26.117 "base_bdevs": [ 00:25:26.117 "malloc1", 00:25:26.117 "malloc2" 00:25:26.117 ], 00:25:26.117 "superblock": false, 00:25:26.117 "method": "bdev_raid_create", 00:25:26.117 "req_id": 1 00:25:26.117 } 00:25:26.117 Got JSON-RPC error response 00:25:26.117 response: 00:25:26.117 { 00:25:26.117 "code": -17, 00:25:26.117 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:25:26.117 } 00:25:26.117 16:04:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@652 -- # es=1 00:25:26.117 16:04:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:25:26.117 16:04:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:25:26.117 16:04:31 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:25:26.117 16:04:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.117 16:04:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:25:26.379 16:04:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:25:26.379 16:04:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:25:26.380 16:04:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@464 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:26.639 [2024-06-10 16:04:31.976094] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:26.639 [2024-06-10 16:04:31.976134] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:26.639 [2024-06-10 16:04:31.976149] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c789b0 00:25:26.639 [2024-06-10 16:04:31.976158] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:26.639 [2024-06-10 16:04:31.977816] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:26.639 [2024-06-10 16:04:31.977843] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:26.639 [2024-06-10 16:04:31.977904] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:25:26.639 [2024-06-10 16:04:31.977928] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:26.639 pt1 00:25:26.639 16:04:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:25:26.639 16:04:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:26.639 16:04:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:25:26.639 16:04:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:26.639 16:04:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:26.639 16:04:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:26.639 16:04:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:26.639 16:04:31 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:26.639 16:04:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:26.639 16:04:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:26.639 16:04:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:26.639 16:04:31 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:26.897 16:04:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:26.897 "name": "raid_bdev1", 00:25:26.897 "uuid": "c12cab15-7cdd-408e-8cd6-4f1fe94c7cdd", 00:25:26.897 "strip_size_kb": 0, 00:25:26.897 "state": "configuring", 00:25:26.897 "raid_level": "raid1", 00:25:26.897 "superblock": true, 00:25:26.897 "num_base_bdevs": 2, 00:25:26.897 "num_base_bdevs_discovered": 1, 00:25:26.897 "num_base_bdevs_operational": 2, 00:25:26.897 "base_bdevs_list": [ 00:25:26.897 { 00:25:26.897 "name": "pt1", 00:25:26.897 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:26.897 "is_configured": true, 00:25:26.897 "data_offset": 256, 00:25:26.897 "data_size": 7936 00:25:26.897 }, 00:25:26.897 { 00:25:26.897 "name": null, 00:25:26.897 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:26.897 "is_configured": false, 00:25:26.897 "data_offset": 256, 00:25:26.897 "data_size": 7936 00:25:26.897 } 00:25:26.897 ] 00:25:26.897 }' 00:25:26.897 16:04:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:26.897 16:04:32 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:27.464 16:04:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:25:27.464 16:04:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 
00:25:27.464 16:04:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:25:27.464 16:04:32 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:27.723 [2024-06-10 16:04:33.111144] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:27.723 [2024-06-10 16:04:33.111191] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:27.723 [2024-06-10 16:04:33.111207] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1acd550 00:25:27.723 [2024-06-10 16:04:33.111216] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:27.723 [2024-06-10 16:04:33.111552] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:27.723 [2024-06-10 16:04:33.111568] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:27.723 [2024-06-10 16:04:33.111627] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:25:27.723 [2024-06-10 16:04:33.111644] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:27.723 [2024-06-10 16:04:33.111744] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c7f4a0 00:25:27.723 [2024-06-10 16:04:33.111753] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:27.723 [2024-06-10 16:04:33.111930] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1c78c80 00:25:27.723 [2024-06-10 16:04:33.112072] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c7f4a0 00:25:27.723 [2024-06-10 16:04:33.112081] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c7f4a0 00:25:27.723 [2024-06-10 
16:04:33.112181] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:27.723 pt2 00:25:27.723 16:04:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:25:27.723 16:04:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:25:27.723 16:04:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:27.723 16:04:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:27.723 16:04:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:27.723 16:04:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:27.723 16:04:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:27.723 16:04:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:27.723 16:04:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:27.723 16:04:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:27.723 16:04:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:27.723 16:04:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:27.723 16:04:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:27.723 16:04:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:27.982 16:04:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:27.982 "name": "raid_bdev1", 00:25:27.982 "uuid": "c12cab15-7cdd-408e-8cd6-4f1fe94c7cdd", 00:25:27.982 
"strip_size_kb": 0, 00:25:27.982 "state": "online", 00:25:27.982 "raid_level": "raid1", 00:25:27.982 "superblock": true, 00:25:27.982 "num_base_bdevs": 2, 00:25:27.982 "num_base_bdevs_discovered": 2, 00:25:27.982 "num_base_bdevs_operational": 2, 00:25:27.982 "base_bdevs_list": [ 00:25:27.982 { 00:25:27.982 "name": "pt1", 00:25:27.982 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:27.982 "is_configured": true, 00:25:27.982 "data_offset": 256, 00:25:27.982 "data_size": 7936 00:25:27.982 }, 00:25:27.982 { 00:25:27.982 "name": "pt2", 00:25:27.982 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:27.982 "is_configured": true, 00:25:27.982 "data_offset": 256, 00:25:27.982 "data_size": 7936 00:25:27.982 } 00:25:27.982 ] 00:25:27.982 }' 00:25:27.982 16:04:33 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:27.982 16:04:33 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:28.548 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:25:28.548 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:25:28.548 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:25:28.548 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:25:28.548 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:25:28.548 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@198 -- # local name 00:25:28.548 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:28.548 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:25:28.807 [2024-06-10 16:04:34.234385] bdev_raid.c:1107:raid_bdev_dump_info_json: 
*DEBUG*: raid_bdev_dump_config_json 00:25:28.807 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:25:28.807 "name": "raid_bdev1", 00:25:28.807 "aliases": [ 00:25:28.807 "c12cab15-7cdd-408e-8cd6-4f1fe94c7cdd" 00:25:28.807 ], 00:25:28.807 "product_name": "Raid Volume", 00:25:28.807 "block_size": 4096, 00:25:28.807 "num_blocks": 7936, 00:25:28.807 "uuid": "c12cab15-7cdd-408e-8cd6-4f1fe94c7cdd", 00:25:28.807 "assigned_rate_limits": { 00:25:28.807 "rw_ios_per_sec": 0, 00:25:28.807 "rw_mbytes_per_sec": 0, 00:25:28.807 "r_mbytes_per_sec": 0, 00:25:28.807 "w_mbytes_per_sec": 0 00:25:28.807 }, 00:25:28.807 "claimed": false, 00:25:28.807 "zoned": false, 00:25:28.807 "supported_io_types": { 00:25:28.807 "read": true, 00:25:28.807 "write": true, 00:25:28.807 "unmap": false, 00:25:28.807 "write_zeroes": true, 00:25:28.807 "flush": false, 00:25:28.807 "reset": true, 00:25:28.807 "compare": false, 00:25:28.807 "compare_and_write": false, 00:25:28.807 "abort": false, 00:25:28.807 "nvme_admin": false, 00:25:28.807 "nvme_io": false 00:25:28.807 }, 00:25:28.807 "memory_domains": [ 00:25:28.807 { 00:25:28.807 "dma_device_id": "system", 00:25:28.807 "dma_device_type": 1 00:25:28.807 }, 00:25:28.807 { 00:25:28.807 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:28.807 "dma_device_type": 2 00:25:28.807 }, 00:25:28.807 { 00:25:28.807 "dma_device_id": "system", 00:25:28.807 "dma_device_type": 1 00:25:28.807 }, 00:25:28.807 { 00:25:28.807 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:28.807 "dma_device_type": 2 00:25:28.807 } 00:25:28.807 ], 00:25:28.807 "driver_specific": { 00:25:28.807 "raid": { 00:25:28.807 "uuid": "c12cab15-7cdd-408e-8cd6-4f1fe94c7cdd", 00:25:28.807 "strip_size_kb": 0, 00:25:28.807 "state": "online", 00:25:28.807 "raid_level": "raid1", 00:25:28.807 "superblock": true, 00:25:28.807 "num_base_bdevs": 2, 00:25:28.807 "num_base_bdevs_discovered": 2, 00:25:28.807 "num_base_bdevs_operational": 2, 00:25:28.807 "base_bdevs_list": [ 
00:25:28.807 { 00:25:28.807 "name": "pt1", 00:25:28.807 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:28.807 "is_configured": true, 00:25:28.807 "data_offset": 256, 00:25:28.807 "data_size": 7936 00:25:28.807 }, 00:25:28.807 { 00:25:28.807 "name": "pt2", 00:25:28.807 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:28.807 "is_configured": true, 00:25:28.807 "data_offset": 256, 00:25:28.807 "data_size": 7936 00:25:28.807 } 00:25:28.807 ] 00:25:28.807 } 00:25:28.807 } 00:25:28.807 }' 00:25:28.807 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:25:28.807 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:25:28.807 pt2' 00:25:28.807 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:28.807 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:25:28.807 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:29.066 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:29.066 "name": "pt1", 00:25:29.066 "aliases": [ 00:25:29.066 "00000000-0000-0000-0000-000000000001" 00:25:29.066 ], 00:25:29.066 "product_name": "passthru", 00:25:29.066 "block_size": 4096, 00:25:29.066 "num_blocks": 8192, 00:25:29.066 "uuid": "00000000-0000-0000-0000-000000000001", 00:25:29.066 "assigned_rate_limits": { 00:25:29.066 "rw_ios_per_sec": 0, 00:25:29.066 "rw_mbytes_per_sec": 0, 00:25:29.066 "r_mbytes_per_sec": 0, 00:25:29.066 "w_mbytes_per_sec": 0 00:25:29.066 }, 00:25:29.066 "claimed": true, 00:25:29.066 "claim_type": "exclusive_write", 00:25:29.066 "zoned": false, 00:25:29.066 "supported_io_types": { 00:25:29.066 "read": true, 00:25:29.066 "write": true, 
00:25:29.066 "unmap": true, 00:25:29.066 "write_zeroes": true, 00:25:29.066 "flush": true, 00:25:29.066 "reset": true, 00:25:29.066 "compare": false, 00:25:29.066 "compare_and_write": false, 00:25:29.066 "abort": true, 00:25:29.066 "nvme_admin": false, 00:25:29.066 "nvme_io": false 00:25:29.066 }, 00:25:29.066 "memory_domains": [ 00:25:29.066 { 00:25:29.066 "dma_device_id": "system", 00:25:29.066 "dma_device_type": 1 00:25:29.066 }, 00:25:29.066 { 00:25:29.066 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:29.066 "dma_device_type": 2 00:25:29.066 } 00:25:29.066 ], 00:25:29.066 "driver_specific": { 00:25:29.066 "passthru": { 00:25:29.066 "name": "pt1", 00:25:29.066 "base_bdev_name": "malloc1" 00:25:29.066 } 00:25:29.066 } 00:25:29.066 }' 00:25:29.066 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:29.325 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:29.325 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:29.325 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:29.325 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:29.325 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:29.325 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:29.325 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:29.584 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:29.584 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:29.584 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:29.584 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:29.584 16:04:34 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:25:29.584 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:25:29.584 16:04:34 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:25:29.843 16:04:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:25:29.843 "name": "pt2", 00:25:29.843 "aliases": [ 00:25:29.843 "00000000-0000-0000-0000-000000000002" 00:25:29.843 ], 00:25:29.843 "product_name": "passthru", 00:25:29.843 "block_size": 4096, 00:25:29.843 "num_blocks": 8192, 00:25:29.843 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:29.843 "assigned_rate_limits": { 00:25:29.843 "rw_ios_per_sec": 0, 00:25:29.843 "rw_mbytes_per_sec": 0, 00:25:29.843 "r_mbytes_per_sec": 0, 00:25:29.843 "w_mbytes_per_sec": 0 00:25:29.843 }, 00:25:29.843 "claimed": true, 00:25:29.843 "claim_type": "exclusive_write", 00:25:29.843 "zoned": false, 00:25:29.843 "supported_io_types": { 00:25:29.843 "read": true, 00:25:29.843 "write": true, 00:25:29.843 "unmap": true, 00:25:29.843 "write_zeroes": true, 00:25:29.843 "flush": true, 00:25:29.843 "reset": true, 00:25:29.843 "compare": false, 00:25:29.843 "compare_and_write": false, 00:25:29.843 "abort": true, 00:25:29.843 "nvme_admin": false, 00:25:29.843 "nvme_io": false 00:25:29.843 }, 00:25:29.843 "memory_domains": [ 00:25:29.843 { 00:25:29.843 "dma_device_id": "system", 00:25:29.843 "dma_device_type": 1 00:25:29.843 }, 00:25:29.843 { 00:25:29.843 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:25:29.843 "dma_device_type": 2 00:25:29.843 } 00:25:29.843 ], 00:25:29.843 "driver_specific": { 00:25:29.843 "passthru": { 00:25:29.843 "name": "pt2", 00:25:29.843 "base_bdev_name": "malloc2" 00:25:29.843 } 00:25:29.843 } 00:25:29.843 }' 00:25:29.843 16:04:35 bdev_raid.raid_superblock_test_4k -- 
bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:29.843 16:04:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:25:29.843 16:04:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:25:29.843 16:04:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:29.843 16:04:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:25:30.104 16:04:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@206 -- # [[ null == null ]] 00:25:30.104 16:04:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:30.104 16:04:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:25:30.104 16:04:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@207 -- # [[ null == null ]] 00:25:30.104 16:04:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:30.104 16:04:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:25:30.104 16:04:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@208 -- # [[ null == null ]] 00:25:30.104 16:04:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:30.104 16:04:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:25:30.363 [2024-06-10 16:04:35.790547] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:30.363 16:04:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@486 -- # '[' c12cab15-7cdd-408e-8cd6-4f1fe94c7cdd '!=' c12cab15-7cdd-408e-8cd6-4f1fe94c7cdd ']' 00:25:30.364 16:04:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:25:30.364 16:04:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@213 -- # case $1 in 00:25:30.364 16:04:35 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@214 -- # return 0 00:25:30.364 16:04:35 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:25:30.650 [2024-06-10 16:04:36.047039] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:25:30.650 16:04:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:30.650 16:04:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:30.650 16:04:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:30.650 16:04:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:30.650 16:04:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:30.650 16:04:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:30.650 16:04:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:30.650 16:04:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:30.650 16:04:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:30.650 16:04:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:30.650 16:04:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:30.650 16:04:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:30.909 16:04:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:30.909 "name": "raid_bdev1", 00:25:30.909 "uuid": 
"c12cab15-7cdd-408e-8cd6-4f1fe94c7cdd", 00:25:30.909 "strip_size_kb": 0, 00:25:30.909 "state": "online", 00:25:30.909 "raid_level": "raid1", 00:25:30.909 "superblock": true, 00:25:30.909 "num_base_bdevs": 2, 00:25:30.909 "num_base_bdevs_discovered": 1, 00:25:30.909 "num_base_bdevs_operational": 1, 00:25:30.909 "base_bdevs_list": [ 00:25:30.909 { 00:25:30.909 "name": null, 00:25:30.909 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:30.909 "is_configured": false, 00:25:30.909 "data_offset": 256, 00:25:30.909 "data_size": 7936 00:25:30.909 }, 00:25:30.909 { 00:25:30.909 "name": "pt2", 00:25:30.909 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:30.909 "is_configured": true, 00:25:30.909 "data_offset": 256, 00:25:30.909 "data_size": 7936 00:25:30.909 } 00:25:30.909 ] 00:25:30.909 }' 00:25:30.909 16:04:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:30.909 16:04:36 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:31.477 16:04:36 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:31.736 [2024-06-10 16:04:37.190163] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:31.736 [2024-06-10 16:04:37.190187] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:31.736 [2024-06-10 16:04:37.190240] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:31.736 [2024-06-10 16:04:37.190281] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:31.736 [2024-06-10 16:04:37.190290] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c7f4a0 name raid_bdev1, state offline 00:25:31.736 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:31.736 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:25:31.994 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:25:31.994 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:25:31.994 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:25:31.994 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:25:31.994 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:25:32.268 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:25:32.268 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:25:32.268 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:25:32.268 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:25:32.268 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@518 -- # i=1 00:25:32.268 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:25:32.535 [2024-06-10 16:04:37.948144] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:25:32.535 [2024-06-10 16:04:37.948184] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:32.535 [2024-06-10 16:04:37.948200] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c7f250 00:25:32.535 [2024-06-10 16:04:37.948210] 
vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:32.535 [2024-06-10 16:04:37.949885] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:32.535 [2024-06-10 16:04:37.949912] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:25:32.535 [2024-06-10 16:04:37.949984] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:25:32.535 [2024-06-10 16:04:37.950009] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:32.536 [2024-06-10 16:04:37.950093] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1accb00 00:25:32.536 [2024-06-10 16:04:37.950102] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:32.536 [2024-06-10 16:04:37.950289] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1acdcf0 00:25:32.536 [2024-06-10 16:04:37.950415] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1accb00 00:25:32.536 [2024-06-10 16:04:37.950424] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1accb00 00:25:32.536 [2024-06-10 16:04:37.950523] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:32.536 pt2 00:25:32.536 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:32.536 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:32.536 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:32.536 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:32.536 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:32.536 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # 
local num_base_bdevs_operational=1 00:25:32.536 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:32.536 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:32.536 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:32.536 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:32.536 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:32.536 16:04:37 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:32.794 16:04:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:32.794 "name": "raid_bdev1", 00:25:32.794 "uuid": "c12cab15-7cdd-408e-8cd6-4f1fe94c7cdd", 00:25:32.794 "strip_size_kb": 0, 00:25:32.794 "state": "online", 00:25:32.794 "raid_level": "raid1", 00:25:32.794 "superblock": true, 00:25:32.794 "num_base_bdevs": 2, 00:25:32.794 "num_base_bdevs_discovered": 1, 00:25:32.794 "num_base_bdevs_operational": 1, 00:25:32.794 "base_bdevs_list": [ 00:25:32.794 { 00:25:32.794 "name": null, 00:25:32.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:32.794 "is_configured": false, 00:25:32.794 "data_offset": 256, 00:25:32.794 "data_size": 7936 00:25:32.794 }, 00:25:32.794 { 00:25:32.794 "name": "pt2", 00:25:32.794 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:32.794 "is_configured": true, 00:25:32.794 "data_offset": 256, 00:25:32.794 "data_size": 7936 00:25:32.794 } 00:25:32.794 ] 00:25:32.794 }' 00:25:32.794 16:04:38 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:32.794 16:04:38 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:33.361 16:04:38 
bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:33.619 [2024-06-10 16:04:39.087181] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:33.619 [2024-06-10 16:04:39.087205] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:25:33.619 [2024-06-10 16:04:39.087254] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:33.619 [2024-06-10 16:04:39.087295] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:33.619 [2024-06-10 16:04:39.087304] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1accb00 name raid_bdev1, state offline 00:25:33.619 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:33.619 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:25:33.878 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:25:33.878 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:25:33.878 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:25:33.878 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:25:34.137 [2024-06-10 16:04:39.608557] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:25:34.137 [2024-06-10 16:04:39.608598] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:34.137 [2024-06-10 16:04:39.608612] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1c773a0 00:25:34.137 [2024-06-10 16:04:39.608621] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:34.137 [2024-06-10 16:04:39.610300] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:34.137 [2024-06-10 16:04:39.610325] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:25:34.137 [2024-06-10 16:04:39.610388] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:25:34.137 [2024-06-10 16:04:39.610411] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:25:34.137 [2024-06-10 16:04:39.610512] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:25:34.137 [2024-06-10 16:04:39.610522] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:34.137 [2024-06-10 16:04:39.610534] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c7b8e0 name raid_bdev1, state configuring 00:25:34.137 [2024-06-10 16:04:39.610554] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:25:34.137 [2024-06-10 16:04:39.610611] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1c7ec40 00:25:34.137 [2024-06-10 16:04:39.610620] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:34.137 [2024-06-10 16:04:39.610790] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1acec70 00:25:34.137 [2024-06-10 16:04:39.610915] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1c7ec40 00:25:34.137 [2024-06-10 16:04:39.610922] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1c7ec40 00:25:34.137 [2024-06-10 16:04:39.611034] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 
00:25:34.137 pt1 00:25:34.137 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:25:34.137 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:34.137 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:34.137 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:34.137 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:34.137 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:34.137 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:34.137 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:34.137 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:34.137 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:34.137 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:34.137 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:34.137 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:34.397 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:34.397 "name": "raid_bdev1", 00:25:34.397 "uuid": "c12cab15-7cdd-408e-8cd6-4f1fe94c7cdd", 00:25:34.397 "strip_size_kb": 0, 00:25:34.397 "state": "online", 00:25:34.397 "raid_level": "raid1", 00:25:34.397 "superblock": true, 00:25:34.397 "num_base_bdevs": 2, 00:25:34.397 "num_base_bdevs_discovered": 1, 
00:25:34.397 "num_base_bdevs_operational": 1, 00:25:34.397 "base_bdevs_list": [ 00:25:34.397 { 00:25:34.397 "name": null, 00:25:34.397 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:34.397 "is_configured": false, 00:25:34.397 "data_offset": 256, 00:25:34.397 "data_size": 7936 00:25:34.397 }, 00:25:34.397 { 00:25:34.397 "name": "pt2", 00:25:34.397 "uuid": "00000000-0000-0000-0000-000000000002", 00:25:34.397 "is_configured": true, 00:25:34.397 "data_offset": 256, 00:25:34.397 "data_size": 7936 00:25:34.397 } 00:25:34.397 ] 00:25:34.397 }' 00:25:34.397 16:04:39 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:34.397 16:04:39 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:35.333 16:04:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:25:35.333 16:04:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:25:35.333 16:04:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:25:35.333 16:04:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:35.333 16:04:40 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:25:35.591 [2024-06-10 16:04:41.020539] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:35.591 16:04:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@557 -- # '[' c12cab15-7cdd-408e-8cd6-4f1fe94c7cdd '!=' c12cab15-7cdd-408e-8cd6-4f1fe94c7cdd ']' 00:25:35.591 16:04:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@562 -- # killprocess 2803925 00:25:35.591 16:04:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@949 -- 
# '[' -z 2803925 ']' 00:25:35.591 16:04:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@953 -- # kill -0 2803925 00:25:35.591 16:04:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # uname 00:25:35.591 16:04:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:25:35.591 16:04:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2803925 00:25:35.591 16:04:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:25:35.591 16:04:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:25:35.591 16:04:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2803925' 00:25:35.591 killing process with pid 2803925 00:25:35.591 16:04:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@968 -- # kill 2803925 00:25:35.591 [2024-06-10 16:04:41.086290] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:25:35.591 [2024-06-10 16:04:41.086342] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:35.591 [2024-06-10 16:04:41.086381] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:35.591 [2024-06-10 16:04:41.086389] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1c7ec40 name raid_bdev1, state offline 00:25:35.591 16:04:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@973 -- # wait 2803925 00:25:35.850 [2024-06-10 16:04:41.102924] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:25:35.850 16:04:41 bdev_raid.raid_superblock_test_4k -- bdev/bdev_raid.sh@564 -- # return 0 00:25:35.850 00:25:35.850 real 0m16.318s 00:25:35.850 user 0m30.295s 00:25:35.850 sys 0m2.384s 00:25:35.850 16:04:41 bdev_raid.raid_superblock_test_4k -- 
common/autotest_common.sh@1125 -- # xtrace_disable 00:25:35.850 16:04:41 bdev_raid.raid_superblock_test_4k -- common/autotest_common.sh@10 -- # set +x 00:25:35.850 ************************************ 00:25:35.850 END TEST raid_superblock_test_4k 00:25:35.850 ************************************ 00:25:35.850 16:04:41 bdev_raid -- bdev/bdev_raid.sh@900 -- # '[' true = true ']' 00:25:35.850 16:04:41 bdev_raid -- bdev/bdev_raid.sh@901 -- # run_test raid_rebuild_test_sb_4k raid_rebuild_test raid1 2 true false true 00:25:35.850 16:04:41 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:25:35.850 16:04:41 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:25:35.850 16:04:41 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:25:36.109 ************************************ 00:25:36.110 START TEST raid_rebuild_test_sb_4k 00:25:36.110 ************************************ 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false true 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@572 -- # local verify=true 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 
00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@575 -- # local strip_size 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@576 -- # local create_arg 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@578 -- # local data_offset 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@596 -- # raid_pid=2806949 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@597 -- # waitforlisten 2806949 /var/tmp/spdk-raid.sock 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r 
/var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@830 -- # '[' -z 2806949 ']' 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@835 -- # local max_retries=100 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:25:36.110 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@839 -- # xtrace_disable 00:25:36.110 16:04:41 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:36.110 [2024-06-10 16:04:41.439174] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:25:36.110 [2024-06-10 16:04:41.439229] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2806949 ] 00:25:36.110 I/O size of 3145728 is greater than zero copy threshold (65536). 00:25:36.110 Zero copy mechanism will not be used. 
00:25:36.110 [2024-06-10 16:04:41.539271] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:25:36.368 [2024-06-10 16:04:41.633893] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:25:36.368 [2024-06-10 16:04:41.697059] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:36.368 [2024-06-10 16:04:41.697100] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:25:36.934 16:04:42 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:25:36.934 16:04:42 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@863 -- # return 0 00:25:36.934 16:04:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:36.934 16:04:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev1_malloc 00:25:37.192 BaseBdev1_malloc 00:25:37.192 16:04:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:25:37.463 [2024-06-10 16:04:42.887009] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:25:37.463 [2024-06-10 16:04:42.887052] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:37.463 [2024-06-10 16:04:42.887075] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe98e90 00:25:37.463 [2024-06-10 16:04:42.887085] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:37.463 [2024-06-10 16:04:42.888789] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:37.463 [2024-06-10 16:04:42.888816] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:25:37.463 
BaseBdev1 00:25:37.463 16:04:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:25:37.463 16:04:42 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b BaseBdev2_malloc 00:25:37.720 BaseBdev2_malloc 00:25:37.720 16:04:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:25:37.978 [2024-06-10 16:04:43.401079] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:25:37.978 [2024-06-10 16:04:43.401120] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:37.978 [2024-06-10 16:04:43.401143] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe999e0 00:25:37.978 [2024-06-10 16:04:43.401153] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:37.979 [2024-06-10 16:04:43.402701] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:37.979 [2024-06-10 16:04:43.402727] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:25:37.979 BaseBdev2 00:25:37.979 16:04:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -b spare_malloc 00:25:38.237 spare_malloc 00:25:38.237 16:04:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:25:38.496 spare_delay 00:25:38.496 16:04:43 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@608 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:38.754 [2024-06-10 16:04:44.151725] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:38.754 [2024-06-10 16:04:44.151766] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:38.754 [2024-06-10 16:04:44.151785] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1047890 00:25:38.754 [2024-06-10 16:04:44.151795] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:38.754 [2024-06-10 16:04:44.153388] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:38.754 [2024-06-10 16:04:44.153415] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:38.754 spare 00:25:38.755 16:04:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:25:39.013 [2024-06-10 16:04:44.404415] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:25:39.013 [2024-06-10 16:04:44.405740] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:39.013 [2024-06-10 16:04:44.405911] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1046da0 00:25:39.013 [2024-06-10 16:04:44.405924] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:39.013 [2024-06-10 16:04:44.406134] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe97970 00:25:39.013 [2024-06-10 16:04:44.406280] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1046da0 00:25:39.013 [2024-06-10 16:04:44.406289] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0x1046da0 00:25:39.013 [2024-06-10 16:04:44.406386] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:39.013 16:04:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:39.013 16:04:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:39.013 16:04:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:39.013 16:04:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:39.013 16:04:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:39.013 16:04:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:39.013 16:04:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:39.013 16:04:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:39.013 16:04:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:39.013 16:04:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:39.013 16:04:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:39.013 16:04:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:39.271 16:04:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:39.271 "name": "raid_bdev1", 00:25:39.271 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:25:39.271 "strip_size_kb": 0, 00:25:39.271 "state": "online", 00:25:39.271 "raid_level": "raid1", 00:25:39.271 "superblock": true, 00:25:39.271 "num_base_bdevs": 2, 00:25:39.271 
"num_base_bdevs_discovered": 2, 00:25:39.271 "num_base_bdevs_operational": 2, 00:25:39.271 "base_bdevs_list": [ 00:25:39.271 { 00:25:39.271 "name": "BaseBdev1", 00:25:39.271 "uuid": "dd20f5ea-ae47-5ab4-a365-1ea2990aee89", 00:25:39.271 "is_configured": true, 00:25:39.271 "data_offset": 256, 00:25:39.271 "data_size": 7936 00:25:39.271 }, 00:25:39.271 { 00:25:39.271 "name": "BaseBdev2", 00:25:39.271 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:25:39.271 "is_configured": true, 00:25:39.271 "data_offset": 256, 00:25:39.271 "data_size": 7936 00:25:39.271 } 00:25:39.271 ] 00:25:39.271 }' 00:25:39.271 16:04:44 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:39.271 16:04:44 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:39.838 16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:25:39.838 16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:25:40.097 [2024-06-10 16:04:45.551700] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:25:40.097 16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:25:40.097 16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:40.097 16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:25:40.355 16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:25:40.355 16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:25:40.355 16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:25:40.355 
16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:25:40.355 16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:25:40.355 16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:40.355 16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:25:40.355 16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:40.355 16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:25:40.355 16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:40.355 16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:25:40.355 16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:25:40.355 16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:40.355 16:04:45 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:25:40.614 [2024-06-10 16:04:46.072896] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1049930 00:25:40.614 /dev/nbd0 00:25:40.614 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:40.614 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:40.614 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:25:40.614 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local i 00:25:40.614 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:25:40.614 16:04:46 
bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:40.614 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:25:40.614 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # break 00:25:40.614 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:40.614 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:40.614 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:40.614 1+0 records in 00:25:40.614 1+0 records out 00:25:40.614 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000230262 s, 17.8 MB/s 00:25:40.614 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:40.873 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # size=4096 00:25:40.873 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:40.873 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:40.873 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # return 0 00:25:40.873 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:40.873 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:25:40.873 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:25:40.873 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:25:40.873 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:25:41.442 7936+0 records in 00:25:41.442 7936+0 records out 00:25:41.442 32505856 bytes (33 MB, 31 MiB) copied, 0.724177 s, 44.9 MB/s 00:25:41.442 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:25:41.442 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:41.442 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:25:41.442 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:41.442 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:25:41.442 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:41.442 16:04:46 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:41.701 [2024-06-10 16:04:47.133600] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:41.701 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:41.701 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:41.701 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:41.701 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:41.701 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:41.701 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:41.701 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:25:41.701 16:04:47 bdev_raid.raid_rebuild_test_sb_4k 
-- bdev/nbd_common.sh@45 -- # return 0 00:25:41.701 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:25:41.960 [2024-06-10 16:04:47.378300] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:25:41.960 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:41.960 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:41.960 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:41.960 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:41.960 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:41.960 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:41.960 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:41.960 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:41.960 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:41.960 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:41.960 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:41.960 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:42.219 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:42.219 "name": "raid_bdev1", 00:25:42.219 "uuid": 
"b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:25:42.219 "strip_size_kb": 0, 00:25:42.219 "state": "online", 00:25:42.219 "raid_level": "raid1", 00:25:42.219 "superblock": true, 00:25:42.219 "num_base_bdevs": 2, 00:25:42.219 "num_base_bdevs_discovered": 1, 00:25:42.219 "num_base_bdevs_operational": 1, 00:25:42.219 "base_bdevs_list": [ 00:25:42.219 { 00:25:42.219 "name": null, 00:25:42.219 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:42.219 "is_configured": false, 00:25:42.219 "data_offset": 256, 00:25:42.219 "data_size": 7936 00:25:42.219 }, 00:25:42.219 { 00:25:42.219 "name": "BaseBdev2", 00:25:42.219 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:25:42.219 "is_configured": true, 00:25:42.219 "data_offset": 256, 00:25:42.219 "data_size": 7936 00:25:42.219 } 00:25:42.219 ] 00:25:42.219 }' 00:25:42.219 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:42.219 16:04:47 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:42.787 16:04:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:43.046 [2024-06-10 16:04:48.521379] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:43.046 [2024-06-10 16:04:48.526235] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe98570 00:25:43.046 [2024-06-10 16:04:48.528298] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:43.046 16:04:48 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@646 -- # sleep 1 00:25:44.424 16:04:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:44.424 16:04:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:44.424 16:04:49 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:44.424 16:04:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:44.424 16:04:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:44.424 16:04:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:44.424 16:04:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:44.424 16:04:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:44.424 "name": "raid_bdev1", 00:25:44.424 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:25:44.424 "strip_size_kb": 0, 00:25:44.424 "state": "online", 00:25:44.424 "raid_level": "raid1", 00:25:44.424 "superblock": true, 00:25:44.424 "num_base_bdevs": 2, 00:25:44.424 "num_base_bdevs_discovered": 2, 00:25:44.424 "num_base_bdevs_operational": 2, 00:25:44.424 "process": { 00:25:44.424 "type": "rebuild", 00:25:44.424 "target": "spare", 00:25:44.424 "progress": { 00:25:44.424 "blocks": 3072, 00:25:44.424 "percent": 38 00:25:44.424 } 00:25:44.424 }, 00:25:44.424 "base_bdevs_list": [ 00:25:44.424 { 00:25:44.424 "name": "spare", 00:25:44.424 "uuid": "94f2d8af-7594-5bac-8cdd-39723e02bac7", 00:25:44.424 "is_configured": true, 00:25:44.424 "data_offset": 256, 00:25:44.424 "data_size": 7936 00:25:44.424 }, 00:25:44.424 { 00:25:44.424 "name": "BaseBdev2", 00:25:44.424 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:25:44.424 "is_configured": true, 00:25:44.424 "data_offset": 256, 00:25:44.424 "data_size": 7936 00:25:44.424 } 00:25:44.424 ] 00:25:44.424 }' 00:25:44.424 16:04:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:44.424 16:04:49 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:44.424 16:04:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:44.424 16:04:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:44.424 16:04:49 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:44.683 [2024-06-10 16:04:50.131429] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:44.683 [2024-06-10 16:04:50.140776] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:44.683 [2024-06-10 16:04:50.140818] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:44.683 [2024-06-10 16:04:50.140833] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:44.683 [2024-06-10 16:04:50.140840] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:44.683 16:04:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:44.683 16:04:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:44.683 16:04:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:44.683 16:04:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:44.683 16:04:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:44.683 16:04:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:44.683 16:04:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:44.683 16:04:50 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:44.683 16:04:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:44.683 16:04:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:44.683 16:04:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:44.683 16:04:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:44.942 16:04:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:44.942 "name": "raid_bdev1", 00:25:44.942 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:25:44.942 "strip_size_kb": 0, 00:25:44.942 "state": "online", 00:25:44.942 "raid_level": "raid1", 00:25:44.942 "superblock": true, 00:25:44.942 "num_base_bdevs": 2, 00:25:44.942 "num_base_bdevs_discovered": 1, 00:25:44.942 "num_base_bdevs_operational": 1, 00:25:44.942 "base_bdevs_list": [ 00:25:44.942 { 00:25:44.942 "name": null, 00:25:44.942 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:44.942 "is_configured": false, 00:25:44.942 "data_offset": 256, 00:25:44.942 "data_size": 7936 00:25:44.942 }, 00:25:44.942 { 00:25:44.942 "name": "BaseBdev2", 00:25:44.942 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:25:44.942 "is_configured": true, 00:25:44.942 "data_offset": 256, 00:25:44.942 "data_size": 7936 00:25:44.942 } 00:25:44.942 ] 00:25:44.942 }' 00:25:44.942 16:04:50 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:44.942 16:04:50 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:45.548 16:04:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:45.548 16:04:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local 
raid_bdev_name=raid_bdev1 00:25:45.548 16:04:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:45.548 16:04:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:45.548 16:04:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:45.548 16:04:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:45.548 16:04:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:45.806 16:04:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:45.807 "name": "raid_bdev1", 00:25:45.807 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:25:45.807 "strip_size_kb": 0, 00:25:45.807 "state": "online", 00:25:45.807 "raid_level": "raid1", 00:25:45.807 "superblock": true, 00:25:45.807 "num_base_bdevs": 2, 00:25:45.807 "num_base_bdevs_discovered": 1, 00:25:45.807 "num_base_bdevs_operational": 1, 00:25:45.807 "base_bdevs_list": [ 00:25:45.807 { 00:25:45.807 "name": null, 00:25:45.807 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:45.807 "is_configured": false, 00:25:45.807 "data_offset": 256, 00:25:45.807 "data_size": 7936 00:25:45.807 }, 00:25:45.807 { 00:25:45.807 "name": "BaseBdev2", 00:25:45.807 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:25:45.807 "is_configured": true, 00:25:45.807 "data_offset": 256, 00:25:45.807 "data_size": 7936 00:25:45.807 } 00:25:45.807 ] 00:25:45.807 }' 00:25:45.807 16:04:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:46.066 16:04:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:46.066 16:04:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 
00:25:46.066 16:04:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:46.066 16:04:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:46.325 [2024-06-10 16:04:51.637272] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:46.325 [2024-06-10 16:04:51.642132] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1047eb0 00:25:46.325 [2024-06-10 16:04:51.643651] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:46.325 16:04:51 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@662 -- # sleep 1 00:25:47.260 16:04:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:47.260 16:04:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:47.260 16:04:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:47.260 16:04:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:47.260 16:04:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:47.260 16:04:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:47.260 16:04:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:47.519 16:04:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:47.519 "name": "raid_bdev1", 00:25:47.519 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:25:47.519 "strip_size_kb": 0, 00:25:47.519 "state": "online", 00:25:47.519 
"raid_level": "raid1", 00:25:47.519 "superblock": true, 00:25:47.519 "num_base_bdevs": 2, 00:25:47.519 "num_base_bdevs_discovered": 2, 00:25:47.519 "num_base_bdevs_operational": 2, 00:25:47.519 "process": { 00:25:47.519 "type": "rebuild", 00:25:47.519 "target": "spare", 00:25:47.519 "progress": { 00:25:47.519 "blocks": 3072, 00:25:47.519 "percent": 38 00:25:47.519 } 00:25:47.519 }, 00:25:47.519 "base_bdevs_list": [ 00:25:47.519 { 00:25:47.519 "name": "spare", 00:25:47.519 "uuid": "94f2d8af-7594-5bac-8cdd-39723e02bac7", 00:25:47.519 "is_configured": true, 00:25:47.519 "data_offset": 256, 00:25:47.519 "data_size": 7936 00:25:47.519 }, 00:25:47.519 { 00:25:47.519 "name": "BaseBdev2", 00:25:47.519 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:25:47.519 "is_configured": true, 00:25:47.519 "data_offset": 256, 00:25:47.519 "data_size": 7936 00:25:47.519 } 00:25:47.519 ] 00:25:47.519 }' 00:25:47.519 16:04:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:47.519 16:04:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:47.519 16:04:52 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:47.519 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:47.519 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:25:47.519 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:25:47.519 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:25:47.519 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:25:47.519 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:25:47.519 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:25:47.519 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@705 -- # local timeout=1015 00:25:47.519 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:47.519 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:47.519 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:47.519 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:47.519 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:47.519 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:47.519 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:47.519 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:47.777 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:47.777 "name": "raid_bdev1", 00:25:47.777 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:25:47.777 "strip_size_kb": 0, 00:25:47.777 "state": "online", 00:25:47.777 "raid_level": "raid1", 00:25:47.777 "superblock": true, 00:25:47.777 "num_base_bdevs": 2, 00:25:47.777 "num_base_bdevs_discovered": 2, 00:25:47.777 "num_base_bdevs_operational": 2, 00:25:47.777 "process": { 00:25:47.777 "type": "rebuild", 00:25:47.777 "target": "spare", 00:25:47.777 "progress": { 00:25:47.777 "blocks": 4096, 00:25:47.777 "percent": 51 00:25:47.777 } 00:25:47.777 }, 00:25:47.777 "base_bdevs_list": [ 00:25:47.777 { 00:25:47.777 "name": "spare", 00:25:47.777 "uuid": "94f2d8af-7594-5bac-8cdd-39723e02bac7", 00:25:47.777 "is_configured": 
true, 00:25:47.777 "data_offset": 256, 00:25:47.777 "data_size": 7936 00:25:47.777 }, 00:25:47.777 { 00:25:47.777 "name": "BaseBdev2", 00:25:47.777 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:25:47.777 "is_configured": true, 00:25:47.777 "data_offset": 256, 00:25:47.777 "data_size": 7936 00:25:47.777 } 00:25:47.777 ] 00:25:47.777 }' 00:25:47.777 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:48.036 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:48.036 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:48.036 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:48.036 16:04:53 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:48.972 16:04:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:48.972 16:04:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:48.972 16:04:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:48.972 16:04:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:48.972 16:04:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:48.972 16:04:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:48.972 16:04:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:48.972 16:04:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:49.230 16:04:54 bdev_raid.raid_rebuild_test_sb_4k -- 
bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:49.230 "name": "raid_bdev1", 00:25:49.230 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:25:49.230 "strip_size_kb": 0, 00:25:49.230 "state": "online", 00:25:49.230 "raid_level": "raid1", 00:25:49.230 "superblock": true, 00:25:49.230 "num_base_bdevs": 2, 00:25:49.230 "num_base_bdevs_discovered": 2, 00:25:49.230 "num_base_bdevs_operational": 2, 00:25:49.230 "process": { 00:25:49.230 "type": "rebuild", 00:25:49.230 "target": "spare", 00:25:49.230 "progress": { 00:25:49.230 "blocks": 7424, 00:25:49.230 "percent": 93 00:25:49.230 } 00:25:49.230 }, 00:25:49.230 "base_bdevs_list": [ 00:25:49.230 { 00:25:49.230 "name": "spare", 00:25:49.230 "uuid": "94f2d8af-7594-5bac-8cdd-39723e02bac7", 00:25:49.230 "is_configured": true, 00:25:49.230 "data_offset": 256, 00:25:49.230 "data_size": 7936 00:25:49.230 }, 00:25:49.230 { 00:25:49.230 "name": "BaseBdev2", 00:25:49.230 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:25:49.230 "is_configured": true, 00:25:49.230 "data_offset": 256, 00:25:49.230 "data_size": 7936 00:25:49.230 } 00:25:49.230 ] 00:25:49.230 }' 00:25:49.230 16:04:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:49.230 16:04:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:49.230 16:04:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:49.230 16:04:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:49.230 16:04:54 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@710 -- # sleep 1 00:25:49.489 [2024-06-10 16:04:54.767098] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:25:49.489 [2024-06-10 16:04:54.767154] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:25:49.489 [2024-06-10 16:04:54.767231] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:50.425 16:04:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:25:50.425 16:04:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:50.425 16:04:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:50.425 16:04:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:50.425 16:04:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:50.425 16:04:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:50.425 16:04:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:50.425 16:04:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:50.684 16:04:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:50.684 "name": "raid_bdev1", 00:25:50.684 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:25:50.684 "strip_size_kb": 0, 00:25:50.684 "state": "online", 00:25:50.684 "raid_level": "raid1", 00:25:50.684 "superblock": true, 00:25:50.684 "num_base_bdevs": 2, 00:25:50.684 "num_base_bdevs_discovered": 2, 00:25:50.684 "num_base_bdevs_operational": 2, 00:25:50.684 "base_bdevs_list": [ 00:25:50.684 { 00:25:50.684 "name": "spare", 00:25:50.684 "uuid": "94f2d8af-7594-5bac-8cdd-39723e02bac7", 00:25:50.684 "is_configured": true, 00:25:50.684 "data_offset": 256, 00:25:50.684 "data_size": 7936 00:25:50.684 }, 00:25:50.684 { 00:25:50.684 "name": "BaseBdev2", 00:25:50.684 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:25:50.684 "is_configured": true, 00:25:50.684 "data_offset": 256, 00:25:50.684 
"data_size": 7936 00:25:50.684 } 00:25:50.684 ] 00:25:50.684 }' 00:25:50.684 16:04:55 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:50.684 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:25:50.684 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:50.684 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:25:50.684 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@708 -- # break 00:25:50.684 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:50.684 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:50.684 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:50.684 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:50.684 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:50.684 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:50.684 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:50.943 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:50.944 "name": "raid_bdev1", 00:25:50.944 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:25:50.944 "strip_size_kb": 0, 00:25:50.944 "state": "online", 00:25:50.944 "raid_level": "raid1", 00:25:50.944 "superblock": true, 00:25:50.944 "num_base_bdevs": 2, 00:25:50.944 "num_base_bdevs_discovered": 2, 00:25:50.944 "num_base_bdevs_operational": 2, 00:25:50.944 
"base_bdevs_list": [ 00:25:50.944 { 00:25:50.944 "name": "spare", 00:25:50.944 "uuid": "94f2d8af-7594-5bac-8cdd-39723e02bac7", 00:25:50.944 "is_configured": true, 00:25:50.944 "data_offset": 256, 00:25:50.944 "data_size": 7936 00:25:50.944 }, 00:25:50.944 { 00:25:50.944 "name": "BaseBdev2", 00:25:50.944 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:25:50.944 "is_configured": true, 00:25:50.944 "data_offset": 256, 00:25:50.944 "data_size": 7936 00:25:50.944 } 00:25:50.944 ] 00:25:50.944 }' 00:25:50.944 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:50.944 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:50.944 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:50.944 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:50.944 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:50.944 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:50.944 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:50.944 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:50.944 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:50.944 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:50.944 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:50.944 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:50.944 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 
00:25:50.944 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:51.203 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:51.203 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:51.203 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:51.203 "name": "raid_bdev1", 00:25:51.203 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:25:51.203 "strip_size_kb": 0, 00:25:51.203 "state": "online", 00:25:51.203 "raid_level": "raid1", 00:25:51.203 "superblock": true, 00:25:51.203 "num_base_bdevs": 2, 00:25:51.203 "num_base_bdevs_discovered": 2, 00:25:51.203 "num_base_bdevs_operational": 2, 00:25:51.203 "base_bdevs_list": [ 00:25:51.203 { 00:25:51.203 "name": "spare", 00:25:51.203 "uuid": "94f2d8af-7594-5bac-8cdd-39723e02bac7", 00:25:51.203 "is_configured": true, 00:25:51.203 "data_offset": 256, 00:25:51.203 "data_size": 7936 00:25:51.203 }, 00:25:51.203 { 00:25:51.203 "name": "BaseBdev2", 00:25:51.203 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:25:51.203 "is_configured": true, 00:25:51.203 "data_offset": 256, 00:25:51.203 "data_size": 7936 00:25:51.203 } 00:25:51.203 ] 00:25:51.203 }' 00:25:51.203 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:51.203 16:04:56 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:52.140 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:25:52.140 [2024-06-10 16:04:57.498876] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:25:52.140 [2024-06-10 16:04:57.498905] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:25:52.140 [2024-06-10 16:04:57.498967] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:25:52.140 [2024-06-10 16:04:57.499025] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:25:52.140 [2024-06-10 16:04:57.499034] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1046da0 name raid_bdev1, state offline 00:25:52.140 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:52.140 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # jq length 00:25:52.399 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:25:52.399 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:25:52.399 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:25:52.399 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:25:52.399 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:52.399 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:25:52.399 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@10 -- # local bdev_list 00:25:52.399 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:52.399 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@11 -- # local nbd_list 00:25:52.399 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@12 -- # local i 00:25:52.399 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i 
= 0 )) 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:25:52.400 /dev/nbd0 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local i 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # break 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:52.400 1+0 records in 00:25:52.400 1+0 records out 00:25:52.400 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000187001 s, 21.9 MB/s 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@885 -- # size=4096 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # return 0 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:52.400 16:04:57 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:25:52.658 /dev/nbd1 00:25:52.658 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:25:52.658 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:25:52.658 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:25:52.658 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@868 -- # local i 00:25:52.658 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:25:52.658 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:25:52.659 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:25:52.917 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@872 -- # break 00:25:52.917 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:25:52.917 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:25:52.917 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:25:52.917 1+0 records in 00:25:52.917 1+0 records out 00:25:52.917 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250695 s, 16.3 MB/s 00:25:52.917 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:52.917 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@885 -- # size=4096 00:25:52.917 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:25:52.917 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:25:52.917 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@888 -- # return 0 00:25:52.917 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:25:52.918 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:25:52.918 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:25:52.918 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:25:52.918 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:25:52.918 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:25:52.918 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@50 -- # local nbd_list 00:25:52.918 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@51 -- # local i 00:25:52.918 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:52.918 16:04:58 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:25:53.177 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:25:53.177 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:25:53.177 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:25:53.177 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:53.177 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:53.177 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:25:53.177 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:25:53.177 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:25:53.177 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:25:53.177 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:25:53.467 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:25:53.467 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:25:53.467 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:25:53.467 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:25:53.467 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:25:53.467 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:25:53.467 16:04:58 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@41 -- # break 00:25:53.467 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/nbd_common.sh@45 -- # return 0 00:25:53.467 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:25:53.467 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:53.725 16:04:58 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:53.725 [2024-06-10 16:04:59.135431] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:53.725 [2024-06-10 16:04:59.135467] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:53.725 [2024-06-10 16:04:59.135486] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1044700 00:25:53.725 [2024-06-10 16:04:59.135495] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:53.725 [2024-06-10 16:04:59.137181] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:53.725 [2024-06-10 16:04:59.137208] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:53.725 [2024-06-10 16:04:59.137278] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:53.725 [2024-06-10 16:04:59.137302] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:53.725 [2024-06-10 16:04:59.137404] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:25:53.725 spare 00:25:53.725 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:25:53.725 16:04:59 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:53.725 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:53.725 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:53.725 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:53.725 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:25:53.725 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:53.725 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:53.725 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:53.725 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:53.725 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:53.725 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:53.984 [2024-06-10 16:04:59.237717] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1048500 00:25:53.984 [2024-06-10 16:04:59.237729] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:25:53.984 [2024-06-10 16:04:59.237932] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1043040 00:25:53.984 [2024-06-10 16:04:59.238088] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x1048500 00:25:53.984 [2024-06-10 16:04:59.238097] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1048500 00:25:53.984 [2024-06-10 16:04:59.238206] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:53.984 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:53.984 "name": "raid_bdev1", 00:25:53.984 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:25:53.984 "strip_size_kb": 0, 00:25:53.984 "state": "online", 00:25:53.984 "raid_level": "raid1", 00:25:53.984 "superblock": true, 00:25:53.984 "num_base_bdevs": 2, 00:25:53.984 "num_base_bdevs_discovered": 2, 00:25:53.984 "num_base_bdevs_operational": 2, 00:25:53.984 "base_bdevs_list": [ 00:25:53.984 { 00:25:53.984 "name": "spare", 00:25:53.984 "uuid": "94f2d8af-7594-5bac-8cdd-39723e02bac7", 00:25:53.984 "is_configured": true, 00:25:53.984 "data_offset": 256, 00:25:53.984 "data_size": 7936 00:25:53.984 }, 00:25:53.984 { 00:25:53.984 "name": "BaseBdev2", 00:25:53.984 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:25:53.984 "is_configured": true, 00:25:53.984 "data_offset": 256, 00:25:53.984 "data_size": 7936 00:25:53.984 } 00:25:53.984 ] 00:25:53.985 }' 00:25:53.985 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:53.985 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:54.551 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:25:54.551 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:54.551 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:25:54.551 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:25:54.551 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:54.551 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_get_bdevs all 00:25:54.551 16:04:59 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:54.810 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:54.810 "name": "raid_bdev1", 00:25:54.810 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:25:54.810 "strip_size_kb": 0, 00:25:54.810 "state": "online", 00:25:54.810 "raid_level": "raid1", 00:25:54.810 "superblock": true, 00:25:54.810 "num_base_bdevs": 2, 00:25:54.810 "num_base_bdevs_discovered": 2, 00:25:54.810 "num_base_bdevs_operational": 2, 00:25:54.810 "base_bdevs_list": [ 00:25:54.810 { 00:25:54.810 "name": "spare", 00:25:54.810 "uuid": "94f2d8af-7594-5bac-8cdd-39723e02bac7", 00:25:54.810 "is_configured": true, 00:25:54.810 "data_offset": 256, 00:25:54.810 "data_size": 7936 00:25:54.810 }, 00:25:54.810 { 00:25:54.810 "name": "BaseBdev2", 00:25:54.810 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:25:54.810 "is_configured": true, 00:25:54.810 "data_offset": 256, 00:25:54.810 "data_size": 7936 00:25:54.810 } 00:25:54.810 ] 00:25:54.810 }' 00:25:54.810 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:54.810 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:25:54.810 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:54.810 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:25:54.810 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:54.810 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:25:55.068 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@749 -- # [[ spare 
== \s\p\a\r\e ]] 00:25:55.068 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:25:55.327 [2024-06-10 16:05:00.796135] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:55.327 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:25:55.327 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:55.327 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:55.327 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:55.327 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:55.327 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:55.327 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:55.327 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:55.327 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:55.327 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:55.327 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:55.327 16:05:00 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:55.586 16:05:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:55.586 "name": "raid_bdev1", 00:25:55.586 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 
00:25:55.586 "strip_size_kb": 0, 00:25:55.586 "state": "online", 00:25:55.586 "raid_level": "raid1", 00:25:55.586 "superblock": true, 00:25:55.586 "num_base_bdevs": 2, 00:25:55.586 "num_base_bdevs_discovered": 1, 00:25:55.586 "num_base_bdevs_operational": 1, 00:25:55.586 "base_bdevs_list": [ 00:25:55.586 { 00:25:55.586 "name": null, 00:25:55.586 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:55.586 "is_configured": false, 00:25:55.586 "data_offset": 256, 00:25:55.586 "data_size": 7936 00:25:55.586 }, 00:25:55.586 { 00:25:55.586 "name": "BaseBdev2", 00:25:55.586 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:25:55.586 "is_configured": true, 00:25:55.586 "data_offset": 256, 00:25:55.586 "data_size": 7936 00:25:55.586 } 00:25:55.586 ] 00:25:55.586 }' 00:25:55.586 16:05:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:55.586 16:05:01 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:56.524 16:05:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:25:56.524 [2024-06-10 16:05:01.927188] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:56.524 [2024-06-10 16:05:01.927333] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:56.524 [2024-06-10 16:05:01.927347] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:25:56.524 [2024-06-10 16:05:01.927373] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:56.524 [2024-06-10 16:05:01.932051] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe8fb50 00:25:56.524 [2024-06-10 16:05:01.934283] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:56.524 16:05:01 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@755 -- # sleep 1 00:25:57.461 16:05:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:25:57.461 16:05:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:25:57.461 16:05:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:25:57.461 16:05:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:25:57.461 16:05:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:25:57.461 16:05:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:57.461 16:05:02 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:57.721 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:25:57.721 "name": "raid_bdev1", 00:25:57.721 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:25:57.721 "strip_size_kb": 0, 00:25:57.721 "state": "online", 00:25:57.721 "raid_level": "raid1", 00:25:57.721 "superblock": true, 00:25:57.721 "num_base_bdevs": 2, 00:25:57.721 "num_base_bdevs_discovered": 2, 00:25:57.721 "num_base_bdevs_operational": 2, 00:25:57.721 "process": { 00:25:57.721 "type": "rebuild", 00:25:57.721 "target": "spare", 00:25:57.721 "progress": { 00:25:57.721 "blocks": 2816, 
00:25:57.721 "percent": 35 00:25:57.721 } 00:25:57.721 }, 00:25:57.721 "base_bdevs_list": [ 00:25:57.721 { 00:25:57.721 "name": "spare", 00:25:57.721 "uuid": "94f2d8af-7594-5bac-8cdd-39723e02bac7", 00:25:57.721 "is_configured": true, 00:25:57.721 "data_offset": 256, 00:25:57.721 "data_size": 7936 00:25:57.721 }, 00:25:57.721 { 00:25:57.721 "name": "BaseBdev2", 00:25:57.721 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:25:57.721 "is_configured": true, 00:25:57.721 "data_offset": 256, 00:25:57.721 "data_size": 7936 00:25:57.721 } 00:25:57.721 ] 00:25:57.721 }' 00:25:57.721 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:25:57.721 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:25:57.721 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:25:57.980 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:25:57.980 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:25:57.980 [2024-06-10 16:05:03.464539] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:58.240 [2024-06-10 16:05:03.546564] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:25:58.240 [2024-06-10 16:05:03.546607] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:25:58.240 [2024-06-10 16:05:03.546621] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:25:58.240 [2024-06-10 16:05:03.546627] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:25:58.240 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online 
raid1 0 1 00:25:58.240 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:25:58.240 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:25:58.240 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:25:58.240 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:25:58.240 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:25:58.240 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:25:58.240 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:25:58.240 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:25:58.240 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:25:58.240 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:25:58.240 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:25:58.499 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:25:58.499 "name": "raid_bdev1", 00:25:58.499 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:25:58.499 "strip_size_kb": 0, 00:25:58.499 "state": "online", 00:25:58.499 "raid_level": "raid1", 00:25:58.499 "superblock": true, 00:25:58.499 "num_base_bdevs": 2, 00:25:58.499 "num_base_bdevs_discovered": 1, 00:25:58.499 "num_base_bdevs_operational": 1, 00:25:58.499 "base_bdevs_list": [ 00:25:58.499 { 00:25:58.499 "name": null, 00:25:58.499 "uuid": "00000000-0000-0000-0000-000000000000", 00:25:58.499 "is_configured": false, 00:25:58.499 "data_offset": 
256, 00:25:58.499 "data_size": 7936 00:25:58.499 }, 00:25:58.499 { 00:25:58.499 "name": "BaseBdev2", 00:25:58.499 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:25:58.499 "is_configured": true, 00:25:58.499 "data_offset": 256, 00:25:58.499 "data_size": 7936 00:25:58.499 } 00:25:58.499 ] 00:25:58.499 }' 00:25:58.499 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:25:58.499 16:05:03 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:25:59.067 16:05:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:25:59.326 [2024-06-10 16:05:04.685949] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:25:59.326 [2024-06-10 16:05:04.686000] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:25:59.326 [2024-06-10 16:05:04.686021] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1046b20 00:25:59.326 [2024-06-10 16:05:04.686030] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:25:59.326 [2024-06-10 16:05:04.686404] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:25:59.326 [2024-06-10 16:05:04.686420] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:25:59.326 [2024-06-10 16:05:04.686497] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:25:59.326 [2024-06-10 16:05:04.686507] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:25:59.326 [2024-06-10 16:05:04.686514] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:25:59.326 [2024-06-10 16:05:04.686530] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:25:59.326 [2024-06-10 16:05:04.691201] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1044be0 00:25:59.326 spare 00:25:59.326 [2024-06-10 16:05:04.692713] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:25:59.326 16:05:04 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:00.265 16:05:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:00.265 16:05:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:00.265 16:05:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:00.265 16:05:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:00.265 16:05:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:00.265 16:05:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:00.265 16:05:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:00.554 16:05:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:00.554 "name": "raid_bdev1", 00:26:00.554 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:26:00.554 "strip_size_kb": 0, 00:26:00.554 "state": "online", 00:26:00.554 "raid_level": "raid1", 00:26:00.554 "superblock": true, 00:26:00.554 "num_base_bdevs": 2, 00:26:00.554 "num_base_bdevs_discovered": 2, 00:26:00.554 "num_base_bdevs_operational": 2, 00:26:00.554 "process": { 00:26:00.554 "type": "rebuild", 00:26:00.554 "target": "spare", 00:26:00.554 "progress": { 00:26:00.554 
"blocks": 3072, 00:26:00.554 "percent": 38 00:26:00.554 } 00:26:00.554 }, 00:26:00.554 "base_bdevs_list": [ 00:26:00.554 { 00:26:00.554 "name": "spare", 00:26:00.554 "uuid": "94f2d8af-7594-5bac-8cdd-39723e02bac7", 00:26:00.554 "is_configured": true, 00:26:00.554 "data_offset": 256, 00:26:00.554 "data_size": 7936 00:26:00.554 }, 00:26:00.554 { 00:26:00.554 "name": "BaseBdev2", 00:26:00.554 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:26:00.554 "is_configured": true, 00:26:00.554 "data_offset": 256, 00:26:00.554 "data_size": 7936 00:26:00.554 } 00:26:00.554 ] 00:26:00.554 }' 00:26:00.554 16:05:05 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:00.554 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:00.554 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:00.813 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:00.813 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:00.813 [2024-06-10 16:05:06.295847] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:00.813 [2024-06-10 16:05:06.305163] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:00.813 [2024-06-10 16:05:06.305206] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:00.813 [2024-06-10 16:05:06.305220] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:00.813 [2024-06-10 16:05:06.305226] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:01.070 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state 
raid_bdev1 online raid1 0 1 00:26:01.070 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:01.070 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:01.070 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:01.070 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:01.070 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:01.070 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:01.070 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:01.070 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:01.070 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:01.070 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.070 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:01.329 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:01.329 "name": "raid_bdev1", 00:26:01.329 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:26:01.329 "strip_size_kb": 0, 00:26:01.329 "state": "online", 00:26:01.329 "raid_level": "raid1", 00:26:01.329 "superblock": true, 00:26:01.329 "num_base_bdevs": 2, 00:26:01.329 "num_base_bdevs_discovered": 1, 00:26:01.329 "num_base_bdevs_operational": 1, 00:26:01.329 "base_bdevs_list": [ 00:26:01.329 { 00:26:01.329 "name": null, 00:26:01.329 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:01.329 "is_configured": false, 00:26:01.329 
"data_offset": 256, 00:26:01.329 "data_size": 7936 00:26:01.329 }, 00:26:01.329 { 00:26:01.329 "name": "BaseBdev2", 00:26:01.329 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:26:01.329 "is_configured": true, 00:26:01.329 "data_offset": 256, 00:26:01.329 "data_size": 7936 00:26:01.329 } 00:26:01.329 ] 00:26:01.329 }' 00:26:01.329 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:01.329 16:05:06 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:01.897 16:05:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:01.897 16:05:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:01.897 16:05:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:01.897 16:05:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:01.897 16:05:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:01.897 16:05:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:01.897 16:05:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:01.897 16:05:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:01.897 "name": "raid_bdev1", 00:26:01.897 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:26:01.897 "strip_size_kb": 0, 00:26:01.897 "state": "online", 00:26:01.897 "raid_level": "raid1", 00:26:01.897 "superblock": true, 00:26:01.897 "num_base_bdevs": 2, 00:26:01.897 "num_base_bdevs_discovered": 1, 00:26:01.897 "num_base_bdevs_operational": 1, 00:26:01.897 "base_bdevs_list": [ 00:26:01.897 { 00:26:01.897 "name": null, 00:26:01.897 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:26:01.897 "is_configured": false, 00:26:01.897 "data_offset": 256, 00:26:01.897 "data_size": 7936 00:26:01.897 }, 00:26:01.897 { 00:26:01.897 "name": "BaseBdev2", 00:26:01.897 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:26:01.897 "is_configured": true, 00:26:01.897 "data_offset": 256, 00:26:01.897 "data_size": 7936 00:26:01.897 } 00:26:01.897 ] 00:26:01.897 }' 00:26:01.897 16:05:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:02.156 16:05:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:02.156 16:05:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:02.156 16:05:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:02.156 16:05:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:26:02.156 16:05:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:02.415 [2024-06-10 16:05:07.897851] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:02.415 [2024-06-10 16:05:07.897902] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:02.415 [2024-06-10 16:05:07.897921] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe8f710 00:26:02.415 [2024-06-10 16:05:07.897931] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:02.415 [2024-06-10 16:05:07.898279] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:02.415 [2024-06-10 16:05:07.898294] vbdev_passthru.c: 709:vbdev_passthru_register: 
*NOTICE*: created pt_bdev for: BaseBdev1 00:26:02.415 [2024-06-10 16:05:07.898353] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:26:02.415 [2024-06-10 16:05:07.898364] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:02.415 [2024-06-10 16:05:07.898371] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:26:02.415 BaseBdev1 00:26:02.415 16:05:07 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@773 -- # sleep 1 00:26:03.794 16:05:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:03.794 16:05:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:03.794 16:05:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:03.794 16:05:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:03.794 16:05:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:03.794 16:05:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:03.794 16:05:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:03.794 16:05:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:03.794 16:05:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:03.794 16:05:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:03.794 16:05:08 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:03.794 16:05:08 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:03.794 16:05:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:03.794 "name": "raid_bdev1", 00:26:03.794 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:26:03.794 "strip_size_kb": 0, 00:26:03.794 "state": "online", 00:26:03.794 "raid_level": "raid1", 00:26:03.794 "superblock": true, 00:26:03.794 "num_base_bdevs": 2, 00:26:03.794 "num_base_bdevs_discovered": 1, 00:26:03.794 "num_base_bdevs_operational": 1, 00:26:03.794 "base_bdevs_list": [ 00:26:03.794 { 00:26:03.794 "name": null, 00:26:03.794 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:03.794 "is_configured": false, 00:26:03.794 "data_offset": 256, 00:26:03.794 "data_size": 7936 00:26:03.794 }, 00:26:03.794 { 00:26:03.794 "name": "BaseBdev2", 00:26:03.794 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:26:03.794 "is_configured": true, 00:26:03.794 "data_offset": 256, 00:26:03.794 "data_size": 7936 00:26:03.794 } 00:26:03.794 ] 00:26:03.794 }' 00:26:03.794 16:05:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:03.794 16:05:09 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:04.362 16:05:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:04.362 16:05:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:04.362 16:05:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:04.362 16:05:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:04.362 16:05:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:04.362 16:05:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
-s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:04.362 16:05:09 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:04.621 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:04.621 "name": "raid_bdev1", 00:26:04.621 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:26:04.621 "strip_size_kb": 0, 00:26:04.621 "state": "online", 00:26:04.621 "raid_level": "raid1", 00:26:04.621 "superblock": true, 00:26:04.621 "num_base_bdevs": 2, 00:26:04.621 "num_base_bdevs_discovered": 1, 00:26:04.621 "num_base_bdevs_operational": 1, 00:26:04.621 "base_bdevs_list": [ 00:26:04.621 { 00:26:04.621 "name": null, 00:26:04.621 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:04.621 "is_configured": false, 00:26:04.621 "data_offset": 256, 00:26:04.621 "data_size": 7936 00:26:04.621 }, 00:26:04.621 { 00:26:04.621 "name": "BaseBdev2", 00:26:04.621 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:26:04.621 "is_configured": true, 00:26:04.621 "data_offset": 256, 00:26:04.621 "data_size": 7936 00:26:04.621 } 00:26:04.621 ] 00:26:04.621 }' 00:26:04.621 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:04.621 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:04.621 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:04.879 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:04.880 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:04.880 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@649 -- # local es=0 00:26:04.880 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- 
common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:04.880 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:04.880 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:26:04.880 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:04.880 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:26:04.880 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:04.880 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:26:04.880 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:04.880 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:04.880 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:26:05.139 [2024-06-10 16:05:10.412754] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:05.139 [2024-06-10 16:05:10.412871] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:26:05.139 [2024-06-10 16:05:10.412884] bdev_raid.c:3581:raid_bdev_examine_sb: 
*DEBUG*: raid superblock does not contain this bdev's uuid 00:26:05.139 request: 00:26:05.139 { 00:26:05.139 "raid_bdev": "raid_bdev1", 00:26:05.139 "base_bdev": "BaseBdev1", 00:26:05.139 "method": "bdev_raid_add_base_bdev", 00:26:05.139 "req_id": 1 00:26:05.139 } 00:26:05.139 Got JSON-RPC error response 00:26:05.139 response: 00:26:05.139 { 00:26:05.139 "code": -22, 00:26:05.139 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:26:05.139 } 00:26:05.139 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@652 -- # es=1 00:26:05.139 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:26:05.139 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:26:05.139 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:26:05.139 16:05:10 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@777 -- # sleep 1 00:26:06.076 16:05:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:06.076 16:05:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:06.076 16:05:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:06.076 16:05:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:06.076 16:05:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:06.076 16:05:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:06.076 16:05:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:06.076 16:05:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:06.076 16:05:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:26:06.076 16:05:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:06.076 16:05:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:06.076 16:05:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:06.336 16:05:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:06.336 "name": "raid_bdev1", 00:26:06.336 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:26:06.336 "strip_size_kb": 0, 00:26:06.336 "state": "online", 00:26:06.336 "raid_level": "raid1", 00:26:06.336 "superblock": true, 00:26:06.336 "num_base_bdevs": 2, 00:26:06.336 "num_base_bdevs_discovered": 1, 00:26:06.336 "num_base_bdevs_operational": 1, 00:26:06.336 "base_bdevs_list": [ 00:26:06.336 { 00:26:06.336 "name": null, 00:26:06.336 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:06.336 "is_configured": false, 00:26:06.336 "data_offset": 256, 00:26:06.336 "data_size": 7936 00:26:06.336 }, 00:26:06.336 { 00:26:06.336 "name": "BaseBdev2", 00:26:06.336 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:26:06.336 "is_configured": true, 00:26:06.336 "data_offset": 256, 00:26:06.336 "data_size": 7936 00:26:06.336 } 00:26:06.336 ] 00:26:06.336 }' 00:26:06.336 16:05:11 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:06.336 16:05:11 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:06.904 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:06.904 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:06.904 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:06.904 
16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:06.904 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:06.904 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:06.904 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:07.163 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:07.163 "name": "raid_bdev1", 00:26:07.163 "uuid": "b114e9ee-53b1-4c25-81a9-d9acd572f637", 00:26:07.163 "strip_size_kb": 0, 00:26:07.163 "state": "online", 00:26:07.163 "raid_level": "raid1", 00:26:07.163 "superblock": true, 00:26:07.163 "num_base_bdevs": 2, 00:26:07.163 "num_base_bdevs_discovered": 1, 00:26:07.163 "num_base_bdevs_operational": 1, 00:26:07.163 "base_bdevs_list": [ 00:26:07.163 { 00:26:07.163 "name": null, 00:26:07.163 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:07.163 "is_configured": false, 00:26:07.163 "data_offset": 256, 00:26:07.163 "data_size": 7936 00:26:07.163 }, 00:26:07.163 { 00:26:07.163 "name": "BaseBdev2", 00:26:07.163 "uuid": "eeec971e-939f-5d82-b833-305bcb4a8267", 00:26:07.163 "is_configured": true, 00:26:07.163 "data_offset": 256, 00:26:07.163 "data_size": 7936 00:26:07.163 } 00:26:07.163 ] 00:26:07.163 }' 00:26:07.163 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:07.163 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:07.163 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:07.163 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:07.163 16:05:12 
bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@782 -- # killprocess 2806949 00:26:07.163 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@949 -- # '[' -z 2806949 ']' 00:26:07.163 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@953 -- # kill -0 2806949 00:26:07.163 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # uname 00:26:07.163 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:07.163 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2806949 00:26:07.163 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:26:07.163 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:26:07.163 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2806949' 00:26:07.163 killing process with pid 2806949 00:26:07.163 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@968 -- # kill 2806949 00:26:07.163 Received shutdown signal, test time was about 60.000000 seconds 00:26:07.163 00:26:07.163 Latency(us) 00:26:07.163 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:26:07.163 =================================================================================================================== 00:26:07.163 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:26:07.163 [2024-06-10 16:05:12.625498] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:07.163 [2024-06-10 16:05:12.625587] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:07.163 [2024-06-10 16:05:12.625631] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:07.163 [2024-06-10 16:05:12.625640] 
bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1048500 name raid_bdev1, state offline 00:26:07.163 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@973 -- # wait 2806949 00:26:07.163 [2024-06-10 16:05:12.650502] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:07.422 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- bdev/bdev_raid.sh@784 -- # return 0 00:26:07.422 00:26:07.422 real 0m31.474s 00:26:07.422 user 0m50.244s 00:26:07.422 sys 0m4.087s 00:26:07.422 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:07.422 16:05:12 bdev_raid.raid_rebuild_test_sb_4k -- common/autotest_common.sh@10 -- # set +x 00:26:07.422 ************************************ 00:26:07.422 END TEST raid_rebuild_test_sb_4k 00:26:07.422 ************************************ 00:26:07.422 16:05:12 bdev_raid -- bdev/bdev_raid.sh@904 -- # base_malloc_params='-m 32' 00:26:07.422 16:05:12 bdev_raid -- bdev/bdev_raid.sh@905 -- # run_test raid_state_function_test_sb_md_separate raid_state_function_test raid1 2 true 00:26:07.422 16:05:12 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:26:07.422 16:05:12 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:07.422 16:05:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:07.422 ************************************ 00:26:07.422 START TEST raid_state_function_test_sb_md_separate 00:26:07.422 ************************************ 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@222 -- # local superblock=true 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@226 -- # local strip_size 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:26:07.422 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@230 -- # 
'[' raid1 '!=' raid1 ']' 00:26:07.423 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:26:07.423 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:26:07.423 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:26:07.423 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@244 -- # raid_pid=2812398 00:26:07.423 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2812398' 00:26:07.423 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:26:07.423 Process raid pid: 2812398 00:26:07.423 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@246 -- # waitforlisten 2812398 /var/tmp/spdk-raid.sock 00:26:07.423 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@830 -- # '[' -z 2812398 ']' 00:26:07.423 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:07.423 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@835 -- # local max_retries=100 00:26:07.423 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:07.423 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 
00:26:07.423 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:07.423 16:05:12 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:07.681 [2024-06-10 16:05:12.983388] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:26:07.681 [2024-06-10 16:05:12.983442] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:26:07.681 [2024-06-10 16:05:13.082709] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:07.681 [2024-06-10 16:05:13.177831] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:26:07.940 [2024-06-10 16:05:13.238468] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:07.940 [2024-06-10 16:05:13.238498] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:08.507 16:05:13 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:08.507 16:05:13 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@863 -- # return 0 00:26:08.507 16:05:13 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:08.766 [2024-06-10 16:05:14.169550] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:08.766 [2024-06-10 16:05:14.169588] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:26:08.766 [2024-06-10 16:05:14.169597] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:08.766 [2024-06-10 16:05:14.169606] 
bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:08.766 16:05:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:08.766 16:05:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:08.766 16:05:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:08.766 16:05:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:08.766 16:05:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:08.766 16:05:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:08.766 16:05:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:08.766 16:05:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:08.766 16:05:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:08.766 16:05:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:08.766 16:05:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:08.766 16:05:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:09.025 16:05:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:09.025 "name": "Existed_Raid", 00:26:09.025 "uuid": "c5d78341-737f-4c16-bd67-9f8b9343a70c", 00:26:09.025 
"strip_size_kb": 0, 00:26:09.025 "state": "configuring", 00:26:09.025 "raid_level": "raid1", 00:26:09.025 "superblock": true, 00:26:09.025 "num_base_bdevs": 2, 00:26:09.025 "num_base_bdevs_discovered": 0, 00:26:09.025 "num_base_bdevs_operational": 2, 00:26:09.025 "base_bdevs_list": [ 00:26:09.025 { 00:26:09.025 "name": "BaseBdev1", 00:26:09.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:09.025 "is_configured": false, 00:26:09.025 "data_offset": 0, 00:26:09.025 "data_size": 0 00:26:09.025 }, 00:26:09.025 { 00:26:09.025 "name": "BaseBdev2", 00:26:09.025 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:09.025 "is_configured": false, 00:26:09.025 "data_offset": 0, 00:26:09.025 "data_size": 0 00:26:09.025 } 00:26:09.025 ] 00:26:09.025 }' 00:26:09.025 16:05:14 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:09.025 16:05:14 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:09.594 16:05:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:09.853 [2024-06-10 16:05:15.300419] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:09.853 [2024-06-10 16:05:15.300446] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x193c120 name Existed_Raid, state configuring 00:26:09.853 16:05:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:10.111 [2024-06-10 16:05:15.557110] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:26:10.111 [2024-06-10 16:05:15.557136] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't 
exist now 00:26:10.111 [2024-06-10 16:05:15.557144] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:10.111 [2024-06-10 16:05:15.557152] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:10.111 16:05:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1 00:26:10.369 [2024-06-10 16:05:15.812036] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:10.369 BaseBdev1 00:26:10.369 16:05:15 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:26:10.369 16:05:15 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:26:10.369 16:05:15 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:10.369 16:05:15 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local i 00:26:10.369 16:05:15 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:10.369 16:05:15 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:10.369 16:05:15 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:10.627 16:05:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:26:10.884 [ 00:26:10.884 { 00:26:10.884 "name": "BaseBdev1", 00:26:10.884 "aliases": [ 00:26:10.884 
"3a3c765b-cf2b-46d5-b61c-e89fb7e89b33" 00:26:10.884 ], 00:26:10.884 "product_name": "Malloc disk", 00:26:10.884 "block_size": 4096, 00:26:10.884 "num_blocks": 8192, 00:26:10.884 "uuid": "3a3c765b-cf2b-46d5-b61c-e89fb7e89b33", 00:26:10.884 "md_size": 32, 00:26:10.884 "md_interleave": false, 00:26:10.884 "dif_type": 0, 00:26:10.884 "assigned_rate_limits": { 00:26:10.884 "rw_ios_per_sec": 0, 00:26:10.884 "rw_mbytes_per_sec": 0, 00:26:10.884 "r_mbytes_per_sec": 0, 00:26:10.884 "w_mbytes_per_sec": 0 00:26:10.884 }, 00:26:10.884 "claimed": true, 00:26:10.884 "claim_type": "exclusive_write", 00:26:10.884 "zoned": false, 00:26:10.884 "supported_io_types": { 00:26:10.884 "read": true, 00:26:10.884 "write": true, 00:26:10.884 "unmap": true, 00:26:10.884 "write_zeroes": true, 00:26:10.884 "flush": true, 00:26:10.884 "reset": true, 00:26:10.884 "compare": false, 00:26:10.884 "compare_and_write": false, 00:26:10.884 "abort": true, 00:26:10.884 "nvme_admin": false, 00:26:10.884 "nvme_io": false 00:26:10.884 }, 00:26:10.884 "memory_domains": [ 00:26:10.884 { 00:26:10.884 "dma_device_id": "system", 00:26:10.884 "dma_device_type": 1 00:26:10.884 }, 00:26:10.884 { 00:26:10.884 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:10.884 "dma_device_type": 2 00:26:10.884 } 00:26:10.884 ], 00:26:10.884 "driver_specific": {} 00:26:10.884 } 00:26:10.884 ] 00:26:10.884 16:05:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # return 0 00:26:10.884 16:05:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:10.885 16:05:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:10.885 16:05:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:10.885 16:05:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- 
# local raid_level=raid1 00:26:10.885 16:05:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:10.885 16:05:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:10.885 16:05:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:10.885 16:05:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:10.885 16:05:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:10.885 16:05:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:10.885 16:05:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:10.885 16:05:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:11.143 16:05:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:11.143 "name": "Existed_Raid", 00:26:11.143 "uuid": "38e7aa4b-e645-447c-9bac-654b582492c8", 00:26:11.143 "strip_size_kb": 0, 00:26:11.143 "state": "configuring", 00:26:11.143 "raid_level": "raid1", 00:26:11.143 "superblock": true, 00:26:11.143 "num_base_bdevs": 2, 00:26:11.143 "num_base_bdevs_discovered": 1, 00:26:11.143 "num_base_bdevs_operational": 2, 00:26:11.143 "base_bdevs_list": [ 00:26:11.143 { 00:26:11.143 "name": "BaseBdev1", 00:26:11.143 "uuid": "3a3c765b-cf2b-46d5-b61c-e89fb7e89b33", 00:26:11.143 "is_configured": true, 00:26:11.143 "data_offset": 256, 00:26:11.143 "data_size": 7936 00:26:11.143 }, 00:26:11.143 { 00:26:11.143 "name": "BaseBdev2", 00:26:11.143 "uuid": "00000000-0000-0000-0000-000000000000", 
00:26:11.143 "is_configured": false, 00:26:11.143 "data_offset": 0, 00:26:11.143 "data_size": 0 00:26:11.143 } 00:26:11.143 ] 00:26:11.143 }' 00:26:11.143 16:05:16 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:11.143 16:05:16 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:12.079 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:26:12.079 [2024-06-10 16:05:17.456493] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:26:12.079 [2024-06-10 16:05:17.456530] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x193b9f0 name Existed_Raid, state configuring 00:26:12.079 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:26:12.338 [2024-06-10 16:05:17.713200] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:12.338 [2024-06-10 16:05:17.714724] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:26:12.338 [2024-06-10 16:05:17.714755] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:26:12.338 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:26:12.338 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:12.338 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:26:12.338 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:12.338 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:12.338 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:12.338 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:12.338 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:12.338 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:12.338 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:12.338 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:12.338 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:12.338 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:12.338 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:12.597 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:12.597 "name": "Existed_Raid", 00:26:12.597 "uuid": "ea49c2ef-4986-4ccc-afbf-e75c98dcf924", 00:26:12.597 "strip_size_kb": 0, 00:26:12.597 "state": "configuring", 00:26:12.597 "raid_level": "raid1", 00:26:12.597 "superblock": true, 00:26:12.597 "num_base_bdevs": 2, 00:26:12.597 "num_base_bdevs_discovered": 1, 00:26:12.597 "num_base_bdevs_operational": 2, 00:26:12.597 "base_bdevs_list": [ 00:26:12.597 { 00:26:12.597 "name": "BaseBdev1", 
00:26:12.597 "uuid": "3a3c765b-cf2b-46d5-b61c-e89fb7e89b33", 00:26:12.597 "is_configured": true, 00:26:12.597 "data_offset": 256, 00:26:12.597 "data_size": 7936 00:26:12.597 }, 00:26:12.597 { 00:26:12.597 "name": "BaseBdev2", 00:26:12.597 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:12.597 "is_configured": false, 00:26:12.597 "data_offset": 0, 00:26:12.597 "data_size": 0 00:26:12.597 } 00:26:12.597 ] 00:26:12.597 }' 00:26:12.597 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:12.597 16:05:17 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:13.164 16:05:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2 00:26:13.422 [2024-06-10 16:05:18.832060] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:13.422 [2024-06-10 16:05:18.832198] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x193d980 00:26:13.422 [2024-06-10 16:05:18.832209] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:13.422 [2024-06-10 16:05:18.832271] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x193d3c0 00:26:13.422 [2024-06-10 16:05:18.832370] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x193d980 00:26:13.422 [2024-06-10 16:05:18.832379] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x193d980 00:26:13.422 [2024-06-10 16:05:18.832446] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:13.422 BaseBdev2 00:26:13.422 16:05:18 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:26:13.422 16:05:18 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:26:13.422 16:05:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:26:13.422 16:05:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@900 -- # local i 00:26:13.422 16:05:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:26:13.422 16:05:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:26:13.422 16:05:18 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:26:13.680 16:05:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:26:13.939 [ 00:26:13.939 { 00:26:13.939 "name": "BaseBdev2", 00:26:13.939 "aliases": [ 00:26:13.939 "a7094139-47b2-4ee8-ab6c-42a4a21ad9bd" 00:26:13.939 ], 00:26:13.939 "product_name": "Malloc disk", 00:26:13.939 "block_size": 4096, 00:26:13.939 "num_blocks": 8192, 00:26:13.939 "uuid": "a7094139-47b2-4ee8-ab6c-42a4a21ad9bd", 00:26:13.939 "md_size": 32, 00:26:13.939 "md_interleave": false, 00:26:13.939 "dif_type": 0, 00:26:13.939 "assigned_rate_limits": { 00:26:13.939 "rw_ios_per_sec": 0, 00:26:13.939 "rw_mbytes_per_sec": 0, 00:26:13.939 "r_mbytes_per_sec": 0, 00:26:13.939 "w_mbytes_per_sec": 0 00:26:13.939 }, 00:26:13.939 "claimed": true, 00:26:13.939 "claim_type": "exclusive_write", 00:26:13.939 "zoned": false, 00:26:13.939 "supported_io_types": { 00:26:13.939 "read": true, 00:26:13.939 "write": true, 00:26:13.939 "unmap": true, 00:26:13.939 "write_zeroes": true, 00:26:13.939 "flush": true, 00:26:13.939 "reset": 
true, 00:26:13.939 "compare": false, 00:26:13.939 "compare_and_write": false, 00:26:13.939 "abort": true, 00:26:13.939 "nvme_admin": false, 00:26:13.939 "nvme_io": false 00:26:13.939 }, 00:26:13.939 "memory_domains": [ 00:26:13.939 { 00:26:13.939 "dma_device_id": "system", 00:26:13.939 "dma_device_type": 1 00:26:13.939 }, 00:26:13.939 { 00:26:13.939 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:13.939 "dma_device_type": 2 00:26:13.939 } 00:26:13.939 ], 00:26:13.939 "driver_specific": {} 00:26:13.939 } 00:26:13.939 ] 00:26:13.939 16:05:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@906 -- # return 0 00:26:13.939 16:05:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:26:13.939 16:05:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:26:13.939 16:05:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:26:13.939 16:05:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:26:13.939 16:05:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:13.939 16:05:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:13.939 16:05:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:13.939 16:05:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:13.939 16:05:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:13.939 16:05:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:13.939 16:05:19 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:13.939 16:05:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:13.939 16:05:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:13.939 16:05:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:14.197 16:05:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:14.197 "name": "Existed_Raid", 00:26:14.197 "uuid": "ea49c2ef-4986-4ccc-afbf-e75c98dcf924", 00:26:14.197 "strip_size_kb": 0, 00:26:14.197 "state": "online", 00:26:14.197 "raid_level": "raid1", 00:26:14.197 "superblock": true, 00:26:14.197 "num_base_bdevs": 2, 00:26:14.198 "num_base_bdevs_discovered": 2, 00:26:14.198 "num_base_bdevs_operational": 2, 00:26:14.198 "base_bdevs_list": [ 00:26:14.198 { 00:26:14.198 "name": "BaseBdev1", 00:26:14.198 "uuid": "3a3c765b-cf2b-46d5-b61c-e89fb7e89b33", 00:26:14.198 "is_configured": true, 00:26:14.198 "data_offset": 256, 00:26:14.198 "data_size": 7936 00:26:14.198 }, 00:26:14.198 { 00:26:14.198 "name": "BaseBdev2", 00:26:14.198 "uuid": "a7094139-47b2-4ee8-ab6c-42a4a21ad9bd", 00:26:14.198 "is_configured": true, 00:26:14.198 "data_offset": 256, 00:26:14.198 "data_size": 7936 00:26:14.198 } 00:26:14.198 ] 00:26:14.198 }' 00:26:14.198 16:05:19 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:14.198 16:05:19 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:14.765 16:05:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:26:14.765 16:05:20 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:26:14.765 16:05:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:14.765 16:05:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:14.765 16:05:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:14.765 16:05:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:26:14.765 16:05:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:26:14.765 16:05:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:15.026 [2024-06-10 16:05:20.480757] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:15.026 16:05:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:15.026 "name": "Existed_Raid", 00:26:15.026 "aliases": [ 00:26:15.026 "ea49c2ef-4986-4ccc-afbf-e75c98dcf924" 00:26:15.026 ], 00:26:15.026 "product_name": "Raid Volume", 00:26:15.026 "block_size": 4096, 00:26:15.026 "num_blocks": 7936, 00:26:15.026 "uuid": "ea49c2ef-4986-4ccc-afbf-e75c98dcf924", 00:26:15.026 "md_size": 32, 00:26:15.026 "md_interleave": false, 00:26:15.026 "dif_type": 0, 00:26:15.026 "assigned_rate_limits": { 00:26:15.026 "rw_ios_per_sec": 0, 00:26:15.026 "rw_mbytes_per_sec": 0, 00:26:15.026 "r_mbytes_per_sec": 0, 00:26:15.026 "w_mbytes_per_sec": 0 00:26:15.026 }, 00:26:15.026 "claimed": false, 00:26:15.026 "zoned": false, 00:26:15.026 "supported_io_types": { 00:26:15.026 "read": true, 00:26:15.026 "write": true, 00:26:15.026 "unmap": false, 00:26:15.026 "write_zeroes": true, 00:26:15.026 
"flush": false, 00:26:15.026 "reset": true, 00:26:15.026 "compare": false, 00:26:15.026 "compare_and_write": false, 00:26:15.026 "abort": false, 00:26:15.026 "nvme_admin": false, 00:26:15.026 "nvme_io": false 00:26:15.026 }, 00:26:15.026 "memory_domains": [ 00:26:15.026 { 00:26:15.026 "dma_device_id": "system", 00:26:15.026 "dma_device_type": 1 00:26:15.026 }, 00:26:15.026 { 00:26:15.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:15.026 "dma_device_type": 2 00:26:15.026 }, 00:26:15.026 { 00:26:15.026 "dma_device_id": "system", 00:26:15.026 "dma_device_type": 1 00:26:15.026 }, 00:26:15.026 { 00:26:15.026 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:15.026 "dma_device_type": 2 00:26:15.026 } 00:26:15.026 ], 00:26:15.026 "driver_specific": { 00:26:15.026 "raid": { 00:26:15.026 "uuid": "ea49c2ef-4986-4ccc-afbf-e75c98dcf924", 00:26:15.026 "strip_size_kb": 0, 00:26:15.026 "state": "online", 00:26:15.026 "raid_level": "raid1", 00:26:15.026 "superblock": true, 00:26:15.026 "num_base_bdevs": 2, 00:26:15.026 "num_base_bdevs_discovered": 2, 00:26:15.026 "num_base_bdevs_operational": 2, 00:26:15.026 "base_bdevs_list": [ 00:26:15.026 { 00:26:15.026 "name": "BaseBdev1", 00:26:15.026 "uuid": "3a3c765b-cf2b-46d5-b61c-e89fb7e89b33", 00:26:15.026 "is_configured": true, 00:26:15.026 "data_offset": 256, 00:26:15.026 "data_size": 7936 00:26:15.026 }, 00:26:15.026 { 00:26:15.026 "name": "BaseBdev2", 00:26:15.026 "uuid": "a7094139-47b2-4ee8-ab6c-42a4a21ad9bd", 00:26:15.026 "is_configured": true, 00:26:15.026 "data_offset": 256, 00:26:15.026 "data_size": 7936 00:26:15.026 } 00:26:15.026 ] 00:26:15.026 } 00:26:15.026 } 00:26:15.026 }' 00:26:15.026 16:05:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:15.356 16:05:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:26:15.356 BaseBdev2' 00:26:15.356 
16:05:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:15.356 16:05:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:26:15.356 16:05:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:15.356 16:05:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:15.356 "name": "BaseBdev1", 00:26:15.356 "aliases": [ 00:26:15.356 "3a3c765b-cf2b-46d5-b61c-e89fb7e89b33" 00:26:15.356 ], 00:26:15.356 "product_name": "Malloc disk", 00:26:15.356 "block_size": 4096, 00:26:15.356 "num_blocks": 8192, 00:26:15.356 "uuid": "3a3c765b-cf2b-46d5-b61c-e89fb7e89b33", 00:26:15.356 "md_size": 32, 00:26:15.356 "md_interleave": false, 00:26:15.356 "dif_type": 0, 00:26:15.356 "assigned_rate_limits": { 00:26:15.356 "rw_ios_per_sec": 0, 00:26:15.356 "rw_mbytes_per_sec": 0, 00:26:15.356 "r_mbytes_per_sec": 0, 00:26:15.356 "w_mbytes_per_sec": 0 00:26:15.356 }, 00:26:15.356 "claimed": true, 00:26:15.356 "claim_type": "exclusive_write", 00:26:15.356 "zoned": false, 00:26:15.356 "supported_io_types": { 00:26:15.356 "read": true, 00:26:15.356 "write": true, 00:26:15.356 "unmap": true, 00:26:15.356 "write_zeroes": true, 00:26:15.356 "flush": true, 00:26:15.356 "reset": true, 00:26:15.356 "compare": false, 00:26:15.356 "compare_and_write": false, 00:26:15.356 "abort": true, 00:26:15.356 "nvme_admin": false, 00:26:15.356 "nvme_io": false 00:26:15.356 }, 00:26:15.356 "memory_domains": [ 00:26:15.356 { 00:26:15.356 "dma_device_id": "system", 00:26:15.356 "dma_device_type": 1 00:26:15.356 }, 00:26:15.356 { 00:26:15.356 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:15.356 "dma_device_type": 2 00:26:15.356 } 00:26:15.356 ], 00:26:15.356 "driver_specific": {} 00:26:15.356 }' 00:26:15.356 16:05:20 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:15.356 16:05:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:15.615 16:05:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:15.615 16:05:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:15.615 16:05:20 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:15.615 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:15.615 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:15.615 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:15.615 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:26:15.615 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:15.873 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:15.873 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:15.873 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:15.873 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:15.873 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:26:16.131 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:16.131 "name": 
"BaseBdev2", 00:26:16.131 "aliases": [ 00:26:16.131 "a7094139-47b2-4ee8-ab6c-42a4a21ad9bd" 00:26:16.131 ], 00:26:16.131 "product_name": "Malloc disk", 00:26:16.131 "block_size": 4096, 00:26:16.131 "num_blocks": 8192, 00:26:16.131 "uuid": "a7094139-47b2-4ee8-ab6c-42a4a21ad9bd", 00:26:16.131 "md_size": 32, 00:26:16.131 "md_interleave": false, 00:26:16.131 "dif_type": 0, 00:26:16.131 "assigned_rate_limits": { 00:26:16.131 "rw_ios_per_sec": 0, 00:26:16.131 "rw_mbytes_per_sec": 0, 00:26:16.131 "r_mbytes_per_sec": 0, 00:26:16.131 "w_mbytes_per_sec": 0 00:26:16.131 }, 00:26:16.131 "claimed": true, 00:26:16.131 "claim_type": "exclusive_write", 00:26:16.131 "zoned": false, 00:26:16.131 "supported_io_types": { 00:26:16.131 "read": true, 00:26:16.131 "write": true, 00:26:16.131 "unmap": true, 00:26:16.131 "write_zeroes": true, 00:26:16.131 "flush": true, 00:26:16.131 "reset": true, 00:26:16.131 "compare": false, 00:26:16.131 "compare_and_write": false, 00:26:16.131 "abort": true, 00:26:16.131 "nvme_admin": false, 00:26:16.131 "nvme_io": false 00:26:16.131 }, 00:26:16.131 "memory_domains": [ 00:26:16.131 { 00:26:16.131 "dma_device_id": "system", 00:26:16.131 "dma_device_type": 1 00:26:16.131 }, 00:26:16.131 { 00:26:16.131 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:16.131 "dma_device_type": 2 00:26:16.131 } 00:26:16.131 ], 00:26:16.131 "driver_specific": {} 00:26:16.131 }' 00:26:16.131 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:16.131 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:16.131 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:16.131 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:16.131 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:16.131 16:05:21 
bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:16.131 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:16.390 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:16.390 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:26:16.390 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:16.390 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:16.390 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:16.390 16:05:21 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:26:16.648 [2024-06-10 16:05:22.028686] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:16.648 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@275 -- # local expected_state 00:26:16.648 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:26:16.648 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:16.648 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:26:16.648 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:26:16.648 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:26:16.648 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local 
raid_bdev_name=Existed_Raid 00:26:16.648 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:16.648 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:16.648 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:16.648 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:16.648 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:16.648 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:16.648 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:16.648 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:16.648 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:16.648 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:26:16.906 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:16.906 "name": "Existed_Raid", 00:26:16.906 "uuid": "ea49c2ef-4986-4ccc-afbf-e75c98dcf924", 00:26:16.906 "strip_size_kb": 0, 00:26:16.906 "state": "online", 00:26:16.906 "raid_level": "raid1", 00:26:16.906 "superblock": true, 00:26:16.906 "num_base_bdevs": 2, 00:26:16.906 "num_base_bdevs_discovered": 1, 00:26:16.906 "num_base_bdevs_operational": 1, 00:26:16.906 "base_bdevs_list": [ 00:26:16.906 { 00:26:16.906 "name": null, 00:26:16.906 "uuid": 
"00000000-0000-0000-0000-000000000000", 00:26:16.906 "is_configured": false, 00:26:16.906 "data_offset": 256, 00:26:16.906 "data_size": 7936 00:26:16.906 }, 00:26:16.906 { 00:26:16.906 "name": "BaseBdev2", 00:26:16.906 "uuid": "a7094139-47b2-4ee8-ab6c-42a4a21ad9bd", 00:26:16.906 "is_configured": true, 00:26:16.906 "data_offset": 256, 00:26:16.906 "data_size": 7936 00:26:16.906 } 00:26:16.906 ] 00:26:16.906 }' 00:26:16.906 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:16.906 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:17.473 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:26:17.473 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:17.473 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.473 16:05:22 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:26:17.732 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:26:17.732 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:26:17.732 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:26:17.991 [2024-06-10 16:05:23.434853] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:26:17.991 [2024-06-10 16:05:23.434937] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:17.991 [2024-06-10 
16:05:23.446479] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:17.991 [2024-06-10 16:05:23.446515] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:17.991 [2024-06-10 16:05:23.446523] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x193d980 name Existed_Raid, state offline 00:26:17.991 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:26:17.991 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:26:17.991 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:17.991 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:26:18.250 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:26:18.250 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:26:18.250 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:26:18.250 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@341 -- # killprocess 2812398 00:26:18.250 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@949 -- # '[' -z 2812398 ']' 00:26:18.250 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@953 -- # kill -0 2812398 00:26:18.250 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # uname 00:26:18.250 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:18.250 16:05:23 
bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2812398 00:26:18.509 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:26:18.509 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:26:18.509 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2812398' 00:26:18.509 killing process with pid 2812398 00:26:18.509 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@968 -- # kill 2812398 00:26:18.509 [2024-06-10 16:05:23.768201] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:18.509 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@973 -- # wait 2812398 00:26:18.509 [2024-06-10 16:05:23.769046] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:18.509 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- bdev/bdev_raid.sh@343 -- # return 0 00:26:18.509 00:26:18.509 real 0m11.045s 00:26:18.509 user 0m20.123s 00:26:18.509 sys 0m1.632s 00:26:18.509 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:18.509 16:05:23 bdev_raid.raid_state_function_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:18.509 ************************************ 00:26:18.509 END TEST raid_state_function_test_sb_md_separate 00:26:18.509 ************************************ 00:26:18.509 16:05:24 bdev_raid -- bdev/bdev_raid.sh@906 -- # run_test raid_superblock_test_md_separate raid_superblock_test raid1 2 00:26:18.509 16:05:24 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:26:18.509 16:05:24 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:18.509 16:05:24 bdev_raid -- 
common/autotest_common.sh@10 -- # set +x 00:26:18.768 ************************************ 00:26:18.768 START TEST raid_superblock_test_md_separate 00:26:18.768 ************************************ 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@398 -- # local strip_size 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@401 -- # local raid_bdev 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:26:18.768 16:05:24 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@411 -- # raid_pid=2814436 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@412 -- # waitforlisten 2814436 /var/tmp/spdk-raid.sock 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@830 -- # '[' -z 2814436 ']' 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@835 -- # local max_retries=100 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:18.768 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:18.768 16:05:24 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:18.768 [2024-06-10 16:05:24.098627] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:26:18.768 [2024-06-10 16:05:24.098680] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2814436 ] 00:26:18.768 [2024-06-10 16:05:24.195729] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:19.027 [2024-06-10 16:05:24.289817] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:26:19.027 [2024-06-10 16:05:24.344702] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:19.027 [2024-06-10 16:05:24.344728] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:19.594 16:05:25 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:19.594 16:05:25 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@863 -- # return 0 00:26:19.594 16:05:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:26:19.594 16:05:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:19.594 16:05:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:26:19.594 16:05:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:26:19.594 16:05:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:26:19.594 16:05:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:19.594 16:05:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:19.594 16:05:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:19.594 16:05:25 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc1 00:26:19.853 malloc1 00:26:19.853 16:05:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:20.112 [2024-06-10 16:05:25.554299] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:20.112 [2024-06-10 16:05:25.554347] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:20.112 [2024-06-10 16:05:25.554366] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe0ebc0 00:26:20.112 [2024-06-10 16:05:25.554376] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:20.112 [2024-06-10 16:05:25.555890] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:20.112 [2024-06-10 16:05:25.555916] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:20.112 pt1 00:26:20.112 16:05:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:20.112 16:05:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:20.112 16:05:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:26:20.112 16:05:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:26:20.112 16:05:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:26:20.112 16:05:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:26:20.112 16:05:25 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:26:20.112 16:05:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:26:20.112 16:05:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b malloc2 00:26:20.371 malloc2 00:26:20.371 16:05:25 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:20.630 [2024-06-10 16:05:26.065099] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:20.630 [2024-06-10 16:05:26.065142] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:20.630 [2024-06-10 16:05:26.065157] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf8efb0 00:26:20.630 [2024-06-10 16:05:26.065167] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:20.630 [2024-06-10 16:05:26.066570] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:20.630 [2024-06-10 16:05:26.066594] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:20.630 pt2 00:26:20.630 16:05:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:26:20.630 16:05:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:26:20.630 16:05:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 00:26:20.888 [2024-06-10 16:05:26.317772] 
bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:20.888 [2024-06-10 16:05:26.319142] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:20.888 [2024-06-10 16:05:26.319297] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe0f390 00:26:20.888 [2024-06-10 16:05:26.319310] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:20.888 [2024-06-10 16:05:26.319375] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf8fcf0 00:26:20.888 [2024-06-10 16:05:26.319489] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe0f390 00:26:20.888 [2024-06-10 16:05:26.319498] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe0f390 00:26:20.888 [2024-06-10 16:05:26.319568] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:20.888 16:05:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:20.888 16:05:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:20.888 16:05:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:20.888 16:05:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:20.888 16:05:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:20.888 16:05:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:20.888 16:05:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:20.888 16:05:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:20.888 16:05:26 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:20.888 16:05:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:20.888 16:05:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:20.888 16:05:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:21.147 16:05:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:21.147 "name": "raid_bdev1", 00:26:21.147 "uuid": "18fa153e-bf00-492c-8515-8271041dc284", 00:26:21.147 "strip_size_kb": 0, 00:26:21.147 "state": "online", 00:26:21.147 "raid_level": "raid1", 00:26:21.147 "superblock": true, 00:26:21.147 "num_base_bdevs": 2, 00:26:21.147 "num_base_bdevs_discovered": 2, 00:26:21.147 "num_base_bdevs_operational": 2, 00:26:21.147 "base_bdevs_list": [ 00:26:21.147 { 00:26:21.147 "name": "pt1", 00:26:21.147 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:21.147 "is_configured": true, 00:26:21.147 "data_offset": 256, 00:26:21.147 "data_size": 7936 00:26:21.147 }, 00:26:21.147 { 00:26:21.147 "name": "pt2", 00:26:21.147 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:21.147 "is_configured": true, 00:26:21.147 "data_offset": 256, 00:26:21.147 "data_size": 7936 00:26:21.147 } 00:26:21.147 ] 00:26:21.147 }' 00:26:21.147 16:05:26 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:21.147 16:05:26 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:21.714 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:26:21.715 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local 
raid_bdev_name=raid_bdev1 00:26:21.715 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:21.715 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:21.715 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:21.715 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:26:21.974 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:21.974 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:21.974 [2024-06-10 16:05:27.461048] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:22.234 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:22.234 "name": "raid_bdev1", 00:26:22.234 "aliases": [ 00:26:22.234 "18fa153e-bf00-492c-8515-8271041dc284" 00:26:22.234 ], 00:26:22.234 "product_name": "Raid Volume", 00:26:22.234 "block_size": 4096, 00:26:22.234 "num_blocks": 7936, 00:26:22.234 "uuid": "18fa153e-bf00-492c-8515-8271041dc284", 00:26:22.234 "md_size": 32, 00:26:22.234 "md_interleave": false, 00:26:22.234 "dif_type": 0, 00:26:22.234 "assigned_rate_limits": { 00:26:22.234 "rw_ios_per_sec": 0, 00:26:22.234 "rw_mbytes_per_sec": 0, 00:26:22.234 "r_mbytes_per_sec": 0, 00:26:22.234 "w_mbytes_per_sec": 0 00:26:22.234 }, 00:26:22.234 "claimed": false, 00:26:22.234 "zoned": false, 00:26:22.234 "supported_io_types": { 00:26:22.234 "read": true, 00:26:22.234 "write": true, 00:26:22.234 "unmap": false, 00:26:22.234 "write_zeroes": true, 00:26:22.234 "flush": false, 00:26:22.234 "reset": true, 00:26:22.234 "compare": false, 00:26:22.234 "compare_and_write": false, 00:26:22.234 "abort": false, 
00:26:22.234 "nvme_admin": false, 00:26:22.234 "nvme_io": false 00:26:22.234 }, 00:26:22.234 "memory_domains": [ 00:26:22.234 { 00:26:22.234 "dma_device_id": "system", 00:26:22.234 "dma_device_type": 1 00:26:22.234 }, 00:26:22.234 { 00:26:22.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:22.234 "dma_device_type": 2 00:26:22.234 }, 00:26:22.234 { 00:26:22.234 "dma_device_id": "system", 00:26:22.234 "dma_device_type": 1 00:26:22.234 }, 00:26:22.234 { 00:26:22.234 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:22.234 "dma_device_type": 2 00:26:22.234 } 00:26:22.234 ], 00:26:22.234 "driver_specific": { 00:26:22.234 "raid": { 00:26:22.234 "uuid": "18fa153e-bf00-492c-8515-8271041dc284", 00:26:22.234 "strip_size_kb": 0, 00:26:22.234 "state": "online", 00:26:22.234 "raid_level": "raid1", 00:26:22.234 "superblock": true, 00:26:22.234 "num_base_bdevs": 2, 00:26:22.234 "num_base_bdevs_discovered": 2, 00:26:22.234 "num_base_bdevs_operational": 2, 00:26:22.234 "base_bdevs_list": [ 00:26:22.234 { 00:26:22.234 "name": "pt1", 00:26:22.234 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:22.234 "is_configured": true, 00:26:22.234 "data_offset": 256, 00:26:22.234 "data_size": 7936 00:26:22.234 }, 00:26:22.234 { 00:26:22.234 "name": "pt2", 00:26:22.234 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:22.234 "is_configured": true, 00:26:22.234 "data_offset": 256, 00:26:22.234 "data_size": 7936 00:26:22.234 } 00:26:22.234 ] 00:26:22.234 } 00:26:22.234 } 00:26:22.234 }' 00:26:22.234 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:22.234 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:22.234 pt2' 00:26:22.234 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:22.234 16:05:27 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:22.234 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:22.493 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:22.493 "name": "pt1", 00:26:22.493 "aliases": [ 00:26:22.493 "00000000-0000-0000-0000-000000000001" 00:26:22.493 ], 00:26:22.493 "product_name": "passthru", 00:26:22.493 "block_size": 4096, 00:26:22.493 "num_blocks": 8192, 00:26:22.493 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:22.493 "md_size": 32, 00:26:22.493 "md_interleave": false, 00:26:22.493 "dif_type": 0, 00:26:22.493 "assigned_rate_limits": { 00:26:22.493 "rw_ios_per_sec": 0, 00:26:22.493 "rw_mbytes_per_sec": 0, 00:26:22.493 "r_mbytes_per_sec": 0, 00:26:22.493 "w_mbytes_per_sec": 0 00:26:22.493 }, 00:26:22.493 "claimed": true, 00:26:22.493 "claim_type": "exclusive_write", 00:26:22.493 "zoned": false, 00:26:22.493 "supported_io_types": { 00:26:22.493 "read": true, 00:26:22.493 "write": true, 00:26:22.493 "unmap": true, 00:26:22.493 "write_zeroes": true, 00:26:22.493 "flush": true, 00:26:22.493 "reset": true, 00:26:22.493 "compare": false, 00:26:22.493 "compare_and_write": false, 00:26:22.493 "abort": true, 00:26:22.493 "nvme_admin": false, 00:26:22.493 "nvme_io": false 00:26:22.493 }, 00:26:22.493 "memory_domains": [ 00:26:22.493 { 00:26:22.493 "dma_device_id": "system", 00:26:22.493 "dma_device_type": 1 00:26:22.493 }, 00:26:22.493 { 00:26:22.493 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:22.493 "dma_device_type": 2 00:26:22.493 } 00:26:22.493 ], 00:26:22.493 "driver_specific": { 00:26:22.493 "passthru": { 00:26:22.493 "name": "pt1", 00:26:22.494 "base_bdev_name": "malloc1" 00:26:22.494 } 00:26:22.494 } 00:26:22.494 }' 00:26:22.494 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:26:22.494 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:22.494 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:22.494 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:22.494 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:22.494 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:22.494 16:05:27 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:22.753 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:22.753 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:26:22.753 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:22.753 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:22.753 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:22.753 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:22.753 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:22.753 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:23.012 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:23.012 "name": "pt2", 00:26:23.012 "aliases": [ 00:26:23.012 "00000000-0000-0000-0000-000000000002" 00:26:23.012 ], 00:26:23.012 "product_name": "passthru", 00:26:23.012 "block_size": 4096, 00:26:23.012 "num_blocks": 8192, 
00:26:23.012 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:23.012 "md_size": 32, 00:26:23.012 "md_interleave": false, 00:26:23.012 "dif_type": 0, 00:26:23.012 "assigned_rate_limits": { 00:26:23.012 "rw_ios_per_sec": 0, 00:26:23.012 "rw_mbytes_per_sec": 0, 00:26:23.012 "r_mbytes_per_sec": 0, 00:26:23.012 "w_mbytes_per_sec": 0 00:26:23.012 }, 00:26:23.012 "claimed": true, 00:26:23.012 "claim_type": "exclusive_write", 00:26:23.012 "zoned": false, 00:26:23.012 "supported_io_types": { 00:26:23.012 "read": true, 00:26:23.012 "write": true, 00:26:23.012 "unmap": true, 00:26:23.012 "write_zeroes": true, 00:26:23.012 "flush": true, 00:26:23.012 "reset": true, 00:26:23.012 "compare": false, 00:26:23.012 "compare_and_write": false, 00:26:23.012 "abort": true, 00:26:23.012 "nvme_admin": false, 00:26:23.012 "nvme_io": false 00:26:23.012 }, 00:26:23.012 "memory_domains": [ 00:26:23.012 { 00:26:23.012 "dma_device_id": "system", 00:26:23.012 "dma_device_type": 1 00:26:23.012 }, 00:26:23.012 { 00:26:23.012 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:23.012 "dma_device_type": 2 00:26:23.012 } 00:26:23.012 ], 00:26:23.012 "driver_specific": { 00:26:23.012 "passthru": { 00:26:23.012 "name": "pt2", 00:26:23.012 "base_bdev_name": "malloc2" 00:26:23.012 } 00:26:23.012 } 00:26:23.012 }' 00:26:23.012 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:23.012 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:23.012 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:23.012 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:23.271 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:23.271 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:23.271 16:05:28 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:23.271 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:23.271 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:26:23.271 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:23.271 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:23.530 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:23.530 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:23.530 16:05:28 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:26:23.530 [2024-06-10 16:05:29.029252] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:23.789 16:05:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=18fa153e-bf00-492c-8515-8271041dc284 00:26:23.789 16:05:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@435 -- # '[' -z 18fa153e-bf00-492c-8515-8271041dc284 ']' 00:26:23.789 16:05:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:23.789 [2024-06-10 16:05:29.281685] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:23.789 [2024-06-10 16:05:29.281703] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:23.790 [2024-06-10 16:05:29.281756] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:23.790 [2024-06-10 
16:05:29.281807] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:23.790 [2024-06-10 16:05:29.281817] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe0f390 name raid_bdev1, state offline 00:26:24.049 16:05:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:24.049 16:05:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:26:24.049 16:05:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:26:24.049 16:05:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:26:24.049 16:05:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:24.049 16:05:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:24.307 16:05:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:26:24.307 16:05:29 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:24.566 16:05:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:26:24.566 16:05:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # jq -r '[.[] | select(.product_name == "passthru")] | any' 00:26:24.825 16:05:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:26:24.825 16:05:30 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:24.825 16:05:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@649 -- # local es=0 00:26:24.825 16:05:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:24.825 16:05:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:24.825 16:05:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:26:24.825 16:05:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:24.825 16:05:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:26:24.825 16:05:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:24.825 16:05:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:26:24.825 16:05:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:26:24.825 16:05:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:26:24.825 16:05:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@652 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:26:25.085 [2024-06-10 16:05:30.553026] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:26:25.085 [2024-06-10 16:05:30.554451] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:26:25.085 [2024-06-10 16:05:30.554505] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:26:25.085 [2024-06-10 16:05:30.554542] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:26:25.085 [2024-06-10 16:05:30.554564] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:25.085 [2024-06-10 16:05:30.554573] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf90580 name raid_bdev1, state configuring 00:26:25.085 request: 00:26:25.085 { 00:26:25.085 "name": "raid_bdev1", 00:26:25.085 "raid_level": "raid1", 00:26:25.085 "base_bdevs": [ 00:26:25.085 "malloc1", 00:26:25.085 "malloc2" 00:26:25.085 ], 00:26:25.085 "superblock": false, 00:26:25.085 "method": "bdev_raid_create", 00:26:25.085 "req_id": 1 00:26:25.085 } 00:26:25.085 Got JSON-RPC error response 00:26:25.085 response: 00:26:25.085 { 00:26:25.085 "code": -17, 00:26:25.085 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:26:25.085 } 00:26:25.085 16:05:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@652 -- # es=1 00:26:25.085 16:05:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:26:25.085 16:05:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:26:25.085 16:05:30 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:26:25.085 16:05:30 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:26:25.085 16:05:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.344 16:05:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:26:25.344 16:05:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:26:25.344 16:05:30 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:25.603 [2024-06-10 16:05:31.054296] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:25.603 [2024-06-10 16:05:31.054334] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:25.603 [2024-06-10 16:05:31.054349] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe0edf0 00:26:25.603 [2024-06-10 16:05:31.054359] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:25.603 [2024-06-10 16:05:31.055830] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:25.603 [2024-06-10 16:05:31.055854] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:25.603 [2024-06-10 16:05:31.055895] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:25.603 [2024-06-10 16:05:31.055917] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:25.603 pt1 00:26:25.603 16:05:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:26:25.603 16:05:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # 
local raid_bdev_name=raid_bdev1 00:26:25.603 16:05:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:26:25.603 16:05:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:25.603 16:05:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:25.603 16:05:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:25.603 16:05:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:25.603 16:05:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:25.603 16:05:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:25.603 16:05:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:25.603 16:05:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:25.603 16:05:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:25.862 16:05:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:25.862 "name": "raid_bdev1", 00:26:25.862 "uuid": "18fa153e-bf00-492c-8515-8271041dc284", 00:26:25.862 "strip_size_kb": 0, 00:26:25.862 "state": "configuring", 00:26:25.862 "raid_level": "raid1", 00:26:25.862 "superblock": true, 00:26:25.862 "num_base_bdevs": 2, 00:26:25.862 "num_base_bdevs_discovered": 1, 00:26:25.862 "num_base_bdevs_operational": 2, 00:26:25.862 "base_bdevs_list": [ 00:26:25.862 { 00:26:25.862 "name": "pt1", 00:26:25.862 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:25.862 "is_configured": true, 00:26:25.862 
"data_offset": 256, 00:26:25.862 "data_size": 7936 00:26:25.862 }, 00:26:25.862 { 00:26:25.862 "name": null, 00:26:25.862 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:25.862 "is_configured": false, 00:26:25.862 "data_offset": 256, 00:26:25.862 "data_size": 7936 00:26:25.862 } 00:26:25.862 ] 00:26:25.862 }' 00:26:25.862 16:05:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:25.862 16:05:31 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:26.438 16:05:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:26:26.438 16:05:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:26:26.438 16:05:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:26.438 16:05:31 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:26.701 [2024-06-10 16:05:32.169350] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:26.701 [2024-06-10 16:05:32.169396] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:26.701 [2024-06-10 16:05:32.169411] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xe0cf80 00:26:26.701 [2024-06-10 16:05:32.169420] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:26.701 [2024-06-10 16:05:32.169602] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:26.701 [2024-06-10 16:05:32.169615] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:26.701 [2024-06-10 16:05:32.169655] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:26.701 
[2024-06-10 16:05:32.169671] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:26.701 [2024-06-10 16:05:32.169763] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf92610 00:26:26.701 [2024-06-10 16:05:32.169771] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:26.701 [2024-06-10 16:05:32.169824] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf939e0 00:26:26.701 [2024-06-10 16:05:32.169929] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf92610 00:26:26.701 [2024-06-10 16:05:32.169937] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf92610 00:26:26.701 [2024-06-10 16:05:32.170020] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:26.701 pt2 00:26:26.701 16:05:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:26:26.701 16:05:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:26:26.702 16:05:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:26.702 16:05:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:26.702 16:05:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:26.702 16:05:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:26.702 16:05:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:26.702 16:05:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:26.702 16:05:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:26.702 16:05:32 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:26.702 16:05:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:26.702 16:05:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:26.702 16:05:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:26.702 16:05:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:26.960 16:05:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:26.960 "name": "raid_bdev1", 00:26:26.960 "uuid": "18fa153e-bf00-492c-8515-8271041dc284", 00:26:26.960 "strip_size_kb": 0, 00:26:26.960 "state": "online", 00:26:26.960 "raid_level": "raid1", 00:26:26.960 "superblock": true, 00:26:26.960 "num_base_bdevs": 2, 00:26:26.960 "num_base_bdevs_discovered": 2, 00:26:26.960 "num_base_bdevs_operational": 2, 00:26:26.960 "base_bdevs_list": [ 00:26:26.960 { 00:26:26.960 "name": "pt1", 00:26:26.960 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:26.960 "is_configured": true, 00:26:26.960 "data_offset": 256, 00:26:26.960 "data_size": 7936 00:26:26.960 }, 00:26:26.960 { 00:26:26.960 "name": "pt2", 00:26:26.960 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:26.960 "is_configured": true, 00:26:26.960 "data_offset": 256, 00:26:26.960 "data_size": 7936 00:26:26.960 } 00:26:26.960 ] 00:26:26.960 }' 00:26:26.960 16:05:32 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:26.960 16:05:32 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:27.528 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 
00:26:27.528 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:26:27.528 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:26:27.528 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:26:27.528 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:26:27.528 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@198 -- # local name 00:26:27.528 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:27.528 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:26:27.787 [2024-06-10 16:05:33.260520] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:27.787 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:26:27.787 "name": "raid_bdev1", 00:26:27.787 "aliases": [ 00:26:27.787 "18fa153e-bf00-492c-8515-8271041dc284" 00:26:27.787 ], 00:26:27.787 "product_name": "Raid Volume", 00:26:27.787 "block_size": 4096, 00:26:27.787 "num_blocks": 7936, 00:26:27.787 "uuid": "18fa153e-bf00-492c-8515-8271041dc284", 00:26:27.787 "md_size": 32, 00:26:27.787 "md_interleave": false, 00:26:27.787 "dif_type": 0, 00:26:27.787 "assigned_rate_limits": { 00:26:27.787 "rw_ios_per_sec": 0, 00:26:27.787 "rw_mbytes_per_sec": 0, 00:26:27.787 "r_mbytes_per_sec": 0, 00:26:27.787 "w_mbytes_per_sec": 0 00:26:27.787 }, 00:26:27.787 "claimed": false, 00:26:27.787 "zoned": false, 00:26:27.787 "supported_io_types": { 00:26:27.787 "read": true, 00:26:27.787 "write": true, 00:26:27.787 "unmap": false, 00:26:27.787 "write_zeroes": true, 00:26:27.787 "flush": false, 00:26:27.787 "reset": true, 
00:26:27.787 "compare": false, 00:26:27.787 "compare_and_write": false, 00:26:27.787 "abort": false, 00:26:27.787 "nvme_admin": false, 00:26:27.787 "nvme_io": false 00:26:27.787 }, 00:26:27.787 "memory_domains": [ 00:26:27.787 { 00:26:27.787 "dma_device_id": "system", 00:26:27.787 "dma_device_type": 1 00:26:27.787 }, 00:26:27.787 { 00:26:27.787 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:27.787 "dma_device_type": 2 00:26:27.787 }, 00:26:27.787 { 00:26:27.787 "dma_device_id": "system", 00:26:27.787 "dma_device_type": 1 00:26:27.787 }, 00:26:27.787 { 00:26:27.787 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:27.787 "dma_device_type": 2 00:26:27.787 } 00:26:27.787 ], 00:26:27.787 "driver_specific": { 00:26:27.787 "raid": { 00:26:27.787 "uuid": "18fa153e-bf00-492c-8515-8271041dc284", 00:26:27.787 "strip_size_kb": 0, 00:26:27.787 "state": "online", 00:26:27.787 "raid_level": "raid1", 00:26:27.787 "superblock": true, 00:26:27.787 "num_base_bdevs": 2, 00:26:27.787 "num_base_bdevs_discovered": 2, 00:26:27.787 "num_base_bdevs_operational": 2, 00:26:27.787 "base_bdevs_list": [ 00:26:27.787 { 00:26:27.787 "name": "pt1", 00:26:27.787 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:27.787 "is_configured": true, 00:26:27.787 "data_offset": 256, 00:26:27.787 "data_size": 7936 00:26:27.787 }, 00:26:27.787 { 00:26:27.787 "name": "pt2", 00:26:27.787 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:27.787 "is_configured": true, 00:26:27.787 "data_offset": 256, 00:26:27.787 "data_size": 7936 00:26:27.787 } 00:26:27.787 ] 00:26:27.787 } 00:26:27.787 } 00:26:27.787 }' 00:26:27.787 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:26:28.046 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:26:28.046 pt2' 00:26:28.046 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # 
for name in $base_bdev_names 00:26:28.046 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:28.046 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:26:28.304 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:28.304 "name": "pt1", 00:26:28.304 "aliases": [ 00:26:28.304 "00000000-0000-0000-0000-000000000001" 00:26:28.304 ], 00:26:28.304 "product_name": "passthru", 00:26:28.304 "block_size": 4096, 00:26:28.304 "num_blocks": 8192, 00:26:28.304 "uuid": "00000000-0000-0000-0000-000000000001", 00:26:28.304 "md_size": 32, 00:26:28.304 "md_interleave": false, 00:26:28.304 "dif_type": 0, 00:26:28.304 "assigned_rate_limits": { 00:26:28.304 "rw_ios_per_sec": 0, 00:26:28.304 "rw_mbytes_per_sec": 0, 00:26:28.304 "r_mbytes_per_sec": 0, 00:26:28.304 "w_mbytes_per_sec": 0 00:26:28.304 }, 00:26:28.304 "claimed": true, 00:26:28.304 "claim_type": "exclusive_write", 00:26:28.304 "zoned": false, 00:26:28.304 "supported_io_types": { 00:26:28.304 "read": true, 00:26:28.304 "write": true, 00:26:28.304 "unmap": true, 00:26:28.304 "write_zeroes": true, 00:26:28.304 "flush": true, 00:26:28.304 "reset": true, 00:26:28.304 "compare": false, 00:26:28.304 "compare_and_write": false, 00:26:28.304 "abort": true, 00:26:28.304 "nvme_admin": false, 00:26:28.304 "nvme_io": false 00:26:28.304 }, 00:26:28.304 "memory_domains": [ 00:26:28.304 { 00:26:28.304 "dma_device_id": "system", 00:26:28.304 "dma_device_type": 1 00:26:28.304 }, 00:26:28.304 { 00:26:28.304 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:28.304 "dma_device_type": 2 00:26:28.304 } 00:26:28.304 ], 00:26:28.304 "driver_specific": { 00:26:28.304 "passthru": { 00:26:28.304 "name": "pt1", 00:26:28.304 "base_bdev_name": "malloc1" 00:26:28.304 } 00:26:28.304 } 00:26:28.304 }' 00:26:28.304 16:05:33 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:28.304 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:28.304 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:28.304 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:28.304 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:28.304 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:28.304 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:28.563 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:28.563 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:26:28.563 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:28.563 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:28.563 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:28.563 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:26:28.563 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:26:28.563 16:05:33 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:26:28.823 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:26:28.823 "name": "pt2", 00:26:28.823 "aliases": [ 00:26:28.823 "00000000-0000-0000-0000-000000000002" 00:26:28.823 ], 00:26:28.823 
"product_name": "passthru", 00:26:28.823 "block_size": 4096, 00:26:28.823 "num_blocks": 8192, 00:26:28.823 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:28.823 "md_size": 32, 00:26:28.823 "md_interleave": false, 00:26:28.823 "dif_type": 0, 00:26:28.823 "assigned_rate_limits": { 00:26:28.823 "rw_ios_per_sec": 0, 00:26:28.823 "rw_mbytes_per_sec": 0, 00:26:28.823 "r_mbytes_per_sec": 0, 00:26:28.823 "w_mbytes_per_sec": 0 00:26:28.823 }, 00:26:28.823 "claimed": true, 00:26:28.823 "claim_type": "exclusive_write", 00:26:28.823 "zoned": false, 00:26:28.823 "supported_io_types": { 00:26:28.823 "read": true, 00:26:28.823 "write": true, 00:26:28.823 "unmap": true, 00:26:28.823 "write_zeroes": true, 00:26:28.823 "flush": true, 00:26:28.823 "reset": true, 00:26:28.823 "compare": false, 00:26:28.823 "compare_and_write": false, 00:26:28.823 "abort": true, 00:26:28.823 "nvme_admin": false, 00:26:28.823 "nvme_io": false 00:26:28.823 }, 00:26:28.823 "memory_domains": [ 00:26:28.823 { 00:26:28.823 "dma_device_id": "system", 00:26:28.823 "dma_device_type": 1 00:26:28.823 }, 00:26:28.823 { 00:26:28.823 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:26:28.823 "dma_device_type": 2 00:26:28.823 } 00:26:28.823 ], 00:26:28.823 "driver_specific": { 00:26:28.823 "passthru": { 00:26:28.823 "name": "pt2", 00:26:28.823 "base_bdev_name": "malloc2" 00:26:28.823 } 00:26:28.823 } 00:26:28.823 }' 00:26:28.823 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:28.823 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:26:28.823 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@205 -- # [[ 4096 == 4096 ]] 00:26:28.823 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:28.823 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:26:29.106 16:05:34 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:26:29.106 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:29.106 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:26:29.106 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@207 -- # [[ false == false ]] 00:26:29.106 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:29.106 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:26:29.106 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:26:29.106 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:29.107 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:26:29.382 [2024-06-10 16:05:34.796645] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:29.382 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@486 -- # '[' 18fa153e-bf00-492c-8515-8271041dc284 '!=' 18fa153e-bf00-492c-8515-8271041dc284 ']' 00:26:29.382 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@490 -- # has_redundancy raid1 00:26:29.382 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@213 -- # case $1 in 00:26:29.382 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@214 -- # return 0 00:26:29.382 16:05:34 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:26:29.641 [2024-06-10 16:05:35.053125] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:26:29.641 16:05:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:29.641 16:05:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:29.641 16:05:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:29.641 16:05:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:29.641 16:05:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:29.641 16:05:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:29.641 16:05:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:29.641 16:05:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:29.641 16:05:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:29.641 16:05:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:29.641 16:05:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:29.641 16:05:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:29.900 16:05:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:29.900 "name": "raid_bdev1", 00:26:29.900 "uuid": "18fa153e-bf00-492c-8515-8271041dc284", 00:26:29.900 "strip_size_kb": 0, 00:26:29.900 "state": "online", 00:26:29.900 "raid_level": "raid1", 00:26:29.900 "superblock": true, 00:26:29.900 
"num_base_bdevs": 2, 00:26:29.900 "num_base_bdevs_discovered": 1, 00:26:29.900 "num_base_bdevs_operational": 1, 00:26:29.900 "base_bdevs_list": [ 00:26:29.900 { 00:26:29.900 "name": null, 00:26:29.900 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:29.900 "is_configured": false, 00:26:29.900 "data_offset": 256, 00:26:29.900 "data_size": 7936 00:26:29.900 }, 00:26:29.900 { 00:26:29.900 "name": "pt2", 00:26:29.900 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:29.900 "is_configured": true, 00:26:29.900 "data_offset": 256, 00:26:29.900 "data_size": 7936 00:26:29.900 } 00:26:29.900 ] 00:26:29.900 }' 00:26:29.900 16:05:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:29.900 16:05:35 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:30.468 16:05:35 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:30.727 [2024-06-10 16:05:36.188141] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:30.727 [2024-06-10 16:05:36.188165] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:30.727 [2024-06-10 16:05:36.188215] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:30.727 [2024-06-10 16:05:36.188260] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:30.727 [2024-06-10 16:05:36.188270] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf92610 name raid_bdev1, state offline 00:26:30.727 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:30.727 16:05:36 bdev_raid.raid_superblock_test_md_separate -- 
bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:26:30.986 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:26:30.986 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:26:30.986 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:26:30.986 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:26:30.986 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:26:31.245 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:26:31.245 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:26:31.245 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:26:31.245 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:26:31.245 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@518 -- # i=1 00:26:31.245 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:26:31.504 [2024-06-10 16:05:36.962193] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:26:31.504 [2024-06-10 16:05:36.962232] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:31.504 [2024-06-10 16:05:36.962250] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf8fa70 00:26:31.504 [2024-06-10 16:05:36.962259] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:31.504 
[2024-06-10 16:05:36.963769] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:31.504 [2024-06-10 16:05:36.963794] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:26:31.504 [2024-06-10 16:05:36.963837] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:26:31.504 [2024-06-10 16:05:36.963859] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:31.504 [2024-06-10 16:05:36.963935] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf92d50 00:26:31.504 [2024-06-10 16:05:36.963943] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:31.504 [2024-06-10 16:05:36.964008] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe0de30 00:26:31.504 [2024-06-10 16:05:36.964108] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf92d50 00:26:31.504 [2024-06-10 16:05:36.964116] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xf92d50 00:26:31.504 [2024-06-10 16:05:36.964181] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:31.504 pt2 00:26:31.504 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:31.504 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:31.504 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:31.504 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:31.504 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:31.504 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=1 00:26:31.504 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:31.504 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:31.504 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:31.504 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:31.504 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:31.504 16:05:36 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:31.763 16:05:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:31.763 "name": "raid_bdev1", 00:26:31.763 "uuid": "18fa153e-bf00-492c-8515-8271041dc284", 00:26:31.763 "strip_size_kb": 0, 00:26:31.763 "state": "online", 00:26:31.763 "raid_level": "raid1", 00:26:31.763 "superblock": true, 00:26:31.763 "num_base_bdevs": 2, 00:26:31.764 "num_base_bdevs_discovered": 1, 00:26:31.764 "num_base_bdevs_operational": 1, 00:26:31.764 "base_bdevs_list": [ 00:26:31.764 { 00:26:31.764 "name": null, 00:26:31.764 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:31.764 "is_configured": false, 00:26:31.764 "data_offset": 256, 00:26:31.764 "data_size": 7936 00:26:31.764 }, 00:26:31.764 { 00:26:31.764 "name": "pt2", 00:26:31.764 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:31.764 "is_configured": true, 00:26:31.764 "data_offset": 256, 00:26:31.764 "data_size": 7936 00:26:31.764 } 00:26:31.764 ] 00:26:31.764 }' 00:26:31.764 16:05:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:31.764 16:05:37 bdev_raid.raid_superblock_test_md_separate -- 
common/autotest_common.sh@10 -- # set +x 00:26:32.699 16:05:37 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:32.699 [2024-06-10 16:05:38.113272] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:32.699 [2024-06-10 16:05:38.113297] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:32.700 [2024-06-10 16:05:38.113349] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:32.700 [2024-06-10 16:05:38.113393] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:32.700 [2024-06-10 16:05:38.113407] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf92d50 name raid_bdev1, state offline 00:26:32.700 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:32.700 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:26:32.958 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:26:32.958 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:26:32.958 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:26:32.958 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:26:33.218 [2024-06-10 16:05:38.618595] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:26:33.218 [2024-06-10 16:05:38.618640] 
vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:33.218 [2024-06-10 16:05:38.618654] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xf8f1e0 00:26:33.218 [2024-06-10 16:05:38.618664] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:33.218 [2024-06-10 16:05:38.620187] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:33.218 [2024-06-10 16:05:38.620212] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:26:33.218 [2024-06-10 16:05:38.620255] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:26:33.218 [2024-06-10 16:05:38.620277] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:26:33.218 [2024-06-10 16:05:38.620373] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:26:33.218 [2024-06-10 16:05:38.620384] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:33.218 [2024-06-10 16:05:38.620396] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf93a30 name raid_bdev1, state configuring 00:26:33.218 [2024-06-10 16:05:38.620417] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:26:33.218 [2024-06-10 16:05:38.620468] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xf93a30 00:26:33.218 [2024-06-10 16:05:38.620476] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:33.218 [2024-06-10 16:05:38.620531] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xf94230 00:26:33.218 [2024-06-10 16:05:38.620631] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xf93a30 00:26:33.218 [2024-06-10 16:05:38.620639] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, 
raid_bdev 0xf93a30 00:26:33.218 [2024-06-10 16:05:38.620714] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:33.218 pt1 00:26:33.218 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:26:33.218 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:33.218 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:33.218 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:33.218 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:33.218 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:33.218 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:33.218 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:33.218 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:33.218 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:33.218 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:33.218 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:33.218 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:33.477 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:33.477 "name": "raid_bdev1", 00:26:33.477 "uuid": 
"18fa153e-bf00-492c-8515-8271041dc284", 00:26:33.477 "strip_size_kb": 0, 00:26:33.477 "state": "online", 00:26:33.477 "raid_level": "raid1", 00:26:33.477 "superblock": true, 00:26:33.477 "num_base_bdevs": 2, 00:26:33.477 "num_base_bdevs_discovered": 1, 00:26:33.477 "num_base_bdevs_operational": 1, 00:26:33.477 "base_bdevs_list": [ 00:26:33.477 { 00:26:33.477 "name": null, 00:26:33.477 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:33.477 "is_configured": false, 00:26:33.477 "data_offset": 256, 00:26:33.477 "data_size": 7936 00:26:33.477 }, 00:26:33.477 { 00:26:33.477 "name": "pt2", 00:26:33.477 "uuid": "00000000-0000-0000-0000-000000000002", 00:26:33.477 "is_configured": true, 00:26:33.477 "data_offset": 256, 00:26:33.477 "data_size": 7936 00:26:33.477 } 00:26:33.477 ] 00:26:33.477 }' 00:26:33.477 16:05:38 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:33.477 16:05:38 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:34.045 16:05:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:26:34.045 16:05:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:26:34.304 16:05:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:26:34.304 16:05:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:34.304 16:05:39 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:26:34.564 [2024-06-10 16:05:39.998639] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:34.564 16:05:40 
bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@557 -- # '[' 18fa153e-bf00-492c-8515-8271041dc284 '!=' 18fa153e-bf00-492c-8515-8271041dc284 ']' 00:26:34.564 16:05:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@562 -- # killprocess 2814436 00:26:34.564 16:05:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@949 -- # '[' -z 2814436 ']' 00:26:34.564 16:05:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@953 -- # kill -0 2814436 00:26:34.564 16:05:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # uname 00:26:34.564 16:05:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:26:34.564 16:05:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2814436 00:26:34.564 16:05:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:26:34.564 16:05:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:26:34.564 16:05:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2814436' 00:26:34.564 killing process with pid 2814436 00:26:34.564 16:05:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@968 -- # kill 2814436 00:26:34.564 [2024-06-10 16:05:40.064643] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:26:34.564 [2024-06-10 16:05:40.064697] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:34.564 [2024-06-10 16:05:40.064739] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:34.564 [2024-06-10 16:05:40.064748] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xf93a30 name raid_bdev1, state offline 00:26:34.564 16:05:40 
bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@973 -- # wait 2814436 00:26:34.823 [2024-06-10 16:05:40.084582] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:26:34.823 16:05:40 bdev_raid.raid_superblock_test_md_separate -- bdev/bdev_raid.sh@564 -- # return 0 00:26:34.823 00:26:34.823 real 0m16.242s 00:26:34.823 user 0m30.161s 00:26:34.823 sys 0m2.292s 00:26:34.823 16:05:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@1125 -- # xtrace_disable 00:26:34.823 16:05:40 bdev_raid.raid_superblock_test_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:34.823 ************************************ 00:26:34.823 END TEST raid_superblock_test_md_separate 00:26:34.823 ************************************ 00:26:34.823 16:05:40 bdev_raid -- bdev/bdev_raid.sh@907 -- # '[' true = true ']' 00:26:34.823 16:05:40 bdev_raid -- bdev/bdev_raid.sh@908 -- # run_test raid_rebuild_test_sb_md_separate raid_rebuild_test raid1 2 true false true 00:26:34.823 16:05:40 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:26:34.823 16:05:40 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:26:34.823 16:05:40 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:26:35.083 ************************************ 00:26:35.083 START TEST raid_rebuild_test_sb_md_separate 00:26:35.083 ************************************ 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false true 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@571 
-- # local background_io=false 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@572 -- # local verify=true 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@575 -- # local strip_size 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@576 -- # local create_arg 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@578 -- # local data_offset 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 
00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@596 -- # raid_pid=2817278 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@597 -- # waitforlisten 2817278 /var/tmp/spdk-raid.sock 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@830 -- # '[' -z 2817278 ']' 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@835 -- # local max_retries=100 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:26:35.083 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@839 -- # xtrace_disable 00:26:35.083 16:05:40 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:35.083 [2024-06-10 16:05:40.418423] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:26:35.083 [2024-06-10 16:05:40.418476] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2817278 ] 00:26:35.083 I/O size of 3145728 is greater than zero copy threshold (65536). 00:26:35.083 Zero copy mechanism will not be used. 00:26:35.083 [2024-06-10 16:05:40.519164] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:26:35.342 [2024-06-10 16:05:40.614756] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:26:35.342 [2024-06-10 16:05:40.677533] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:35.342 [2024-06-10 16:05:40.677566] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:26:35.910 16:05:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:26:35.910 16:05:41 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@863 -- # return 0 00:26:35.910 16:05:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:35.910 16:05:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev1_malloc 00:26:36.170 BaseBdev1_malloc 00:26:36.170 16:05:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:26:36.429 [2024-06-10 16:05:41.880242] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:26:36.429 [2024-06-10 16:05:41.880286] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:36.429 [2024-06-10 16:05:41.880307] 
vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2681890 00:26:36.429 [2024-06-10 16:05:41.880318] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:36.429 [2024-06-10 16:05:41.881806] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:36.429 [2024-06-10 16:05:41.881833] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:26:36.429 BaseBdev1 00:26:36.429 16:05:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:26:36.429 16:05:41 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b BaseBdev2_malloc 00:26:36.689 BaseBdev2_malloc 00:26:36.689 16:05:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:26:36.955 [2024-06-10 16:05:42.403068] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:26:36.955 [2024-06-10 16:05:42.403109] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:36.955 [2024-06-10 16:05:42.403133] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27d86b0 00:26:36.955 [2024-06-10 16:05:42.403144] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:36.955 [2024-06-10 16:05:42.404627] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:36.955 [2024-06-10 16:05:42.404652] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:26:36.955 BaseBdev2 00:26:36.955 16:05:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@606 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -b spare_malloc 00:26:37.213 spare_malloc 00:26:37.213 16:05:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay -r 0 -t 0 -w 100000 -n 100000 00:26:37.473 spare_delay 00:26:37.473 16:05:42 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:37.732 [2024-06-10 16:05:43.170351] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:37.732 [2024-06-10 16:05:43.170391] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:37.732 [2024-06-10 16:05:43.170410] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x27c48a0 00:26:37.732 [2024-06-10 16:05:43.170420] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:37.732 [2024-06-10 16:05:43.171855] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:37.732 [2024-06-10 16:05:43.171881] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:37.732 spare 00:26:37.732 16:05:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:26:37.991 [2024-06-10 16:05:43.423055] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:26:37.991 [2024-06-10 16:05:43.424423] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:37.991 [2024-06-10 16:05:43.424582] 
bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x27c5ea0 00:26:37.991 [2024-06-10 16:05:43.424595] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:37.991 [2024-06-10 16:05:43.424661] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x267f9e0 00:26:37.991 [2024-06-10 16:05:43.424776] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x27c5ea0 00:26:37.991 [2024-06-10 16:05:43.424785] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x27c5ea0 00:26:37.991 [2024-06-10 16:05:43.424852] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:37.991 16:05:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:37.991 16:05:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:37.991 16:05:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:37.991 16:05:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:37.991 16:05:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:37.991 16:05:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:37.991 16:05:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:37.991 16:05:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:37.991 16:05:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:37.991 16:05:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:37.991 16:05:43 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:37.991 16:05:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:38.251 16:05:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:38.251 "name": "raid_bdev1", 00:26:38.251 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:26:38.251 "strip_size_kb": 0, 00:26:38.251 "state": "online", 00:26:38.251 "raid_level": "raid1", 00:26:38.251 "superblock": true, 00:26:38.251 "num_base_bdevs": 2, 00:26:38.251 "num_base_bdevs_discovered": 2, 00:26:38.251 "num_base_bdevs_operational": 2, 00:26:38.251 "base_bdevs_list": [ 00:26:38.251 { 00:26:38.251 "name": "BaseBdev1", 00:26:38.251 "uuid": "5fe16da4-7d01-5e92-b2b1-7cc9bf00cede", 00:26:38.251 "is_configured": true, 00:26:38.251 "data_offset": 256, 00:26:38.251 "data_size": 7936 00:26:38.251 }, 00:26:38.251 { 00:26:38.251 "name": "BaseBdev2", 00:26:38.251 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:26:38.251 "is_configured": true, 00:26:38.251 "data_offset": 256, 00:26:38.251 "data_size": 7936 00:26:38.251 } 00:26:38.251 ] 00:26:38.251 }' 00:26:38.251 16:05:43 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:38.251 16:05:43 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:39.187 16:05:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:26:39.187 16:05:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:26:39.187 [2024-06-10 16:05:44.562327] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:26:39.187 
16:05:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:26:39.187 16:05:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:39.187 16:05:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # jq -r '.[].base_bdevs_list[0].data_offset' 00:26:39.447 16:05:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:26:39.447 16:05:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:26:39.447 16:05:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@623 -- # '[' true = true ']' 00:26:39.447 16:05:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@624 -- # local write_unit_size 00:26:39.447 16:05:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@627 -- # nbd_start_disks /var/tmp/spdk-raid.sock raid_bdev1 /dev/nbd0 00:26:39.447 16:05:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:39.447 16:05:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('raid_bdev1') 00:26:39.447 16:05:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:39.447 16:05:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0') 00:26:39.447 16:05:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:39.447 16:05:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:26:39.447 16:05:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:39.447 16:05:44 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:39.447 16:05:44 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk raid_bdev1 /dev/nbd0 00:26:39.707 [2024-06-10 16:05:45.075526] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27c6f40 00:26:39.707 /dev/nbd0 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local i 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # break 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:39.707 1+0 records in 00:26:39.707 1+0 records out 00:26:39.707 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000228289 s, 17.9 MB/s 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # size=4096 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # return 0 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 1 )) 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@628 -- # '[' raid1 = raid5f ']' 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@632 -- # write_unit_size=1 00:26:39.707 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@634 -- # dd if=/dev/urandom of=/dev/nbd0 bs=4096 count=7936 oflag=direct 00:26:40.643 7936+0 records in 00:26:40.643 7936+0 records out 00:26:40.643 32505856 bytes (33 MB, 31 MiB) copied, 0.761915 s, 42.7 MB/s 00:26:40.643 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@635 -- # nbd_stop_disks /var/tmp/spdk-raid.sock /dev/nbd0 00:26:40.643 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:40.643 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:26:40.643 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:40.643 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:26:40.643 16:05:45 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:40.643 16:05:45 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:40.902 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:40.902 [2024-06-10 16:05:46.167672] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:40.902 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:40.902 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:40.902 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:40.902 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:40.902 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:40.902 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:26:40.902 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:26:40.902 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:26:40.902 [2024-06-10 16:05:46.404357] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:26:41.161 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:41.161 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:41.161 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:41.161 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:41.161 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:41.161 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:41.161 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:41.161 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:41.161 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:41.161 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:41.161 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:41.161 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:41.420 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:41.420 "name": "raid_bdev1", 00:26:41.420 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:26:41.420 "strip_size_kb": 0, 00:26:41.420 "state": "online", 00:26:41.420 "raid_level": "raid1", 00:26:41.420 "superblock": true, 00:26:41.420 "num_base_bdevs": 2, 00:26:41.420 "num_base_bdevs_discovered": 1, 00:26:41.420 "num_base_bdevs_operational": 1, 00:26:41.420 "base_bdevs_list": [ 00:26:41.420 { 00:26:41.420 "name": null, 00:26:41.420 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:41.420 "is_configured": false, 00:26:41.420 "data_offset": 256, 00:26:41.420 "data_size": 7936 00:26:41.420 }, 00:26:41.420 { 00:26:41.420 "name": "BaseBdev2", 
00:26:41.420 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:26:41.420 "is_configured": true, 00:26:41.420 "data_offset": 256, 00:26:41.420 "data_size": 7936 00:26:41.420 } 00:26:41.420 ] 00:26:41.420 }' 00:26:41.420 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:41.420 16:05:46 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:41.986 16:05:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:42.243 [2024-06-10 16:05:47.535402] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:42.243 [2024-06-10 16:05:47.537640] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27c6d10 00:26:42.243 [2024-06-10 16:05:47.539693] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:42.243 16:05:47 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@646 -- # sleep 1 00:26:43.179 16:05:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:43.179 16:05:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:43.179 16:05:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:43.179 16:05:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:43.179 16:05:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:43.179 16:05:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:43.179 16:05:48 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:43.438 16:05:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:43.438 "name": "raid_bdev1", 00:26:43.438 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:26:43.438 "strip_size_kb": 0, 00:26:43.438 "state": "online", 00:26:43.438 "raid_level": "raid1", 00:26:43.438 "superblock": true, 00:26:43.438 "num_base_bdevs": 2, 00:26:43.438 "num_base_bdevs_discovered": 2, 00:26:43.438 "num_base_bdevs_operational": 2, 00:26:43.438 "process": { 00:26:43.438 "type": "rebuild", 00:26:43.438 "target": "spare", 00:26:43.438 "progress": { 00:26:43.438 "blocks": 3072, 00:26:43.438 "percent": 38 00:26:43.438 } 00:26:43.438 }, 00:26:43.438 "base_bdevs_list": [ 00:26:43.438 { 00:26:43.438 "name": "spare", 00:26:43.438 "uuid": "5847e87d-b6b3-5642-8c28-3af15d4a77bf", 00:26:43.438 "is_configured": true, 00:26:43.438 "data_offset": 256, 00:26:43.438 "data_size": 7936 00:26:43.438 }, 00:26:43.438 { 00:26:43.438 "name": "BaseBdev2", 00:26:43.438 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:26:43.438 "is_configured": true, 00:26:43.438 "data_offset": 256, 00:26:43.438 "data_size": 7936 00:26:43.438 } 00:26:43.438 ] 00:26:43.438 }' 00:26:43.438 16:05:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:43.438 16:05:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:43.438 16:05:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:43.438 16:05:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:43.438 16:05:48 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_raid_remove_base_bdev spare 00:26:43.698 [2024-06-10 16:05:49.069628] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:43.698 [2024-06-10 16:05:49.152164] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:43.698 [2024-06-10 16:05:49.152209] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:43.698 [2024-06-10 16:05:49.152223] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:43.698 [2024-06-10 16:05:49.152229] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:26:43.698 16:05:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:43.698 16:05:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:43.698 16:05:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:43.698 16:05:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:43.698 16:05:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:43.698 16:05:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:43.698 16:05:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:43.698 16:05:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:43.698 16:05:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:43.698 16:05:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:43.698 16:05:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:43.698 16:05:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:44.013 16:05:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:44.013 "name": "raid_bdev1", 00:26:44.013 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:26:44.013 "strip_size_kb": 0, 00:26:44.013 "state": "online", 00:26:44.013 "raid_level": "raid1", 00:26:44.013 "superblock": true, 00:26:44.013 "num_base_bdevs": 2, 00:26:44.013 "num_base_bdevs_discovered": 1, 00:26:44.013 "num_base_bdevs_operational": 1, 00:26:44.013 "base_bdevs_list": [ 00:26:44.013 { 00:26:44.013 "name": null, 00:26:44.013 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:44.013 "is_configured": false, 00:26:44.013 "data_offset": 256, 00:26:44.013 "data_size": 7936 00:26:44.013 }, 00:26:44.013 { 00:26:44.013 "name": "BaseBdev2", 00:26:44.013 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:26:44.013 "is_configured": true, 00:26:44.013 "data_offset": 256, 00:26:44.013 "data_size": 7936 00:26:44.013 } 00:26:44.013 ] 00:26:44.013 }' 00:26:44.013 16:05:49 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:44.013 16:05:49 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:44.579 16:05:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:44.579 16:05:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:44.579 16:05:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:44.579 16:05:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:44.579 16:05:50 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:44.580 16:05:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:44.580 16:05:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:44.838 16:05:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:44.838 "name": "raid_bdev1", 00:26:44.838 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:26:44.838 "strip_size_kb": 0, 00:26:44.838 "state": "online", 00:26:44.838 "raid_level": "raid1", 00:26:44.838 "superblock": true, 00:26:44.838 "num_base_bdevs": 2, 00:26:44.838 "num_base_bdevs_discovered": 1, 00:26:44.838 "num_base_bdevs_operational": 1, 00:26:44.838 "base_bdevs_list": [ 00:26:44.838 { 00:26:44.838 "name": null, 00:26:44.839 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:44.839 "is_configured": false, 00:26:44.839 "data_offset": 256, 00:26:44.839 "data_size": 7936 00:26:44.839 }, 00:26:44.839 { 00:26:44.839 "name": "BaseBdev2", 00:26:44.839 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:26:44.839 "is_configured": true, 00:26:44.839 "data_offset": 256, 00:26:44.839 "data_size": 7936 00:26:44.839 } 00:26:44.839 ] 00:26:44.839 }' 00:26:44.839 16:05:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:45.097 16:05:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:45.097 16:05:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:45.097 16:05:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:45.097 16:05:50 bdev_raid.raid_rebuild_test_sb_md_separate -- 
bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:45.355 [2024-06-10 16:05:50.639243] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:45.355 [2024-06-10 16:05:50.641499] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27c9310 00:26:45.355 [2024-06-10 16:05:50.643017] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:45.355 16:05:50 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@662 -- # sleep 1 00:26:46.289 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:46.289 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:46.289 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:46.289 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:46.289 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:46.289 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:46.289 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:46.548 "name": "raid_bdev1", 00:26:46.548 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:26:46.548 "strip_size_kb": 0, 00:26:46.548 "state": "online", 00:26:46.548 "raid_level": "raid1", 00:26:46.548 "superblock": true, 00:26:46.548 "num_base_bdevs": 2, 
00:26:46.548 "num_base_bdevs_discovered": 2, 00:26:46.548 "num_base_bdevs_operational": 2, 00:26:46.548 "process": { 00:26:46.548 "type": "rebuild", 00:26:46.548 "target": "spare", 00:26:46.548 "progress": { 00:26:46.548 "blocks": 3072, 00:26:46.548 "percent": 38 00:26:46.548 } 00:26:46.548 }, 00:26:46.548 "base_bdevs_list": [ 00:26:46.548 { 00:26:46.548 "name": "spare", 00:26:46.548 "uuid": "5847e87d-b6b3-5642-8c28-3af15d4a77bf", 00:26:46.548 "is_configured": true, 00:26:46.548 "data_offset": 256, 00:26:46.548 "data_size": 7936 00:26:46.548 }, 00:26:46.548 { 00:26:46.548 "name": "BaseBdev2", 00:26:46.548 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:26:46.548 "is_configured": true, 00:26:46.548 "data_offset": 256, 00:26:46.548 "data_size": 7936 00:26:46.548 } 00:26:46.548 ] 00:26:46.548 }' 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:26:46.548 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@692 
-- # '[' 2 -gt 2 ']' 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@705 -- # local timeout=1073 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:46.548 16:05:51 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:46.805 16:05:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:46.805 "name": "raid_bdev1", 00:26:46.805 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:26:46.805 "strip_size_kb": 0, 00:26:46.805 "state": "online", 00:26:46.805 "raid_level": "raid1", 00:26:46.805 "superblock": true, 00:26:46.805 "num_base_bdevs": 2, 00:26:46.805 "num_base_bdevs_discovered": 2, 00:26:46.805 "num_base_bdevs_operational": 2, 00:26:46.805 "process": { 00:26:46.805 "type": "rebuild", 00:26:46.805 "target": "spare", 00:26:46.805 "progress": { 00:26:46.805 "blocks": 3840, 00:26:46.805 "percent": 48 00:26:46.805 } 00:26:46.805 }, 00:26:46.805 "base_bdevs_list": [ 00:26:46.805 { 00:26:46.805 "name": "spare", 00:26:46.805 "uuid": 
"5847e87d-b6b3-5642-8c28-3af15d4a77bf", 00:26:46.805 "is_configured": true, 00:26:46.805 "data_offset": 256, 00:26:46.805 "data_size": 7936 00:26:46.805 }, 00:26:46.805 { 00:26:46.805 "name": "BaseBdev2", 00:26:46.805 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:26:46.805 "is_configured": true, 00:26:46.805 "data_offset": 256, 00:26:46.805 "data_size": 7936 00:26:46.805 } 00:26:46.805 ] 00:26:46.805 }' 00:26:46.805 16:05:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:46.805 16:05:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:46.805 16:05:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:47.063 16:05:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:47.063 16:05:52 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:47.998 16:05:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:47.998 16:05:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:47.998 16:05:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:47.998 16:05:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:47.998 16:05:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:47.998 16:05:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:47.998 16:05:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:47.998 16:05:53 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:48.257 16:05:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:48.257 "name": "raid_bdev1", 00:26:48.257 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:26:48.257 "strip_size_kb": 0, 00:26:48.257 "state": "online", 00:26:48.257 "raid_level": "raid1", 00:26:48.257 "superblock": true, 00:26:48.257 "num_base_bdevs": 2, 00:26:48.257 "num_base_bdevs_discovered": 2, 00:26:48.257 "num_base_bdevs_operational": 2, 00:26:48.257 "process": { 00:26:48.257 "type": "rebuild", 00:26:48.257 "target": "spare", 00:26:48.257 "progress": { 00:26:48.257 "blocks": 7168, 00:26:48.257 "percent": 90 00:26:48.257 } 00:26:48.257 }, 00:26:48.257 "base_bdevs_list": [ 00:26:48.257 { 00:26:48.257 "name": "spare", 00:26:48.257 "uuid": "5847e87d-b6b3-5642-8c28-3af15d4a77bf", 00:26:48.257 "is_configured": true, 00:26:48.257 "data_offset": 256, 00:26:48.257 "data_size": 7936 00:26:48.257 }, 00:26:48.257 { 00:26:48.257 "name": "BaseBdev2", 00:26:48.257 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:26:48.257 "is_configured": true, 00:26:48.257 "data_offset": 256, 00:26:48.257 "data_size": 7936 00:26:48.257 } 00:26:48.257 ] 00:26:48.257 }' 00:26:48.257 16:05:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:48.257 16:05:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:48.257 16:05:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:48.257 16:05:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:48.257 16:05:53 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@710 -- # sleep 1 00:26:48.516 [2024-06-10 16:05:53.766878] bdev_raid.c:2789:raid_bdev_process_thread_run: 
*DEBUG*: process completed on raid_bdev1 00:26:48.516 [2024-06-10 16:05:53.766935] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:26:48.516 [2024-06-10 16:05:53.767024] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:49.452 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:26:49.452 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:49.452 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:49.452 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:49.452 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:49.452 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:49.452 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:49.452 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:49.452 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:49.452 "name": "raid_bdev1", 00:26:49.452 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:26:49.452 "strip_size_kb": 0, 00:26:49.452 "state": "online", 00:26:49.452 "raid_level": "raid1", 00:26:49.452 "superblock": true, 00:26:49.452 "num_base_bdevs": 2, 00:26:49.452 "num_base_bdevs_discovered": 2, 00:26:49.452 "num_base_bdevs_operational": 2, 00:26:49.452 "base_bdevs_list": [ 00:26:49.452 { 00:26:49.452 "name": "spare", 00:26:49.452 "uuid": "5847e87d-b6b3-5642-8c28-3af15d4a77bf", 
00:26:49.452 "is_configured": true, 00:26:49.452 "data_offset": 256, 00:26:49.452 "data_size": 7936 00:26:49.452 }, 00:26:49.452 { 00:26:49.452 "name": "BaseBdev2", 00:26:49.452 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:26:49.452 "is_configured": true, 00:26:49.452 "data_offset": 256, 00:26:49.452 "data_size": 7936 00:26:49.452 } 00:26:49.452 ] 00:26:49.452 }' 00:26:49.452 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:49.452 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \r\e\b\u\i\l\d ]] 00:26:49.452 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:49.711 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:26:49.711 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@708 -- # break 00:26:49.711 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:49.711 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:49.711 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:49.711 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:49.711 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:49.711 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:49.711 16:05:54 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:49.969 16:05:55 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:49.969 "name": "raid_bdev1", 00:26:49.969 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:26:49.969 "strip_size_kb": 0, 00:26:49.969 "state": "online", 00:26:49.969 "raid_level": "raid1", 00:26:49.969 "superblock": true, 00:26:49.969 "num_base_bdevs": 2, 00:26:49.969 "num_base_bdevs_discovered": 2, 00:26:49.969 "num_base_bdevs_operational": 2, 00:26:49.969 "base_bdevs_list": [ 00:26:49.969 { 00:26:49.969 "name": "spare", 00:26:49.969 "uuid": "5847e87d-b6b3-5642-8c28-3af15d4a77bf", 00:26:49.969 "is_configured": true, 00:26:49.969 "data_offset": 256, 00:26:49.969 "data_size": 7936 00:26:49.969 }, 00:26:49.969 { 00:26:49.969 "name": "BaseBdev2", 00:26:49.969 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:26:49.969 "is_configured": true, 00:26:49.969 "data_offset": 256, 00:26:49.969 "data_size": 7936 00:26:49.969 } 00:26:49.969 ] 00:26:49.969 }' 00:26:49.969 16:05:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:49.969 16:05:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:49.969 16:05:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:49.969 16:05:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:49.969 16:05:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:49.969 16:05:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:49.969 16:05:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:49.969 16:05:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:49.969 16:05:55 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:49.969 16:05:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:26:49.969 16:05:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:49.969 16:05:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:49.969 16:05:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:49.969 16:05:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:49.969 16:05:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:49.969 16:05:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:50.228 16:05:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:50.228 "name": "raid_bdev1", 00:26:50.228 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:26:50.228 "strip_size_kb": 0, 00:26:50.228 "state": "online", 00:26:50.228 "raid_level": "raid1", 00:26:50.228 "superblock": true, 00:26:50.228 "num_base_bdevs": 2, 00:26:50.228 "num_base_bdevs_discovered": 2, 00:26:50.228 "num_base_bdevs_operational": 2, 00:26:50.228 "base_bdevs_list": [ 00:26:50.228 { 00:26:50.228 "name": "spare", 00:26:50.228 "uuid": "5847e87d-b6b3-5642-8c28-3af15d4a77bf", 00:26:50.228 "is_configured": true, 00:26:50.228 "data_offset": 256, 00:26:50.228 "data_size": 7936 00:26:50.228 }, 00:26:50.228 { 00:26:50.228 "name": "BaseBdev2", 00:26:50.228 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:26:50.228 "is_configured": true, 00:26:50.228 "data_offset": 256, 00:26:50.228 "data_size": 7936 00:26:50.228 } 00:26:50.228 ] 
00:26:50.228 }' 00:26:50.228 16:05:55 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:50.228 16:05:55 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:50.795 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:26:51.054 [2024-06-10 16:05:56.349404] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:26:51.054 [2024-06-10 16:05:56.349427] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:26:51.054 [2024-06-10 16:05:56.349482] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:26:51.054 [2024-06-10 16:05:56.349542] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:26:51.054 [2024-06-10 16:05:56.349551] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x27c5ea0 name raid_bdev1, state offline 00:26:51.054 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:51.054 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # jq length 00:26:51.313 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:26:51.313 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@721 -- # '[' true = true ']' 00:26:51.313 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@722 -- # '[' false = true ']' 00:26:51.313 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@736 -- # nbd_start_disks /var/tmp/spdk-raid.sock 'BaseBdev1 spare' '/dev/nbd0 /dev/nbd1' 00:26:51.313 16:05:56 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:51.313 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # bdev_list=('BaseBdev1' 'spare') 00:26:51.313 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@10 -- # local bdev_list 00:26:51.313 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:51.313 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@11 -- # local nbd_list 00:26:51.313 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@12 -- # local i 00:26:51.313 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:26:51.313 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:51.313 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk BaseBdev1 /dev/nbd0 00:26:51.571 /dev/nbd0 00:26:51.571 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:26:51.571 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:26:51.571 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:26:51.571 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local i 00:26:51.571 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:51.571 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:26:51.571 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:26:51.571 16:05:56 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # break 00:26:51.571 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:26:51.571 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:51.571 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:51.571 1+0 records in 00:26:51.571 1+0 records out 00:26:51.571 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000239918 s, 17.1 MB/s 00:26:51.571 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:51.571 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # size=4096 00:26:51.571 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:51.571 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:26:51.571 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # return 0 00:26:51.571 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:51.571 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:51.571 16:05:56 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_start_disk spare /dev/nbd1 00:26:51.830 /dev/nbd1 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:26:51.830 16:05:57 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@868 -- # local i 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@872 -- # break 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:26:51.830 1+0 records in 00:26:51.830 1+0 records out 00:26:51.830 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000250782 s, 16.3 MB/s 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@885 -- # size=4096 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:26:51.830 16:05:57 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@888 -- # return 0 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@737 -- # cmp -i 1048576 /dev/nbd0 /dev/nbd1 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@738 -- # nbd_stop_disks /var/tmp/spdk-raid.sock '/dev/nbd0 /dev/nbd1' 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-raid.sock 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@50 -- # local nbd_list 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@51 -- # local i 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:51.830 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd0 00:26:52.088 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:26:52.088 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:26:52.088 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:26:52.088 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:52.088 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:52.088 16:05:57 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:26:52.088 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:26:52.088 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:26:52.088 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:26:52.088 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock nbd_stop_disk /dev/nbd1 00:26:52.346 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:26:52.346 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:26:52.346 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:26:52.346 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:26:52.347 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:26:52.347 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:26:52.347 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@41 -- # break 00:26:52.347 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/nbd_common.sh@45 -- # return 0 00:26:52.347 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:26:52.347 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:52.605 16:05:57 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@745 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:52.864 [2024-06-10 16:05:58.208437] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:52.864 [2024-06-10 16:05:58.208478] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:52.864 [2024-06-10 16:05:58.208501] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x26804c0 00:26:52.864 [2024-06-10 16:05:58.208511] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:52.864 [2024-06-10 16:05:58.210147] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:52.864 [2024-06-10 16:05:58.210174] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:52.864 [2024-06-10 16:05:58.210228] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:52.864 [2024-06-10 16:05:58.210251] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:52.864 [2024-06-10 16:05:58.210346] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:26:52.864 spare 00:26:52.864 16:05:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:26:52.864 16:05:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:52.864 16:05:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:52.864 16:05:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:52.864 16:05:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:52.864 16:05:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:26:52.864 16:05:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:52.864 16:05:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:52.864 16:05:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:52.864 16:05:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:52.865 16:05:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:52.865 16:05:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:52.865 [2024-06-10 16:05:58.310653] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x281b2b0 00:26:52.865 [2024-06-10 16:05:58.310665] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4096 00:26:52.865 [2024-06-10 16:05:58.310726] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x281aeb0 00:26:52.865 [2024-06-10 16:05:58.310843] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x281b2b0 00:26:52.865 [2024-06-10 16:05:58.310851] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x281b2b0 00:26:52.865 [2024-06-10 16:05:58.310929] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:53.124 16:05:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:53.124 "name": "raid_bdev1", 00:26:53.124 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:26:53.124 "strip_size_kb": 0, 00:26:53.124 "state": "online", 00:26:53.124 "raid_level": "raid1", 00:26:53.124 "superblock": true, 00:26:53.124 "num_base_bdevs": 2, 00:26:53.124 
"num_base_bdevs_discovered": 2, 00:26:53.124 "num_base_bdevs_operational": 2, 00:26:53.124 "base_bdevs_list": [ 00:26:53.124 { 00:26:53.124 "name": "spare", 00:26:53.124 "uuid": "5847e87d-b6b3-5642-8c28-3af15d4a77bf", 00:26:53.124 "is_configured": true, 00:26:53.124 "data_offset": 256, 00:26:53.124 "data_size": 7936 00:26:53.124 }, 00:26:53.124 { 00:26:53.124 "name": "BaseBdev2", 00:26:53.124 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:26:53.124 "is_configured": true, 00:26:53.124 "data_offset": 256, 00:26:53.124 "data_size": 7936 00:26:53.124 } 00:26:53.124 ] 00:26:53.124 }' 00:26:53.124 16:05:58 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:53.124 16:05:58 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:53.692 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:26:53.692 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:53.692 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:26:53.692 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:26:53.692 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:53.692 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:53.692 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:53.953 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:53.953 "name": "raid_bdev1", 00:26:53.953 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:26:53.953 
"strip_size_kb": 0, 00:26:53.953 "state": "online", 00:26:53.953 "raid_level": "raid1", 00:26:53.953 "superblock": true, 00:26:53.953 "num_base_bdevs": 2, 00:26:53.953 "num_base_bdevs_discovered": 2, 00:26:53.953 "num_base_bdevs_operational": 2, 00:26:53.953 "base_bdevs_list": [ 00:26:53.953 { 00:26:53.953 "name": "spare", 00:26:53.953 "uuid": "5847e87d-b6b3-5642-8c28-3af15d4a77bf", 00:26:53.953 "is_configured": true, 00:26:53.953 "data_offset": 256, 00:26:53.953 "data_size": 7936 00:26:53.953 }, 00:26:53.953 { 00:26:53.953 "name": "BaseBdev2", 00:26:53.953 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:26:53.953 "is_configured": true, 00:26:53.953 "data_offset": 256, 00:26:53.953 "data_size": 7936 00:26:53.953 } 00:26:53.953 ] 00:26:53.953 }' 00:26:53.953 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:53.953 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:26:53.953 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:54.212 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:26:54.212 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:54.212 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:26:54.470 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:26:54.471 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:26:54.471 [2024-06-10 16:05:59.961318] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:54.471 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:54.471 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:54.471 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:54.729 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:54.729 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:54.729 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:54.729 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:54.729 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:54.729 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:54.729 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:54.729 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:54.729 16:05:59 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:54.988 16:06:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:54.988 "name": "raid_bdev1", 00:26:54.988 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:26:54.988 "strip_size_kb": 0, 00:26:54.988 "state": "online", 00:26:54.988 "raid_level": "raid1", 00:26:54.988 "superblock": true, 00:26:54.988 
"num_base_bdevs": 2, 00:26:54.988 "num_base_bdevs_discovered": 1, 00:26:54.988 "num_base_bdevs_operational": 1, 00:26:54.988 "base_bdevs_list": [ 00:26:54.988 { 00:26:54.988 "name": null, 00:26:54.988 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:54.988 "is_configured": false, 00:26:54.988 "data_offset": 256, 00:26:54.988 "data_size": 7936 00:26:54.988 }, 00:26:54.988 { 00:26:54.988 "name": "BaseBdev2", 00:26:54.988 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:26:54.988 "is_configured": true, 00:26:54.988 "data_offset": 256, 00:26:54.988 "data_size": 7936 00:26:54.988 } 00:26:54.988 ] 00:26:54.988 }' 00:26:54.988 16:06:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:54.988 16:06:00 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:55.556 16:06:00 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:26:55.815 [2024-06-10 16:06:01.108393] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:55.815 [2024-06-10 16:06:01.108541] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:26:55.815 [2024-06-10 16:06:01.108556] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:26:55.815 [2024-06-10 16:06:01.108582] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:55.815 [2024-06-10 16:06:01.110677] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27c9310 00:26:55.815 [2024-06-10 16:06:01.112638] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:55.815 16:06:01 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@755 -- # sleep 1 00:26:56.751 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:56.751 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:56.751 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:56.751 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:56.751 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:56.751 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:56.751 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:57.013 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:57.013 "name": "raid_bdev1", 00:26:57.013 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:26:57.013 "strip_size_kb": 0, 00:26:57.013 "state": "online", 00:26:57.013 "raid_level": "raid1", 00:26:57.013 "superblock": true, 00:26:57.013 "num_base_bdevs": 2, 00:26:57.013 "num_base_bdevs_discovered": 2, 00:26:57.013 "num_base_bdevs_operational": 2, 00:26:57.013 "process": { 00:26:57.013 "type": "rebuild", 00:26:57.013 
"target": "spare", 00:26:57.013 "progress": { 00:26:57.013 "blocks": 3072, 00:26:57.013 "percent": 38 00:26:57.013 } 00:26:57.013 }, 00:26:57.013 "base_bdevs_list": [ 00:26:57.013 { 00:26:57.013 "name": "spare", 00:26:57.013 "uuid": "5847e87d-b6b3-5642-8c28-3af15d4a77bf", 00:26:57.013 "is_configured": true, 00:26:57.013 "data_offset": 256, 00:26:57.013 "data_size": 7936 00:26:57.013 }, 00:26:57.013 { 00:26:57.013 "name": "BaseBdev2", 00:26:57.013 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:26:57.013 "is_configured": true, 00:26:57.013 "data_offset": 256, 00:26:57.013 "data_size": 7936 00:26:57.013 } 00:26:57.013 ] 00:26:57.013 }' 00:26:57.013 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:57.013 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:57.013 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:57.013 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:57.013 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:57.272 [2024-06-10 16:06:02.710279] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:57.272 [2024-06-10 16:06:02.725053] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:26:57.272 [2024-06-10 16:06:02.725095] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:26:57.272 [2024-06-10 16:06:02.725108] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:26:57.272 [2024-06-10 16:06:02.725114] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 
00:26:57.272 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:26:57.272 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:26:57.272 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:26:57.272 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:26:57.272 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:26:57.273 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:26:57.273 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:26:57.273 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:26:57.273 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:26:57.273 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:26:57.273 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:57.273 16:06:02 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:57.531 16:06:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:26:57.531 "name": "raid_bdev1", 00:26:57.532 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:26:57.532 "strip_size_kb": 0, 00:26:57.532 "state": "online", 00:26:57.532 "raid_level": "raid1", 00:26:57.532 "superblock": true, 00:26:57.532 "num_base_bdevs": 2, 00:26:57.532 "num_base_bdevs_discovered": 1, 
00:26:57.532 "num_base_bdevs_operational": 1, 00:26:57.532 "base_bdevs_list": [ 00:26:57.532 { 00:26:57.532 "name": null, 00:26:57.532 "uuid": "00000000-0000-0000-0000-000000000000", 00:26:57.532 "is_configured": false, 00:26:57.532 "data_offset": 256, 00:26:57.532 "data_size": 7936 00:26:57.532 }, 00:26:57.532 { 00:26:57.532 "name": "BaseBdev2", 00:26:57.532 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:26:57.532 "is_configured": true, 00:26:57.532 "data_offset": 256, 00:26:57.532 "data_size": 7936 00:26:57.532 } 00:26:57.532 ] 00:26:57.532 }' 00:26:57.532 16:06:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:26:57.532 16:06:03 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:26:58.469 16:06:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:26:58.469 [2024-06-10 16:06:03.863167] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:26:58.469 [2024-06-10 16:06:03.863212] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:26:58.469 [2024-06-10 16:06:03.863230] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x281b630 00:26:58.469 [2024-06-10 16:06:03.863240] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:26:58.469 [2024-06-10 16:06:03.863454] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:26:58.469 [2024-06-10 16:06:03.863469] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:26:58.469 [2024-06-10 16:06:03.863524] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:26:58.469 [2024-06-10 16:06:03.863533] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) 
smaller than existing raid bdev raid_bdev1 (5) 00:26:58.469 [2024-06-10 16:06:03.863547] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:26:58.469 [2024-06-10 16:06:03.863561] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:26:58.469 [2024-06-10 16:06:03.865679] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x27d9d90 00:26:58.469 [2024-06-10 16:06:03.867278] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:26:58.469 spare 00:26:58.469 16:06:03 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@762 -- # sleep 1 00:26:59.458 16:06:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:26:59.458 16:06:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:26:59.458 16:06:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:26:59.458 16:06:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=spare 00:26:59.458 16:06:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:26:59.458 16:06:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:26:59.458 16:06:04 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:26:59.717 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:26:59.717 "name": "raid_bdev1", 00:26:59.717 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:26:59.717 "strip_size_kb": 0, 00:26:59.717 "state": "online", 00:26:59.717 "raid_level": "raid1", 00:26:59.717 "superblock": 
true, 00:26:59.717 "num_base_bdevs": 2, 00:26:59.717 "num_base_bdevs_discovered": 2, 00:26:59.717 "num_base_bdevs_operational": 2, 00:26:59.717 "process": { 00:26:59.717 "type": "rebuild", 00:26:59.717 "target": "spare", 00:26:59.717 "progress": { 00:26:59.717 "blocks": 3072, 00:26:59.717 "percent": 38 00:26:59.717 } 00:26:59.717 }, 00:26:59.717 "base_bdevs_list": [ 00:26:59.717 { 00:26:59.717 "name": "spare", 00:26:59.717 "uuid": "5847e87d-b6b3-5642-8c28-3af15d4a77bf", 00:26:59.717 "is_configured": true, 00:26:59.717 "data_offset": 256, 00:26:59.717 "data_size": 7936 00:26:59.717 }, 00:26:59.717 { 00:26:59.717 "name": "BaseBdev2", 00:26:59.717 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:26:59.717 "is_configured": true, 00:26:59.717 "data_offset": 256, 00:26:59.717 "data_size": 7936 00:26:59.717 } 00:26:59.717 ] 00:26:59.717 }' 00:26:59.717 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:26:59.717 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:26:59.717 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:26:59.976 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:26:59.976 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:26:59.976 [2024-06-10 16:06:05.484477] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:00.234 [2024-06-10 16:06:05.580384] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:00.234 [2024-06-10 16:06:05.580427] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:00.234 [2024-06-10 16:06:05.580441] 
bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:00.234 [2024-06-10 16:06:05.580447] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:00.234 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:00.234 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:00.234 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:00.234 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:00.234 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:00.234 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:00.234 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:00.234 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:00.234 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:00.234 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:00.234 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:00.234 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:00.493 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:00.493 "name": "raid_bdev1", 00:27:00.493 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 
00:27:00.493 "strip_size_kb": 0, 00:27:00.493 "state": "online", 00:27:00.493 "raid_level": "raid1", 00:27:00.493 "superblock": true, 00:27:00.493 "num_base_bdevs": 2, 00:27:00.493 "num_base_bdevs_discovered": 1, 00:27:00.493 "num_base_bdevs_operational": 1, 00:27:00.493 "base_bdevs_list": [ 00:27:00.493 { 00:27:00.493 "name": null, 00:27:00.493 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:00.493 "is_configured": false, 00:27:00.493 "data_offset": 256, 00:27:00.493 "data_size": 7936 00:27:00.493 }, 00:27:00.493 { 00:27:00.493 "name": "BaseBdev2", 00:27:00.493 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:27:00.493 "is_configured": true, 00:27:00.493 "data_offset": 256, 00:27:00.493 "data_size": 7936 00:27:00.493 } 00:27:00.493 ] 00:27:00.493 }' 00:27:00.493 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:00.493 16:06:05 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:01.060 16:06:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:01.060 16:06:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:01.060 16:06:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:01.060 16:06:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:01.060 16:06:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:01.060 16:06:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:01.060 16:06:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:01.320 16:06:06 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:01.320 "name": "raid_bdev1", 00:27:01.320 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:27:01.320 "strip_size_kb": 0, 00:27:01.320 "state": "online", 00:27:01.320 "raid_level": "raid1", 00:27:01.320 "superblock": true, 00:27:01.320 "num_base_bdevs": 2, 00:27:01.320 "num_base_bdevs_discovered": 1, 00:27:01.320 "num_base_bdevs_operational": 1, 00:27:01.320 "base_bdevs_list": [ 00:27:01.320 { 00:27:01.320 "name": null, 00:27:01.320 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:01.320 "is_configured": false, 00:27:01.320 "data_offset": 256, 00:27:01.320 "data_size": 7936 00:27:01.320 }, 00:27:01.320 { 00:27:01.320 "name": "BaseBdev2", 00:27:01.320 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:27:01.320 "is_configured": true, 00:27:01.320 "data_offset": 256, 00:27:01.320 "data_size": 7936 00:27:01.320 } 00:27:01.320 ] 00:27:01.320 }' 00:27:01.320 16:06:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:01.320 16:06:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:01.320 16:06:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:01.320 16:06:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:01.320 16:06:06 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:01.579 16:06:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:01.837 [2024-06-10 16:06:07.307933] vbdev_passthru.c: 607:vbdev_passthru_register: 
*NOTICE*: Match on BaseBdev1_malloc 00:27:01.837 [2024-06-10 16:06:07.307984] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:01.837 [2024-06-10 16:06:07.308003] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x2681240 00:27:01.837 [2024-06-10 16:06:07.308012] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:01.837 [2024-06-10 16:06:07.308201] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:01.838 [2024-06-10 16:06:07.308216] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:01.838 [2024-06-10 16:06:07.308256] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:01.838 [2024-06-10 16:06:07.308265] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:01.838 [2024-06-10 16:06:07.308273] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:01.838 BaseBdev1 00:27:01.838 16:06:07 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@773 -- # sleep 1 00:27:03.215 16:06:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:03.215 16:06:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:03.215 16:06:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:03.215 16:06:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:03.215 16:06:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:03.215 16:06:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:03.215 
16:06:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:03.215 16:06:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:03.215 16:06:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:03.215 16:06:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:03.215 16:06:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:03.215 16:06:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:03.215 16:06:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:03.215 "name": "raid_bdev1", 00:27:03.215 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:27:03.215 "strip_size_kb": 0, 00:27:03.215 "state": "online", 00:27:03.215 "raid_level": "raid1", 00:27:03.215 "superblock": true, 00:27:03.215 "num_base_bdevs": 2, 00:27:03.215 "num_base_bdevs_discovered": 1, 00:27:03.215 "num_base_bdevs_operational": 1, 00:27:03.215 "base_bdevs_list": [ 00:27:03.215 { 00:27:03.215 "name": null, 00:27:03.215 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:03.215 "is_configured": false, 00:27:03.215 "data_offset": 256, 00:27:03.215 "data_size": 7936 00:27:03.215 }, 00:27:03.215 { 00:27:03.215 "name": "BaseBdev2", 00:27:03.215 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:27:03.215 "is_configured": true, 00:27:03.215 "data_offset": 256, 00:27:03.215 "data_size": 7936 00:27:03.215 } 00:27:03.215 ] 00:27:03.215 }' 00:27:03.215 16:06:08 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:03.215 16:06:08 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 
00:27:03.783 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:03.783 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:03.783 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:03.783 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:03.783 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:03.783 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:03.783 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:04.042 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:04.042 "name": "raid_bdev1", 00:27:04.042 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:27:04.042 "strip_size_kb": 0, 00:27:04.042 "state": "online", 00:27:04.042 "raid_level": "raid1", 00:27:04.042 "superblock": true, 00:27:04.042 "num_base_bdevs": 2, 00:27:04.042 "num_base_bdevs_discovered": 1, 00:27:04.042 "num_base_bdevs_operational": 1, 00:27:04.042 "base_bdevs_list": [ 00:27:04.042 { 00:27:04.042 "name": null, 00:27:04.042 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:04.042 "is_configured": false, 00:27:04.042 "data_offset": 256, 00:27:04.042 "data_size": 7936 00:27:04.042 }, 00:27:04.042 { 00:27:04.042 "name": "BaseBdev2", 00:27:04.042 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:27:04.042 "is_configured": true, 00:27:04.042 "data_offset": 256, 00:27:04.042 "data_size": 7936 00:27:04.042 } 00:27:04.042 ] 00:27:04.043 }' 00:27:04.043 16:06:09 
bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:04.043 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:04.043 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:04.043 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:04.043 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:04.043 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@649 -- # local es=0 00:27:04.043 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:04.043 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:04.043 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:27:04.043 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:04.043 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:27:04.043 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:04.043 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 
00:27:04.043 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:04.043 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:04.043 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:27:04.302 [2024-06-10 16:06:09.774589] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:04.302 [2024-06-10 16:06:09.774711] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:04.302 [2024-06-10 16:06:09.774724] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:04.302 request: 00:27:04.302 { 00:27:04.302 "raid_bdev": "raid_bdev1", 00:27:04.302 "base_bdev": "BaseBdev1", 00:27:04.302 "method": "bdev_raid_add_base_bdev", 00:27:04.302 "req_id": 1 00:27:04.302 } 00:27:04.302 Got JSON-RPC error response 00:27:04.302 response: 00:27:04.302 { 00:27:04.302 "code": -22, 00:27:04.302 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:27:04.302 } 00:27:04.302 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@652 -- # es=1 00:27:04.302 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:27:04.302 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:27:04.302 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:27:04.302 16:06:09 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@777 -- # 
sleep 1 00:27:05.680 16:06:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:05.680 16:06:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:05.680 16:06:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:05.680 16:06:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:05.680 16:06:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:05.680 16:06:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:05.680 16:06:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:05.680 16:06:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:05.680 16:06:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:05.680 16:06:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:05.680 16:06:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:05.680 16:06:10 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:05.680 16:06:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:05.680 "name": "raid_bdev1", 00:27:05.680 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:27:05.680 "strip_size_kb": 0, 00:27:05.680 "state": "online", 00:27:05.680 "raid_level": "raid1", 00:27:05.680 "superblock": true, 00:27:05.680 "num_base_bdevs": 2, 00:27:05.680 "num_base_bdevs_discovered": 1, 
00:27:05.680 "num_base_bdevs_operational": 1, 00:27:05.680 "base_bdevs_list": [ 00:27:05.680 { 00:27:05.680 "name": null, 00:27:05.680 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:05.680 "is_configured": false, 00:27:05.680 "data_offset": 256, 00:27:05.680 "data_size": 7936 00:27:05.680 }, 00:27:05.680 { 00:27:05.680 "name": "BaseBdev2", 00:27:05.680 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:27:05.680 "is_configured": true, 00:27:05.680 "data_offset": 256, 00:27:05.680 "data_size": 7936 00:27:05.680 } 00:27:05.680 ] 00:27:05.680 }' 00:27:05.680 16:06:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:05.680 16:06:11 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:06.248 16:06:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:06.248 16:06:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:06.248 16:06:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:06.248 16:06:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:06.248 16:06:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:06.248 16:06:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:06.248 16:06:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:06.507 16:06:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:06.507 "name": "raid_bdev1", 00:27:06.507 "uuid": "8d305a6d-6289-4afa-ab37-a4c3acfa27cb", 00:27:06.507 "strip_size_kb": 0, 00:27:06.507 
"state": "online", 00:27:06.507 "raid_level": "raid1", 00:27:06.507 "superblock": true, 00:27:06.507 "num_base_bdevs": 2, 00:27:06.507 "num_base_bdevs_discovered": 1, 00:27:06.507 "num_base_bdevs_operational": 1, 00:27:06.507 "base_bdevs_list": [ 00:27:06.507 { 00:27:06.507 "name": null, 00:27:06.507 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:06.507 "is_configured": false, 00:27:06.507 "data_offset": 256, 00:27:06.507 "data_size": 7936 00:27:06.507 }, 00:27:06.507 { 00:27:06.507 "name": "BaseBdev2", 00:27:06.507 "uuid": "212901be-3dcb-54e6-a27e-104ff1ef0b89", 00:27:06.507 "is_configured": true, 00:27:06.507 "data_offset": 256, 00:27:06.507 "data_size": 7936 00:27:06.507 } 00:27:06.507 ] 00:27:06.507 }' 00:27:06.507 16:06:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:06.507 16:06:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:06.507 16:06:11 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:06.765 16:06:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:06.765 16:06:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@782 -- # killprocess 2817278 00:27:06.765 16:06:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@949 -- # '[' -z 2817278 ']' 00:27:06.765 16:06:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@953 -- # kill -0 2817278 00:27:06.765 16:06:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # uname 00:27:06.765 16:06:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:06.765 16:06:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2817278 00:27:06.765 16:06:12 
bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:27:06.765 16:06:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:27:06.765 16:06:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2817278' 00:27:06.765 killing process with pid 2817278 00:27:06.765 16:06:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@968 -- # kill 2817278 00:27:06.765 Received shutdown signal, test time was about 60.000000 seconds 00:27:06.765 00:27:06.765 Latency(us) 00:27:06.765 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:27:06.765 =================================================================================================================== 00:27:06.765 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:27:06.765 [2024-06-10 16:06:12.073937] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:06.765 [2024-06-10 16:06:12.074028] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:06.765 [2024-06-10 16:06:12.074076] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:06.765 [2024-06-10 16:06:12.074086] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x281b2b0 name raid_bdev1, state offline 00:27:06.765 16:06:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@973 -- # wait 2817278 00:27:06.765 [2024-06-10 16:06:12.104020] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:07.024 16:06:12 bdev_raid.raid_rebuild_test_sb_md_separate -- bdev/bdev_raid.sh@784 -- # return 0 00:27:07.024 00:27:07.024 real 0m31.946s 00:27:07.024 user 0m51.038s 00:27:07.024 sys 0m4.148s 00:27:07.024 16:06:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@1125 -- # xtrace_disable 
00:27:07.024 16:06:12 bdev_raid.raid_rebuild_test_sb_md_separate -- common/autotest_common.sh@10 -- # set +x 00:27:07.024 ************************************ 00:27:07.024 END TEST raid_rebuild_test_sb_md_separate 00:27:07.024 ************************************ 00:27:07.024 16:06:12 bdev_raid -- bdev/bdev_raid.sh@911 -- # base_malloc_params='-m 32 -i' 00:27:07.024 16:06:12 bdev_raid -- bdev/bdev_raid.sh@912 -- # run_test raid_state_function_test_sb_md_interleaved raid_state_function_test raid1 2 true 00:27:07.024 16:06:12 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:27:07.024 16:06:12 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:07.024 16:06:12 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:07.024 ************************************ 00:27:07.024 START TEST raid_state_function_test_sb_md_interleaved 00:27:07.024 ************************************ 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # raid_state_function_test raid1 2 true 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@220 -- # local raid_level=raid1 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@221 -- # local num_base_bdevs=2 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@222 -- # local superblock=true 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@223 -- # local raid_bdev 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i = 1 )) 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev1 00:27:07.024 16:06:12 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # echo BaseBdev2 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i++ )) 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # (( i <= num_base_bdevs )) 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@224 -- # local base_bdevs 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@225 -- # local raid_bdev_name=Existed_Raid 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@226 -- # local strip_size 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@227 -- # local strip_size_create_arg 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@228 -- # local superblock_create_arg 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@230 -- # '[' raid1 '!=' raid1 ']' 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@234 -- # strip_size=0 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@237 -- # '[' true = true ']' 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@238 -- # superblock_create_arg=-s 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@244 -- # 
raid_pid=2823427 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@245 -- # echo 'Process raid pid: 2823427' 00:27:07.024 Process raid pid: 2823427 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@243 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -i 0 -L bdev_raid 00:27:07.024 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@246 -- # waitforlisten 2823427 /var/tmp/spdk-raid.sock 00:27:07.025 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@830 -- # '[' -z 2823427 ']' 00:27:07.025 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:07.025 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:07.025 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:07.025 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:07.025 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:07.025 16:06:12 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:07.025 [2024-06-10 16:06:12.425071] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:27:07.025 [2024-06-10 16:06:12.425123] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:27:07.025 [2024-06-10 16:06:12.522295] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:07.284 [2024-06-10 16:06:12.616666] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:27:07.284 [2024-06-10 16:06:12.671809] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:07.284 [2024-06-10 16:06:12.671838] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:08.220 16:06:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:08.220 16:06:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@863 -- # return 0 00:27:08.220 16:06:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@250 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:08.220 [2024-06-10 16:06:13.610680] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:08.220 [2024-06-10 16:06:13.610720] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:08.220 [2024-06-10 16:06:13.610729] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:08.220 [2024-06-10 16:06:13.610738] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:08.220 16:06:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@251 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:08.220 16:06:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- 
# local raid_bdev_name=Existed_Raid 00:27:08.220 16:06:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:08.220 16:06:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:08.220 16:06:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:08.220 16:06:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:08.220 16:06:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:08.220 16:06:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:08.220 16:06:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:08.220 16:06:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:08.220 16:06:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:08.220 16:06:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:08.479 16:06:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:08.479 "name": "Existed_Raid", 00:27:08.479 "uuid": "9a791c7a-bd02-4b0b-9b13-892ff96cd8ab", 00:27:08.479 "strip_size_kb": 0, 00:27:08.479 "state": "configuring", 00:27:08.479 "raid_level": "raid1", 00:27:08.479 "superblock": true, 00:27:08.479 "num_base_bdevs": 2, 00:27:08.479 "num_base_bdevs_discovered": 0, 00:27:08.479 "num_base_bdevs_operational": 2, 00:27:08.479 "base_bdevs_list": [ 00:27:08.479 { 00:27:08.479 "name": 
"BaseBdev1", 00:27:08.479 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:08.480 "is_configured": false, 00:27:08.480 "data_offset": 0, 00:27:08.480 "data_size": 0 00:27:08.480 }, 00:27:08.480 { 00:27:08.480 "name": "BaseBdev2", 00:27:08.480 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:08.480 "is_configured": false, 00:27:08.480 "data_offset": 0, 00:27:08.480 "data_size": 0 00:27:08.480 } 00:27:08.480 ] 00:27:08.480 }' 00:27:08.480 16:06:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:08.480 16:06:13 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:09.046 16:06:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@252 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:09.304 [2024-06-10 16:06:14.761594] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:09.304 [2024-06-10 16:06:14.761621] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x873120 name Existed_Raid, state configuring 00:27:09.304 16:06:14 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@256 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:09.563 [2024-06-10 16:06:15.018287] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev1 00:27:09.563 [2024-06-10 16:06:15.018314] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev1 doesn't exist now 00:27:09.563 [2024-06-10 16:06:15.018322] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:09.563 [2024-06-10 16:06:15.018330] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:09.563 16:06:15 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@257 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1 00:27:09.821 [2024-06-10 16:06:15.280444] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:09.821 BaseBdev1 00:27:09.821 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@258 -- # waitforbdev BaseBdev1 00:27:09.821 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev1 00:27:09.821 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:09.821 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local i 00:27:09.821 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:09.821 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:09.821 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:10.080 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 -t 2000 00:27:10.339 [ 00:27:10.339 { 00:27:10.339 "name": "BaseBdev1", 00:27:10.339 "aliases": [ 00:27:10.339 "fea5cebb-34b1-45fe-8e3c-d5a3bf130290" 00:27:10.339 ], 00:27:10.339 "product_name": "Malloc disk", 00:27:10.339 "block_size": 4128, 00:27:10.339 "num_blocks": 8192, 00:27:10.339 "uuid": "fea5cebb-34b1-45fe-8e3c-d5a3bf130290", 00:27:10.339 "md_size": 32, 00:27:10.339 "md_interleave": 
true, 00:27:10.339 "dif_type": 0, 00:27:10.339 "assigned_rate_limits": { 00:27:10.339 "rw_ios_per_sec": 0, 00:27:10.339 "rw_mbytes_per_sec": 0, 00:27:10.339 "r_mbytes_per_sec": 0, 00:27:10.339 "w_mbytes_per_sec": 0 00:27:10.339 }, 00:27:10.339 "claimed": true, 00:27:10.339 "claim_type": "exclusive_write", 00:27:10.339 "zoned": false, 00:27:10.339 "supported_io_types": { 00:27:10.339 "read": true, 00:27:10.339 "write": true, 00:27:10.339 "unmap": true, 00:27:10.339 "write_zeroes": true, 00:27:10.339 "flush": true, 00:27:10.339 "reset": true, 00:27:10.339 "compare": false, 00:27:10.339 "compare_and_write": false, 00:27:10.339 "abort": true, 00:27:10.339 "nvme_admin": false, 00:27:10.339 "nvme_io": false 00:27:10.339 }, 00:27:10.339 "memory_domains": [ 00:27:10.339 { 00:27:10.339 "dma_device_id": "system", 00:27:10.339 "dma_device_type": 1 00:27:10.339 }, 00:27:10.339 { 00:27:10.339 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:10.339 "dma_device_type": 2 00:27:10.339 } 00:27:10.339 ], 00:27:10.339 "driver_specific": {} 00:27:10.339 } 00:27:10.339 ] 00:27:10.339 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # return 0 00:27:10.339 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@259 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:10.339 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:10.339 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:10.339 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:10.339 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:10.339 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local 
num_base_bdevs_operational=2 00:27:10.339 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:10.339 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:10.339 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:10.339 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:10.339 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:10.339 16:06:15 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:10.598 16:06:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:10.598 "name": "Existed_Raid", 00:27:10.598 "uuid": "6fc6ab13-57c0-452d-9a25-f660615df4b5", 00:27:10.598 "strip_size_kb": 0, 00:27:10.598 "state": "configuring", 00:27:10.598 "raid_level": "raid1", 00:27:10.598 "superblock": true, 00:27:10.598 "num_base_bdevs": 2, 00:27:10.598 "num_base_bdevs_discovered": 1, 00:27:10.598 "num_base_bdevs_operational": 2, 00:27:10.598 "base_bdevs_list": [ 00:27:10.598 { 00:27:10.598 "name": "BaseBdev1", 00:27:10.598 "uuid": "fea5cebb-34b1-45fe-8e3c-d5a3bf130290", 00:27:10.598 "is_configured": true, 00:27:10.598 "data_offset": 256, 00:27:10.598 "data_size": 7936 00:27:10.598 }, 00:27:10.598 { 00:27:10.598 "name": "BaseBdev2", 00:27:10.598 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:10.598 "is_configured": false, 00:27:10.598 "data_offset": 0, 00:27:10.598 "data_size": 0 00:27:10.598 } 00:27:10.598 ] 00:27:10.598 }' 00:27:10.598 16:06:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 
-- # xtrace_disable 00:27:10.598 16:06:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:11.166 16:06:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@260 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete Existed_Raid 00:27:11.425 [2024-06-10 16:06:16.808559] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: Existed_Raid 00:27:11.425 [2024-06-10 16:06:16.808595] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x8729f0 name Existed_Raid, state configuring 00:27:11.425 16:06:16 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@264 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n Existed_Raid 00:27:11.684 [2024-06-10 16:06:17.069281] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:11.684 [2024-06-10 16:06:17.070786] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: BaseBdev2 00:27:11.684 [2024-06-10 16:06:17.070818] bdev_raid_rpc.c: 311:rpc_bdev_raid_create: *DEBUG*: base bdev BaseBdev2 doesn't exist now 00:27:11.684 16:06:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i = 1 )) 00:27:11.684 16:06:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:11.684 16:06:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@266 -- # verify_raid_bdev_state Existed_Raid configuring raid1 0 2 00:27:11.684 16:06:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:11.684 16:06:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:11.684 16:06:17 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:11.684 16:06:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:11.684 16:06:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:11.684 16:06:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:11.684 16:06:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:11.684 16:06:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:11.684 16:06:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:11.684 16:06:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:11.684 16:06:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:11.942 16:06:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:11.942 "name": "Existed_Raid", 00:27:11.942 "uuid": "7d48009a-a219-419d-abd6-a579679df94d", 00:27:11.942 "strip_size_kb": 0, 00:27:11.942 "state": "configuring", 00:27:11.942 "raid_level": "raid1", 00:27:11.942 "superblock": true, 00:27:11.942 "num_base_bdevs": 2, 00:27:11.942 "num_base_bdevs_discovered": 1, 00:27:11.942 "num_base_bdevs_operational": 2, 00:27:11.942 "base_bdevs_list": [ 00:27:11.942 { 00:27:11.942 "name": "BaseBdev1", 00:27:11.942 "uuid": "fea5cebb-34b1-45fe-8e3c-d5a3bf130290", 00:27:11.942 "is_configured": true, 00:27:11.942 "data_offset": 256, 00:27:11.942 "data_size": 7936 00:27:11.942 }, 
00:27:11.942 { 00:27:11.943 "name": "BaseBdev2", 00:27:11.943 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:11.943 "is_configured": false, 00:27:11.943 "data_offset": 0, 00:27:11.943 "data_size": 0 00:27:11.943 } 00:27:11.943 ] 00:27:11.943 }' 00:27:11.943 16:06:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:11.943 16:06:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:12.511 16:06:17 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@267 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2 00:27:12.770 [2024-06-10 16:06:18.127478] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:12.770 [2024-06-10 16:06:18.127604] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x872180 00:27:12.770 [2024-06-10 16:06:18.127616] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:27:12.770 [2024-06-10 16:06:18.127675] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa0fda0 00:27:12.770 [2024-06-10 16:06:18.127746] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0x872180 00:27:12.770 [2024-06-10 16:06:18.127754] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name Existed_Raid, raid_bdev 0x872180 00:27:12.770 [2024-06-10 16:06:18.127807] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:12.770 BaseBdev2 00:27:12.770 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@268 -- # waitforbdev BaseBdev2 00:27:12.770 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@898 -- # local bdev_name=BaseBdev2 00:27:12.770 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@899 -- # local bdev_timeout= 00:27:12.770 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@900 -- # local i 00:27:12.770 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:27:12.770 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:27:12.770 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_wait_for_examine 00:27:13.029 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 -t 2000 00:27:13.288 [ 00:27:13.288 { 00:27:13.288 "name": "BaseBdev2", 00:27:13.288 "aliases": [ 00:27:13.288 "38453c0f-c7d0-4a98-90a4-8dab9c33bdac" 00:27:13.288 ], 00:27:13.288 "product_name": "Malloc disk", 00:27:13.288 "block_size": 4128, 00:27:13.288 "num_blocks": 8192, 00:27:13.288 "uuid": "38453c0f-c7d0-4a98-90a4-8dab9c33bdac", 00:27:13.288 "md_size": 32, 00:27:13.288 "md_interleave": true, 00:27:13.288 "dif_type": 0, 00:27:13.288 "assigned_rate_limits": { 00:27:13.288 "rw_ios_per_sec": 0, 00:27:13.288 "rw_mbytes_per_sec": 0, 00:27:13.288 "r_mbytes_per_sec": 0, 00:27:13.288 "w_mbytes_per_sec": 0 00:27:13.288 }, 00:27:13.288 "claimed": true, 00:27:13.288 "claim_type": "exclusive_write", 00:27:13.288 "zoned": false, 00:27:13.288 "supported_io_types": { 00:27:13.288 "read": true, 00:27:13.288 "write": true, 00:27:13.288 "unmap": true, 00:27:13.288 "write_zeroes": true, 00:27:13.288 "flush": true, 00:27:13.288 "reset": true, 00:27:13.288 "compare": false, 00:27:13.288 "compare_and_write": false, 00:27:13.288 "abort": true, 00:27:13.288 "nvme_admin": false, 00:27:13.288 "nvme_io": false 
00:27:13.288 }, 00:27:13.288 "memory_domains": [ 00:27:13.288 { 00:27:13.288 "dma_device_id": "system", 00:27:13.288 "dma_device_type": 1 00:27:13.288 }, 00:27:13.288 { 00:27:13.288 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:13.288 "dma_device_type": 2 00:27:13.288 } 00:27:13.288 ], 00:27:13.288 "driver_specific": {} 00:27:13.288 } 00:27:13.288 ] 00:27:13.288 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@906 -- # return 0 00:27:13.288 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i++ )) 00:27:13.288 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@265 -- # (( i < num_base_bdevs )) 00:27:13.288 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@270 -- # verify_raid_bdev_state Existed_Raid online raid1 0 2 00:27:13.288 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:13.288 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:13.288 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:13.288 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:13.288 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:13.288 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:13.288 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:13.288 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:13.288 16:06:18 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:13.288 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:13.288 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:13.547 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:13.547 "name": "Existed_Raid", 00:27:13.547 "uuid": "7d48009a-a219-419d-abd6-a579679df94d", 00:27:13.547 "strip_size_kb": 0, 00:27:13.547 "state": "online", 00:27:13.547 "raid_level": "raid1", 00:27:13.547 "superblock": true, 00:27:13.547 "num_base_bdevs": 2, 00:27:13.547 "num_base_bdevs_discovered": 2, 00:27:13.547 "num_base_bdevs_operational": 2, 00:27:13.547 "base_bdevs_list": [ 00:27:13.547 { 00:27:13.547 "name": "BaseBdev1", 00:27:13.547 "uuid": "fea5cebb-34b1-45fe-8e3c-d5a3bf130290", 00:27:13.547 "is_configured": true, 00:27:13.547 "data_offset": 256, 00:27:13.547 "data_size": 7936 00:27:13.547 }, 00:27:13.547 { 00:27:13.547 "name": "BaseBdev2", 00:27:13.547 "uuid": "38453c0f-c7d0-4a98-90a4-8dab9c33bdac", 00:27:13.547 "is_configured": true, 00:27:13.547 "data_offset": 256, 00:27:13.547 "data_size": 7936 00:27:13.547 } 00:27:13.547 ] 00:27:13.547 }' 00:27:13.547 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:13.547 16:06:18 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:14.155 16:06:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@271 -- # verify_raid_bdev_properties Existed_Raid 00:27:14.155 16:06:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=Existed_Raid 00:27:14.155 
16:06:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:14.155 16:06:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:14.155 16:06:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:14.155 16:06:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:27:14.155 16:06:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:14.155 16:06:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b Existed_Raid 00:27:14.414 [2024-06-10 16:06:19.784208] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:14.414 16:06:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:14.414 "name": "Existed_Raid", 00:27:14.414 "aliases": [ 00:27:14.414 "7d48009a-a219-419d-abd6-a579679df94d" 00:27:14.414 ], 00:27:14.414 "product_name": "Raid Volume", 00:27:14.414 "block_size": 4128, 00:27:14.414 "num_blocks": 7936, 00:27:14.414 "uuid": "7d48009a-a219-419d-abd6-a579679df94d", 00:27:14.414 "md_size": 32, 00:27:14.414 "md_interleave": true, 00:27:14.414 "dif_type": 0, 00:27:14.414 "assigned_rate_limits": { 00:27:14.414 "rw_ios_per_sec": 0, 00:27:14.414 "rw_mbytes_per_sec": 0, 00:27:14.414 "r_mbytes_per_sec": 0, 00:27:14.414 "w_mbytes_per_sec": 0 00:27:14.414 }, 00:27:14.414 "claimed": false, 00:27:14.414 "zoned": false, 00:27:14.414 "supported_io_types": { 00:27:14.414 "read": true, 00:27:14.414 "write": true, 00:27:14.414 "unmap": false, 00:27:14.414 "write_zeroes": true, 00:27:14.414 "flush": false, 00:27:14.414 "reset": true, 00:27:14.414 "compare": false, 00:27:14.414 "compare_and_write": 
false, 00:27:14.414 "abort": false, 00:27:14.414 "nvme_admin": false, 00:27:14.414 "nvme_io": false 00:27:14.414 }, 00:27:14.414 "memory_domains": [ 00:27:14.414 { 00:27:14.414 "dma_device_id": "system", 00:27:14.414 "dma_device_type": 1 00:27:14.414 }, 00:27:14.414 { 00:27:14.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:14.414 "dma_device_type": 2 00:27:14.414 }, 00:27:14.414 { 00:27:14.414 "dma_device_id": "system", 00:27:14.414 "dma_device_type": 1 00:27:14.414 }, 00:27:14.414 { 00:27:14.414 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:14.414 "dma_device_type": 2 00:27:14.414 } 00:27:14.414 ], 00:27:14.414 "driver_specific": { 00:27:14.414 "raid": { 00:27:14.414 "uuid": "7d48009a-a219-419d-abd6-a579679df94d", 00:27:14.414 "strip_size_kb": 0, 00:27:14.414 "state": "online", 00:27:14.414 "raid_level": "raid1", 00:27:14.414 "superblock": true, 00:27:14.414 "num_base_bdevs": 2, 00:27:14.414 "num_base_bdevs_discovered": 2, 00:27:14.414 "num_base_bdevs_operational": 2, 00:27:14.414 "base_bdevs_list": [ 00:27:14.414 { 00:27:14.414 "name": "BaseBdev1", 00:27:14.414 "uuid": "fea5cebb-34b1-45fe-8e3c-d5a3bf130290", 00:27:14.414 "is_configured": true, 00:27:14.414 "data_offset": 256, 00:27:14.414 "data_size": 7936 00:27:14.414 }, 00:27:14.414 { 00:27:14.414 "name": "BaseBdev2", 00:27:14.414 "uuid": "38453c0f-c7d0-4a98-90a4-8dab9c33bdac", 00:27:14.414 "is_configured": true, 00:27:14.414 "data_offset": 256, 00:27:14.414 "data_size": 7936 00:27:14.414 } 00:27:14.414 ] 00:27:14.414 } 00:27:14.414 } 00:27:14.414 }' 00:27:14.414 16:06:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:14.414 16:06:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='BaseBdev1 00:27:14.414 BaseBdev2' 00:27:14.414 16:06:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name 
in $base_bdev_names 00:27:14.414 16:06:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev1 00:27:14.414 16:06:19 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:14.673 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:14.673 "name": "BaseBdev1", 00:27:14.673 "aliases": [ 00:27:14.673 "fea5cebb-34b1-45fe-8e3c-d5a3bf130290" 00:27:14.673 ], 00:27:14.673 "product_name": "Malloc disk", 00:27:14.673 "block_size": 4128, 00:27:14.673 "num_blocks": 8192, 00:27:14.673 "uuid": "fea5cebb-34b1-45fe-8e3c-d5a3bf130290", 00:27:14.673 "md_size": 32, 00:27:14.673 "md_interleave": true, 00:27:14.673 "dif_type": 0, 00:27:14.673 "assigned_rate_limits": { 00:27:14.673 "rw_ios_per_sec": 0, 00:27:14.673 "rw_mbytes_per_sec": 0, 00:27:14.673 "r_mbytes_per_sec": 0, 00:27:14.673 "w_mbytes_per_sec": 0 00:27:14.673 }, 00:27:14.673 "claimed": true, 00:27:14.673 "claim_type": "exclusive_write", 00:27:14.673 "zoned": false, 00:27:14.673 "supported_io_types": { 00:27:14.673 "read": true, 00:27:14.673 "write": true, 00:27:14.673 "unmap": true, 00:27:14.673 "write_zeroes": true, 00:27:14.673 "flush": true, 00:27:14.673 "reset": true, 00:27:14.673 "compare": false, 00:27:14.673 "compare_and_write": false, 00:27:14.673 "abort": true, 00:27:14.673 "nvme_admin": false, 00:27:14.673 "nvme_io": false 00:27:14.673 }, 00:27:14.673 "memory_domains": [ 00:27:14.673 { 00:27:14.673 "dma_device_id": "system", 00:27:14.673 "dma_device_type": 1 00:27:14.673 }, 00:27:14.673 { 00:27:14.673 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:14.673 "dma_device_type": 2 00:27:14.673 } 00:27:14.673 ], 00:27:14.673 "driver_specific": {} 00:27:14.673 }' 00:27:14.673 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq 
.block_size 00:27:14.673 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:14.932 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:27:14.932 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:14.932 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:14.932 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:14.932 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:14.932 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:14.932 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:27:14.932 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:15.190 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:15.190 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:15.191 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:15.191 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b BaseBdev2 00:27:15.191 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:15.450 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:15.450 "name": "BaseBdev2", 00:27:15.450 "aliases": [ 
00:27:15.450 "38453c0f-c7d0-4a98-90a4-8dab9c33bdac" 00:27:15.450 ], 00:27:15.450 "product_name": "Malloc disk", 00:27:15.450 "block_size": 4128, 00:27:15.450 "num_blocks": 8192, 00:27:15.450 "uuid": "38453c0f-c7d0-4a98-90a4-8dab9c33bdac", 00:27:15.450 "md_size": 32, 00:27:15.450 "md_interleave": true, 00:27:15.450 "dif_type": 0, 00:27:15.450 "assigned_rate_limits": { 00:27:15.450 "rw_ios_per_sec": 0, 00:27:15.450 "rw_mbytes_per_sec": 0, 00:27:15.450 "r_mbytes_per_sec": 0, 00:27:15.450 "w_mbytes_per_sec": 0 00:27:15.450 }, 00:27:15.450 "claimed": true, 00:27:15.450 "claim_type": "exclusive_write", 00:27:15.450 "zoned": false, 00:27:15.450 "supported_io_types": { 00:27:15.450 "read": true, 00:27:15.450 "write": true, 00:27:15.450 "unmap": true, 00:27:15.450 "write_zeroes": true, 00:27:15.450 "flush": true, 00:27:15.450 "reset": true, 00:27:15.450 "compare": false, 00:27:15.450 "compare_and_write": false, 00:27:15.450 "abort": true, 00:27:15.450 "nvme_admin": false, 00:27:15.450 "nvme_io": false 00:27:15.450 }, 00:27:15.450 "memory_domains": [ 00:27:15.450 { 00:27:15.450 "dma_device_id": "system", 00:27:15.450 "dma_device_type": 1 00:27:15.450 }, 00:27:15.450 { 00:27:15.450 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:15.450 "dma_device_type": 2 00:27:15.450 } 00:27:15.450 ], 00:27:15.450 "driver_specific": {} 00:27:15.450 }' 00:27:15.450 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:15.450 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:15.450 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:27:15.450 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:15.450 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:15.450 16:06:20 
bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:15.450 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:15.709 16:06:20 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:15.709 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:27:15.709 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:15.709 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:15.709 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:15.709 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@274 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev1 00:27:15.967 [2024-06-10 16:06:21.360249] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:15.967 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@275 -- # local expected_state 00:27:15.967 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@276 -- # has_redundancy raid1 00:27:15.967 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:15.967 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:27:15.967 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@279 -- # expected_state=online 00:27:15.967 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@281 -- # verify_raid_bdev_state Existed_Raid online raid1 0 1 00:27:15.967 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@116 -- # local raid_bdev_name=Existed_Raid 00:27:15.967 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:15.967 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:15.967 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:15.967 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:15.967 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:15.967 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:15.967 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:15.967 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:15.967 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:15.967 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "Existed_Raid")' 00:27:16.226 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:16.226 "name": "Existed_Raid", 00:27:16.226 "uuid": "7d48009a-a219-419d-abd6-a579679df94d", 00:27:16.226 "strip_size_kb": 0, 00:27:16.226 "state": "online", 00:27:16.226 "raid_level": "raid1", 00:27:16.226 "superblock": true, 00:27:16.226 "num_base_bdevs": 2, 00:27:16.226 "num_base_bdevs_discovered": 1, 00:27:16.226 "num_base_bdevs_operational": 1, 00:27:16.226 "base_bdevs_list": [ 00:27:16.226 { 
00:27:16.226 "name": null, 00:27:16.226 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:16.226 "is_configured": false, 00:27:16.226 "data_offset": 256, 00:27:16.226 "data_size": 7936 00:27:16.226 }, 00:27:16.226 { 00:27:16.226 "name": "BaseBdev2", 00:27:16.226 "uuid": "38453c0f-c7d0-4a98-90a4-8dab9c33bdac", 00:27:16.226 "is_configured": true, 00:27:16.226 "data_offset": 256, 00:27:16.226 "data_size": 7936 00:27:16.226 } 00:27:16.226 ] 00:27:16.226 }' 00:27:16.226 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:16.226 16:06:21 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:16.803 16:06:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i = 1 )) 00:27:16.803 16:06:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:16.803 16:06:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:16.803 16:06:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # jq -r '.[0]["name"]' 00:27:17.066 16:06:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@286 -- # raid_bdev=Existed_Raid 00:27:17.066 16:06:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@287 -- # '[' Existed_Raid '!=' Existed_Raid ']' 00:27:17.066 16:06:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@291 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_delete BaseBdev2 00:27:17.324 [2024-06-10 16:06:22.733239] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev2 00:27:17.324 [2024-06-10 16:06:22.733325] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev 
state changing from online to offline 00:27:17.324 [2024-06-10 16:06:22.744245] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:17.324 [2024-06-10 16:06:22.744279] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:17.324 [2024-06-10 16:06:22.744287] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x872180 name Existed_Raid, state offline 00:27:17.324 16:06:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i++ )) 00:27:17.324 16:06:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@285 -- # (( i < num_base_bdevs )) 00:27:17.324 16:06:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # jq -r '.[0]["name"] | select(.)' 00:27:17.324 16:06:22 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:17.581 16:06:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@293 -- # raid_bdev= 00:27:17.581 16:06:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@294 -- # '[' -n '' ']' 00:27:17.581 16:06:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@299 -- # '[' 2 -gt 2 ']' 00:27:17.581 16:06:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@341 -- # killprocess 2823427 00:27:17.581 16:06:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@949 -- # '[' -z 2823427 ']' 00:27:17.581 16:06:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # kill -0 2823427 00:27:17.581 16:06:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # uname 00:27:17.581 16:06:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- 
common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:17.581 16:06:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2823427 00:27:17.581 16:06:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:27:17.581 16:06:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:27:17.581 16:06:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2823427' 00:27:17.581 killing process with pid 2823427 00:27:17.581 16:06:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # kill 2823427 00:27:17.581 [2024-06-10 16:06:23.062160] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:17.581 16:06:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@973 -- # wait 2823427 00:27:17.581 [2024-06-10 16:06:23.063016] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:17.839 16:06:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- bdev/bdev_raid.sh@343 -- # return 0 00:27:17.839 00:27:17.839 real 0m10.893s 00:27:17.839 user 0m19.883s 00:27:17.839 sys 0m1.539s 00:27:17.839 16:06:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:17.839 16:06:23 bdev_raid.raid_state_function_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:17.839 ************************************ 00:27:17.839 END TEST raid_state_function_test_sb_md_interleaved 00:27:17.839 ************************************ 00:27:17.839 16:06:23 bdev_raid -- bdev/bdev_raid.sh@913 -- # run_test raid_superblock_test_md_interleaved raid_superblock_test raid1 2 00:27:17.839 16:06:23 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 4 -le 1 ']' 00:27:17.839 
16:06:23 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:17.839 16:06:23 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:17.839 ************************************ 00:27:17.839 START TEST raid_superblock_test_md_interleaved 00:27:17.839 ************************************ 00:27:17.839 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1124 -- # raid_superblock_test raid1 2 00:27:17.839 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@392 -- # local raid_level=raid1 00:27:17.839 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@393 -- # local num_base_bdevs=2 00:27:17.839 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # base_bdevs_malloc=() 00:27:17.839 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@394 -- # local base_bdevs_malloc 00:27:17.839 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # base_bdevs_pt=() 00:27:17.839 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@395 -- # local base_bdevs_pt 00:27:17.839 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # base_bdevs_pt_uuid=() 00:27:17.839 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@396 -- # local base_bdevs_pt_uuid 00:27:17.839 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@397 -- # local raid_bdev_name=raid_bdev1 00:27:17.840 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@398 -- # local strip_size 00:27:17.840 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@399 -- # local strip_size_create_arg 00:27:17.840 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@400 -- # local raid_bdev_uuid 00:27:17.840 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@401 -- # local 
raid_bdev 00:27:17.840 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@403 -- # '[' raid1 '!=' raid1 ']' 00:27:17.840 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@407 -- # strip_size=0 00:27:17.840 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@411 -- # raid_pid=2825458 00:27:17.840 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@412 -- # waitforlisten 2825458 /var/tmp/spdk-raid.sock 00:27:17.840 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@410 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-raid.sock -L bdev_raid 00:27:17.840 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@830 -- # '[' -z 2825458 ']' 00:27:17.840 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:17.840 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:17.840 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:17.840 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:17.840 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:17.840 16:06:23 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:18.097 [2024-06-10 16:06:23.390314] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:27:18.097 [2024-06-10 16:06:23.390367] [ DPDK EAL parameters: bdev_svc --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2825458 ] 00:27:18.097 [2024-06-10 16:06:23.487638] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:18.097 [2024-06-10 16:06:23.582461] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:27:18.355 [2024-06-10 16:06:23.642046] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:18.355 [2024-06-10 16:06:23.642084] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:18.922 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:18.922 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@863 -- # return 0 00:27:18.922 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i = 1 )) 00:27:18.922 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:18.922 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc1 00:27:18.922 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt1 00:27:18.922 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000001 00:27:18.922 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 00:27:18.922 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:18.922 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:18.922 16:06:24 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc1 00:27:19.181 malloc1 00:27:19.181 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:19.440 [2024-06-10 16:06:24.824035] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:19.440 [2024-06-10 16:06:24.824079] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:19.440 [2024-06-10 16:06:24.824097] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa95970 00:27:19.440 [2024-06-10 16:06:24.824107] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:19.440 [2024-06-10 16:06:24.825632] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:19.440 [2024-06-10 16:06:24.825658] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:19.440 pt1 00:27:19.440 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:19.440 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:19.440 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@416 -- # local bdev_malloc=malloc2 00:27:19.440 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@417 -- # local bdev_pt=pt2 00:27:19.440 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@418 -- # local bdev_pt_uuid=00000000-0000-0000-0000-000000000002 00:27:19.440 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@420 -- # base_bdevs_malloc+=($bdev_malloc) 
00:27:19.440 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@421 -- # base_bdevs_pt+=($bdev_pt) 00:27:19.440 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@422 -- # base_bdevs_pt_uuid+=($bdev_pt_uuid) 00:27:19.440 16:06:24 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@424 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b malloc2 00:27:19.699 malloc2 00:27:19.699 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@425 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:19.957 [2024-06-10 16:06:25.330141] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:19.957 [2024-06-10 16:06:25.330180] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:19.957 [2024-06-10 16:06:25.330195] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc232c0 00:27:19.957 [2024-06-10 16:06:25.330204] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:19.957 [2024-06-10 16:06:25.331672] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:19.957 [2024-06-10 16:06:25.331697] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:19.957 pt2 00:27:19.957 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i++ )) 00:27:19.957 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@415 -- # (( i <= num_base_bdevs )) 00:27:19.957 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@429 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'pt1 pt2' -n raid_bdev1 -s 
00:27:20.214 [2024-06-10 16:06:25.570789] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:20.214 [2024-06-10 16:06:25.572387] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:20.214 [2024-06-10 16:06:25.572544] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc16c50 00:27:20.214 [2024-06-10 16:06:25.572557] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:27:20.214 [2024-06-10 16:06:25.572623] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa939c0 00:27:20.214 [2024-06-10 16:06:25.572708] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc16c50 00:27:20.214 [2024-06-10 16:06:25.572716] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc16c50 00:27:20.214 [2024-06-10 16:06:25.572774] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:20.214 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@430 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:20.214 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:20.214 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:20.214 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:20.214 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:20.214 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:20.214 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:20.214 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local 
num_base_bdevs 00:27:20.214 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:20.214 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:20.214 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:20.214 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:20.472 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:20.472 "name": "raid_bdev1", 00:27:20.472 "uuid": "2faa8070-4ae6-4da9-8765-5f5ed5b8e97b", 00:27:20.472 "strip_size_kb": 0, 00:27:20.472 "state": "online", 00:27:20.472 "raid_level": "raid1", 00:27:20.472 "superblock": true, 00:27:20.472 "num_base_bdevs": 2, 00:27:20.472 "num_base_bdevs_discovered": 2, 00:27:20.472 "num_base_bdevs_operational": 2, 00:27:20.472 "base_bdevs_list": [ 00:27:20.472 { 00:27:20.472 "name": "pt1", 00:27:20.472 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:20.472 "is_configured": true, 00:27:20.472 "data_offset": 256, 00:27:20.472 "data_size": 7936 00:27:20.472 }, 00:27:20.472 { 00:27:20.472 "name": "pt2", 00:27:20.472 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:20.472 "is_configured": true, 00:27:20.472 "data_offset": 256, 00:27:20.472 "data_size": 7936 00:27:20.472 } 00:27:20.472 ] 00:27:20.472 }' 00:27:20.472 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:20.472 16:06:25 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:21.039 16:06:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@431 -- # verify_raid_bdev_properties raid_bdev1 00:27:21.039 16:06:26 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:21.039 16:06:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:21.039 16:06:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:21.039 16:06:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:21.039 16:06:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:27:21.039 16:06:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:21.039 16:06:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:21.296 [2024-06-10 16:06:26.714083] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:21.296 16:06:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:21.296 "name": "raid_bdev1", 00:27:21.296 "aliases": [ 00:27:21.296 "2faa8070-4ae6-4da9-8765-5f5ed5b8e97b" 00:27:21.296 ], 00:27:21.296 "product_name": "Raid Volume", 00:27:21.296 "block_size": 4128, 00:27:21.296 "num_blocks": 7936, 00:27:21.296 "uuid": "2faa8070-4ae6-4da9-8765-5f5ed5b8e97b", 00:27:21.296 "md_size": 32, 00:27:21.296 "md_interleave": true, 00:27:21.296 "dif_type": 0, 00:27:21.296 "assigned_rate_limits": { 00:27:21.296 "rw_ios_per_sec": 0, 00:27:21.296 "rw_mbytes_per_sec": 0, 00:27:21.296 "r_mbytes_per_sec": 0, 00:27:21.296 "w_mbytes_per_sec": 0 00:27:21.296 }, 00:27:21.296 "claimed": false, 00:27:21.296 "zoned": false, 00:27:21.296 "supported_io_types": { 00:27:21.296 "read": true, 00:27:21.296 "write": true, 00:27:21.296 "unmap": false, 00:27:21.297 "write_zeroes": true, 00:27:21.297 "flush": false, 00:27:21.297 "reset": true, 
00:27:21.297 "compare": false, 00:27:21.297 "compare_and_write": false, 00:27:21.297 "abort": false, 00:27:21.297 "nvme_admin": false, 00:27:21.297 "nvme_io": false 00:27:21.297 }, 00:27:21.297 "memory_domains": [ 00:27:21.297 { 00:27:21.297 "dma_device_id": "system", 00:27:21.297 "dma_device_type": 1 00:27:21.297 }, 00:27:21.297 { 00:27:21.297 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:21.297 "dma_device_type": 2 00:27:21.297 }, 00:27:21.297 { 00:27:21.297 "dma_device_id": "system", 00:27:21.297 "dma_device_type": 1 00:27:21.297 }, 00:27:21.297 { 00:27:21.297 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:21.297 "dma_device_type": 2 00:27:21.297 } 00:27:21.297 ], 00:27:21.297 "driver_specific": { 00:27:21.297 "raid": { 00:27:21.297 "uuid": "2faa8070-4ae6-4da9-8765-5f5ed5b8e97b", 00:27:21.297 "strip_size_kb": 0, 00:27:21.297 "state": "online", 00:27:21.297 "raid_level": "raid1", 00:27:21.297 "superblock": true, 00:27:21.297 "num_base_bdevs": 2, 00:27:21.297 "num_base_bdevs_discovered": 2, 00:27:21.297 "num_base_bdevs_operational": 2, 00:27:21.297 "base_bdevs_list": [ 00:27:21.297 { 00:27:21.297 "name": "pt1", 00:27:21.297 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:21.297 "is_configured": true, 00:27:21.297 "data_offset": 256, 00:27:21.297 "data_size": 7936 00:27:21.297 }, 00:27:21.297 { 00:27:21.297 "name": "pt2", 00:27:21.297 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:21.297 "is_configured": true, 00:27:21.297 "data_offset": 256, 00:27:21.297 "data_size": 7936 00:27:21.297 } 00:27:21.297 ] 00:27:21.297 } 00:27:21.297 } 00:27:21.297 }' 00:27:21.297 16:06:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:21.297 16:06:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:21.297 pt2' 00:27:21.297 16:06:26 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:21.297 16:06:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:21.297 16:06:26 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:21.555 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:21.555 "name": "pt1", 00:27:21.555 "aliases": [ 00:27:21.555 "00000000-0000-0000-0000-000000000001" 00:27:21.555 ], 00:27:21.555 "product_name": "passthru", 00:27:21.555 "block_size": 4128, 00:27:21.555 "num_blocks": 8192, 00:27:21.555 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:21.555 "md_size": 32, 00:27:21.555 "md_interleave": true, 00:27:21.555 "dif_type": 0, 00:27:21.555 "assigned_rate_limits": { 00:27:21.555 "rw_ios_per_sec": 0, 00:27:21.555 "rw_mbytes_per_sec": 0, 00:27:21.555 "r_mbytes_per_sec": 0, 00:27:21.555 "w_mbytes_per_sec": 0 00:27:21.555 }, 00:27:21.555 "claimed": true, 00:27:21.555 "claim_type": "exclusive_write", 00:27:21.555 "zoned": false, 00:27:21.555 "supported_io_types": { 00:27:21.555 "read": true, 00:27:21.555 "write": true, 00:27:21.555 "unmap": true, 00:27:21.555 "write_zeroes": true, 00:27:21.555 "flush": true, 00:27:21.555 "reset": true, 00:27:21.555 "compare": false, 00:27:21.555 "compare_and_write": false, 00:27:21.555 "abort": true, 00:27:21.555 "nvme_admin": false, 00:27:21.555 "nvme_io": false 00:27:21.555 }, 00:27:21.555 "memory_domains": [ 00:27:21.555 { 00:27:21.555 "dma_device_id": "system", 00:27:21.555 "dma_device_type": 1 00:27:21.555 }, 00:27:21.555 { 00:27:21.555 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:21.555 "dma_device_type": 2 00:27:21.555 } 00:27:21.555 ], 00:27:21.555 "driver_specific": { 00:27:21.555 "passthru": { 00:27:21.555 "name": "pt1", 00:27:21.555 "base_bdev_name": "malloc1" 00:27:21.555 } 00:27:21.555 } 
00:27:21.555 }' 00:27:21.555 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:21.814 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:21.814 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:27:21.814 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:21.814 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:21.814 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:21.814 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:21.814 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:22.072 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:27:22.072 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:22.072 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:22.072 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:22.072 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:22.072 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:22.072 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:22.330 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:22.330 "name": "pt2", 00:27:22.330 "aliases": [ 
00:27:22.330 "00000000-0000-0000-0000-000000000002" 00:27:22.330 ], 00:27:22.330 "product_name": "passthru", 00:27:22.330 "block_size": 4128, 00:27:22.330 "num_blocks": 8192, 00:27:22.330 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:22.330 "md_size": 32, 00:27:22.330 "md_interleave": true, 00:27:22.330 "dif_type": 0, 00:27:22.330 "assigned_rate_limits": { 00:27:22.330 "rw_ios_per_sec": 0, 00:27:22.330 "rw_mbytes_per_sec": 0, 00:27:22.330 "r_mbytes_per_sec": 0, 00:27:22.330 "w_mbytes_per_sec": 0 00:27:22.330 }, 00:27:22.330 "claimed": true, 00:27:22.330 "claim_type": "exclusive_write", 00:27:22.330 "zoned": false, 00:27:22.330 "supported_io_types": { 00:27:22.330 "read": true, 00:27:22.330 "write": true, 00:27:22.330 "unmap": true, 00:27:22.330 "write_zeroes": true, 00:27:22.330 "flush": true, 00:27:22.330 "reset": true, 00:27:22.330 "compare": false, 00:27:22.330 "compare_and_write": false, 00:27:22.330 "abort": true, 00:27:22.330 "nvme_admin": false, 00:27:22.330 "nvme_io": false 00:27:22.330 }, 00:27:22.330 "memory_domains": [ 00:27:22.330 { 00:27:22.330 "dma_device_id": "system", 00:27:22.330 "dma_device_type": 1 00:27:22.330 }, 00:27:22.330 { 00:27:22.330 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:22.330 "dma_device_type": 2 00:27:22.330 } 00:27:22.330 ], 00:27:22.330 "driver_specific": { 00:27:22.330 "passthru": { 00:27:22.330 "name": "pt2", 00:27:22.330 "base_bdev_name": "malloc2" 00:27:22.330 } 00:27:22.330 } 00:27:22.330 }' 00:27:22.330 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:22.330 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:22.331 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:27:22.331 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:22.331 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:22.589 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:22.589 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:22.589 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:22.589 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:27:22.589 16:06:27 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:22.589 16:06:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:22.589 16:06:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:22.589 16:06:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:22.589 16:06:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # jq -r '.[] | .uuid' 00:27:22.847 [2024-06-10 16:06:28.286264] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:22.847 16:06:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@434 -- # raid_bdev_uuid=2faa8070-4ae6-4da9-8765-5f5ed5b8e97b 00:27:22.847 16:06:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@435 -- # '[' -z 2faa8070-4ae6-4da9-8765-5f5ed5b8e97b ']' 00:27:22.847 16:06:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@440 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:23.105 [2024-06-10 16:06:28.542719] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:23.105 [2024-06-10 16:06:28.542738] bdev_raid.c:1844:raid_bdev_deconfigure: 
*DEBUG*: raid bdev state changing from online to offline 00:27:23.105 [2024-06-10 16:06:28.542791] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:23.105 [2024-06-10 16:06:28.542845] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:23.105 [2024-06-10 16:06:28.542854] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc16c50 name raid_bdev1, state offline 00:27:23.105 16:06:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:23.105 16:06:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # jq -r '.[]' 00:27:23.363 16:06:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@441 -- # raid_bdev= 00:27:23.363 16:06:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@442 -- # '[' -n '' ']' 00:27:23.363 16:06:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:23.363 16:06:28 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:23.622 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@447 -- # for i in "${base_bdevs_pt[@]}" 00:27:23.622 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@448 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:23.881 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs 00:27:23.881 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # 
jq -r '[.[] | select(.product_name == "passthru")] | any' 00:27:24.140 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@450 -- # '[' false == true ']' 00:27:24.140 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@456 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:24.140 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@649 -- # local es=0 00:27:24.140 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:24.140 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:24.140 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:27:24.140 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:24.140 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:27:24.140 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:24.140 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:27:24.140 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:27:24.140 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- 
common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:27:24.140 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -r raid1 -b 'malloc1 malloc2' -n raid_bdev1 00:27:24.399 [2024-06-10 16:06:29.822079] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc1 is claimed 00:27:24.399 [2024-06-10 16:06:29.823758] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev malloc2 is claimed 00:27:24.399 [2024-06-10 16:06:29.823816] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc1 00:27:24.399 [2024-06-10 16:06:29.823852] bdev_raid.c:3106:raid_bdev_configure_base_bdev_check_sb_cb: *ERROR*: Superblock of a different raid bdev found on bdev malloc2 00:27:24.399 [2024-06-10 16:06:29.823868] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:24.399 [2024-06-10 16:06:29.823876] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc17920 name raid_bdev1, state configuring 00:27:24.399 request: 00:27:24.399 { 00:27:24.399 "name": "raid_bdev1", 00:27:24.399 "raid_level": "raid1", 00:27:24.399 "base_bdevs": [ 00:27:24.399 "malloc1", 00:27:24.399 "malloc2" 00:27:24.399 ], 00:27:24.399 "superblock": false, 00:27:24.399 "method": "bdev_raid_create", 00:27:24.399 "req_id": 1 00:27:24.399 } 00:27:24.399 Got JSON-RPC error response 00:27:24.399 response: 00:27:24.399 { 00:27:24.399 "code": -17, 00:27:24.399 "message": "Failed to create RAID bdev raid_bdev1: File exists" 00:27:24.399 } 00:27:24.399 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@652 -- # es=1 00:27:24.399 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:27:24.399 16:06:29 
bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@671 -- # [[ -n '' ]] 00:27:24.399 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:27:24.399 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:24.399 16:06:29 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # jq -r '.[]' 00:27:24.658 16:06:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@458 -- # raid_bdev= 00:27:24.658 16:06:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@459 -- # '[' -n '' ']' 00:27:24.658 16:06:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@464 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:24.917 [2024-06-10 16:06:30.335388] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:24.917 [2024-06-10 16:06:30.335421] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:24.917 [2024-06-10 16:06:30.335436] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa93bb0 00:27:24.917 [2024-06-10 16:06:30.335446] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:24.917 [2024-06-10 16:06:30.336905] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:24.917 [2024-06-10 16:06:30.336929] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:24.917 [2024-06-10 16:06:30.336981] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:24.917 [2024-06-10 16:06:30.337005] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 
00:27:24.917 pt1 00:27:24.917 16:06:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@467 -- # verify_raid_bdev_state raid_bdev1 configuring raid1 0 2 00:27:24.917 16:06:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:24.917 16:06:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=configuring 00:27:24.917 16:06:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:24.917 16:06:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:24.917 16:06:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:24.917 16:06:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:24.917 16:06:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:24.917 16:06:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:24.917 16:06:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:24.917 16:06:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:24.917 16:06:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:25.189 16:06:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:25.189 "name": "raid_bdev1", 00:27:25.189 "uuid": "2faa8070-4ae6-4da9-8765-5f5ed5b8e97b", 00:27:25.189 "strip_size_kb": 0, 00:27:25.189 "state": "configuring", 00:27:25.189 "raid_level": "raid1", 00:27:25.189 "superblock": true, 00:27:25.189 
"num_base_bdevs": 2, 00:27:25.189 "num_base_bdevs_discovered": 1, 00:27:25.189 "num_base_bdevs_operational": 2, 00:27:25.189 "base_bdevs_list": [ 00:27:25.189 { 00:27:25.189 "name": "pt1", 00:27:25.189 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:25.189 "is_configured": true, 00:27:25.189 "data_offset": 256, 00:27:25.189 "data_size": 7936 00:27:25.189 }, 00:27:25.189 { 00:27:25.189 "name": null, 00:27:25.189 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:25.189 "is_configured": false, 00:27:25.189 "data_offset": 256, 00:27:25.189 "data_size": 7936 00:27:25.189 } 00:27:25.189 ] 00:27:25.189 }' 00:27:25.189 16:06:30 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:25.189 16:06:30 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:25.757 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@469 -- # '[' 2 -gt 2 ']' 00:27:25.757 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i = 1 )) 00:27:25.757 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:25.757 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@478 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:26.016 [2024-06-10 16:06:31.470430] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:26.016 [2024-06-10 16:06:31.470476] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:26.016 [2024-06-10 16:06:31.470492] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc18790 00:27:26.016 [2024-06-10 16:06:31.470502] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:26.016 [2024-06-10 16:06:31.470661] vbdev_passthru.c: 
708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:26.016 [2024-06-10 16:06:31.470674] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:26.016 [2024-06-10 16:06:31.470713] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:26.016 [2024-06-10 16:06:31.470730] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:26.016 [2024-06-10 16:06:31.470814] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xa94680 00:27:26.016 [2024-06-10 16:06:31.470823] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:27:26.016 [2024-06-10 16:06:31.470876] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa939c0 00:27:26.016 [2024-06-10 16:06:31.470953] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xa94680 00:27:26.016 [2024-06-10 16:06:31.470971] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xa94680 00:27:26.016 [2024-06-10 16:06:31.471029] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:26.016 pt2 00:27:26.016 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i++ )) 00:27:26.016 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@477 -- # (( i < num_base_bdevs )) 00:27:26.017 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@482 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:26.017 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:26.017 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:26.017 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:26.017 16:06:31 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:26.017 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:26.017 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:26.017 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:26.017 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:26.017 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:26.017 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:26.017 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:26.275 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:26.275 "name": "raid_bdev1", 00:27:26.275 "uuid": "2faa8070-4ae6-4da9-8765-5f5ed5b8e97b", 00:27:26.275 "strip_size_kb": 0, 00:27:26.275 "state": "online", 00:27:26.275 "raid_level": "raid1", 00:27:26.276 "superblock": true, 00:27:26.276 "num_base_bdevs": 2, 00:27:26.276 "num_base_bdevs_discovered": 2, 00:27:26.276 "num_base_bdevs_operational": 2, 00:27:26.276 "base_bdevs_list": [ 00:27:26.276 { 00:27:26.276 "name": "pt1", 00:27:26.276 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:26.276 "is_configured": true, 00:27:26.276 "data_offset": 256, 00:27:26.276 "data_size": 7936 00:27:26.276 }, 00:27:26.276 { 00:27:26.276 "name": "pt2", 00:27:26.276 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:26.276 "is_configured": true, 00:27:26.276 "data_offset": 256, 00:27:26.276 "data_size": 7936 00:27:26.276 
} 00:27:26.276 ] 00:27:26.276 }' 00:27:26.276 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:26.276 16:06:31 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:27.211 16:06:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@483 -- # verify_raid_bdev_properties raid_bdev1 00:27:27.211 16:06:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@194 -- # local raid_bdev_name=raid_bdev1 00:27:27.211 16:06:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@195 -- # local raid_bdev_info 00:27:27.211 16:06:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@196 -- # local base_bdev_info 00:27:27.211 16:06:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@197 -- # local base_bdev_names 00:27:27.211 16:06:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@198 -- # local name 00:27:27.211 16:06:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # jq '.[]' 00:27:27.211 16:06:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:27.211 [2024-06-10 16:06:32.605725] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:27.211 16:06:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@200 -- # raid_bdev_info='{ 00:27:27.211 "name": "raid_bdev1", 00:27:27.211 "aliases": [ 00:27:27.211 "2faa8070-4ae6-4da9-8765-5f5ed5b8e97b" 00:27:27.211 ], 00:27:27.211 "product_name": "Raid Volume", 00:27:27.211 "block_size": 4128, 00:27:27.211 "num_blocks": 7936, 00:27:27.211 "uuid": "2faa8070-4ae6-4da9-8765-5f5ed5b8e97b", 00:27:27.211 "md_size": 32, 00:27:27.211 "md_interleave": true, 00:27:27.211 "dif_type": 0, 00:27:27.211 "assigned_rate_limits": { 00:27:27.211 
"rw_ios_per_sec": 0, 00:27:27.211 "rw_mbytes_per_sec": 0, 00:27:27.211 "r_mbytes_per_sec": 0, 00:27:27.211 "w_mbytes_per_sec": 0 00:27:27.212 }, 00:27:27.212 "claimed": false, 00:27:27.212 "zoned": false, 00:27:27.212 "supported_io_types": { 00:27:27.212 "read": true, 00:27:27.212 "write": true, 00:27:27.212 "unmap": false, 00:27:27.212 "write_zeroes": true, 00:27:27.212 "flush": false, 00:27:27.212 "reset": true, 00:27:27.212 "compare": false, 00:27:27.212 "compare_and_write": false, 00:27:27.212 "abort": false, 00:27:27.212 "nvme_admin": false, 00:27:27.212 "nvme_io": false 00:27:27.212 }, 00:27:27.212 "memory_domains": [ 00:27:27.212 { 00:27:27.212 "dma_device_id": "system", 00:27:27.212 "dma_device_type": 1 00:27:27.212 }, 00:27:27.212 { 00:27:27.212 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:27.212 "dma_device_type": 2 00:27:27.212 }, 00:27:27.212 { 00:27:27.212 "dma_device_id": "system", 00:27:27.212 "dma_device_type": 1 00:27:27.212 }, 00:27:27.212 { 00:27:27.212 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:27.212 "dma_device_type": 2 00:27:27.212 } 00:27:27.212 ], 00:27:27.212 "driver_specific": { 00:27:27.212 "raid": { 00:27:27.212 "uuid": "2faa8070-4ae6-4da9-8765-5f5ed5b8e97b", 00:27:27.212 "strip_size_kb": 0, 00:27:27.212 "state": "online", 00:27:27.212 "raid_level": "raid1", 00:27:27.212 "superblock": true, 00:27:27.212 "num_base_bdevs": 2, 00:27:27.212 "num_base_bdevs_discovered": 2, 00:27:27.212 "num_base_bdevs_operational": 2, 00:27:27.212 "base_bdevs_list": [ 00:27:27.212 { 00:27:27.212 "name": "pt1", 00:27:27.212 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:27.212 "is_configured": true, 00:27:27.212 "data_offset": 256, 00:27:27.212 "data_size": 7936 00:27:27.212 }, 00:27:27.212 { 00:27:27.212 "name": "pt2", 00:27:27.212 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:27.212 "is_configured": true, 00:27:27.212 "data_offset": 256, 00:27:27.212 "data_size": 7936 00:27:27.212 } 00:27:27.212 ] 00:27:27.212 } 00:27:27.212 } 
00:27:27.212 }' 00:27:27.212 16:06:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # jq -r '.driver_specific.raid.base_bdevs_list[] | select(.is_configured == true).name' 00:27:27.212 16:06:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@201 -- # base_bdev_names='pt1 00:27:27.212 pt2' 00:27:27.212 16:06:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:27.212 16:06:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt1 00:27:27.212 16:06:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:27.471 16:06:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:27.471 "name": "pt1", 00:27:27.471 "aliases": [ 00:27:27.471 "00000000-0000-0000-0000-000000000001" 00:27:27.471 ], 00:27:27.471 "product_name": "passthru", 00:27:27.471 "block_size": 4128, 00:27:27.471 "num_blocks": 8192, 00:27:27.471 "uuid": "00000000-0000-0000-0000-000000000001", 00:27:27.471 "md_size": 32, 00:27:27.471 "md_interleave": true, 00:27:27.471 "dif_type": 0, 00:27:27.471 "assigned_rate_limits": { 00:27:27.471 "rw_ios_per_sec": 0, 00:27:27.471 "rw_mbytes_per_sec": 0, 00:27:27.471 "r_mbytes_per_sec": 0, 00:27:27.471 "w_mbytes_per_sec": 0 00:27:27.471 }, 00:27:27.471 "claimed": true, 00:27:27.471 "claim_type": "exclusive_write", 00:27:27.471 "zoned": false, 00:27:27.471 "supported_io_types": { 00:27:27.471 "read": true, 00:27:27.471 "write": true, 00:27:27.471 "unmap": true, 00:27:27.471 "write_zeroes": true, 00:27:27.471 "flush": true, 00:27:27.471 "reset": true, 00:27:27.471 "compare": false, 00:27:27.471 "compare_and_write": false, 00:27:27.471 "abort": true, 00:27:27.471 "nvme_admin": false, 00:27:27.471 "nvme_io": false 00:27:27.471 }, 00:27:27.471 
"memory_domains": [ 00:27:27.471 { 00:27:27.471 "dma_device_id": "system", 00:27:27.471 "dma_device_type": 1 00:27:27.471 }, 00:27:27.471 { 00:27:27.471 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:27.471 "dma_device_type": 2 00:27:27.471 } 00:27:27.471 ], 00:27:27.471 "driver_specific": { 00:27:27.471 "passthru": { 00:27:27.471 "name": "pt1", 00:27:27.471 "base_bdev_name": "malloc1" 00:27:27.471 } 00:27:27.471 } 00:27:27.471 }' 00:27:27.471 16:06:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:27.730 16:06:32 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:27.730 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:27:27.730 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:27.730 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:27.730 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:27.730 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:27.730 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:27.730 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:27:27.730 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:27.994 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:27.994 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:27.994 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@203 -- # for name in $base_bdev_names 00:27:27.994 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@204 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b pt2 00:27:27.994 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # jq '.[]' 00:27:28.326 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@204 -- # base_bdev_info='{ 00:27:28.326 "name": "pt2", 00:27:28.326 "aliases": [ 00:27:28.326 "00000000-0000-0000-0000-000000000002" 00:27:28.326 ], 00:27:28.326 "product_name": "passthru", 00:27:28.326 "block_size": 4128, 00:27:28.326 "num_blocks": 8192, 00:27:28.326 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:28.326 "md_size": 32, 00:27:28.326 "md_interleave": true, 00:27:28.326 "dif_type": 0, 00:27:28.326 "assigned_rate_limits": { 00:27:28.326 "rw_ios_per_sec": 0, 00:27:28.326 "rw_mbytes_per_sec": 0, 00:27:28.326 "r_mbytes_per_sec": 0, 00:27:28.326 "w_mbytes_per_sec": 0 00:27:28.326 }, 00:27:28.326 "claimed": true, 00:27:28.326 "claim_type": "exclusive_write", 00:27:28.326 "zoned": false, 00:27:28.326 "supported_io_types": { 00:27:28.326 "read": true, 00:27:28.326 "write": true, 00:27:28.326 "unmap": true, 00:27:28.326 "write_zeroes": true, 00:27:28.326 "flush": true, 00:27:28.326 "reset": true, 00:27:28.326 "compare": false, 00:27:28.326 "compare_and_write": false, 00:27:28.326 "abort": true, 00:27:28.326 "nvme_admin": false, 00:27:28.326 "nvme_io": false 00:27:28.326 }, 00:27:28.326 "memory_domains": [ 00:27:28.326 { 00:27:28.326 "dma_device_id": "system", 00:27:28.326 "dma_device_type": 1 00:27:28.326 }, 00:27:28.326 { 00:27:28.326 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:27:28.326 "dma_device_type": 2 00:27:28.326 } 00:27:28.326 ], 00:27:28.326 "driver_specific": { 00:27:28.326 "passthru": { 00:27:28.326 "name": "pt2", 00:27:28.326 "base_bdev_name": "malloc2" 00:27:28.326 } 00:27:28.326 } 00:27:28.326 }' 00:27:28.326 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 
00:27:28.326 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # jq .block_size 00:27:28.326 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@205 -- # [[ 4128 == 4128 ]] 00:27:28.326 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:28.326 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # jq .md_size 00:27:28.326 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@206 -- # [[ 32 == 32 ]] 00:27:28.326 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:28.326 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # jq .md_interleave 00:27:28.585 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@207 -- # [[ true == true ]] 00:27:28.585 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:28.585 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # jq .dif_type 00:27:28.585 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@208 -- # [[ 0 == 0 ]] 00:27:28.585 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:28.585 16:06:33 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # jq -r '.[] | .uuid' 00:27:28.843 [2024-06-10 16:06:34.177917] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:28.843 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@486 -- # '[' 2faa8070-4ae6-4da9-8765-5f5ed5b8e97b '!=' 2faa8070-4ae6-4da9-8765-5f5ed5b8e97b ']' 00:27:28.843 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@490 -- # has_redundancy 
raid1 00:27:28.843 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@213 -- # case $1 in 00:27:28.843 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@214 -- # return 0 00:27:28.843 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@492 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt1 00:27:29.102 [2024-06-10 16:06:34.438406] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: pt1 00:27:29.102 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@495 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:29.102 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:29.102 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:29.102 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:29.102 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:29.102 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:29.102 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:29.102 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:29.102 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:29.102 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:29.102 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 
00:27:29.102 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:29.361 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:29.361 "name": "raid_bdev1", 00:27:29.361 "uuid": "2faa8070-4ae6-4da9-8765-5f5ed5b8e97b", 00:27:29.361 "strip_size_kb": 0, 00:27:29.361 "state": "online", 00:27:29.361 "raid_level": "raid1", 00:27:29.361 "superblock": true, 00:27:29.361 "num_base_bdevs": 2, 00:27:29.361 "num_base_bdevs_discovered": 1, 00:27:29.361 "num_base_bdevs_operational": 1, 00:27:29.361 "base_bdevs_list": [ 00:27:29.361 { 00:27:29.361 "name": null, 00:27:29.361 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:29.361 "is_configured": false, 00:27:29.361 "data_offset": 256, 00:27:29.361 "data_size": 7936 00:27:29.361 }, 00:27:29.361 { 00:27:29.361 "name": "pt2", 00:27:29.361 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:29.361 "is_configured": true, 00:27:29.361 "data_offset": 256, 00:27:29.361 "data_size": 7936 00:27:29.361 } 00:27:29.361 ] 00:27:29.361 }' 00:27:29.361 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:29.361 16:06:34 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:29.929 16:06:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@498 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:30.188 [2024-06-10 16:06:35.581441] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:30.188 [2024-06-10 16:06:35.581463] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:30.188 [2024-06-10 16:06:35.581508] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:30.188 [2024-06-10 16:06:35.581551] bdev_raid.c: 
451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:30.188 [2024-06-10 16:06:35.581560] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xa94680 name raid_bdev1, state offline 00:27:30.188 16:06:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:30.188 16:06:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # jq -r '.[]' 00:27:30.447 16:06:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@499 -- # raid_bdev= 00:27:30.447 16:06:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@500 -- # '[' -n '' ']' 00:27:30.447 16:06:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i = 1 )) 00:27:30.447 16:06:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:30.447 16:06:35 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@506 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete pt2 00:27:30.705 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i++ )) 00:27:30.705 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@505 -- # (( i < num_base_bdevs )) 00:27:30.705 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i = 1 )) 00:27:30.705 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@510 -- # (( i < num_base_bdevs - 1 )) 00:27:30.706 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@518 -- # i=1 00:27:30.706 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@519 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock 
bdev_passthru_create -b malloc2 -p pt2 -u 00000000-0000-0000-0000-000000000002 00:27:30.964 [2024-06-10 16:06:36.355462] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc2 00:27:30.964 [2024-06-10 16:06:36.355502] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:30.964 [2024-06-10 16:06:36.355517] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xc19950 00:27:30.964 [2024-06-10 16:06:36.355526] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:30.964 [2024-06-10 16:06:36.357016] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:30.964 [2024-06-10 16:06:36.357040] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt2 00:27:30.964 [2024-06-10 16:06:36.357082] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt2 00:27:30.964 [2024-06-10 16:06:36.357104] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 00:27:30.964 [2024-06-10 16:06:36.357167] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc1a310 00:27:30.964 [2024-06-10 16:06:36.357181] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:27:30.964 [2024-06-10 16:06:36.357234] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xa955a0 00:27:30.964 [2024-06-10 16:06:36.357306] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc1a310 00:27:30.964 [2024-06-10 16:06:36.357314] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc1a310 00:27:30.965 [2024-06-10 16:06:36.357365] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:30.965 pt2 00:27:30.965 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@522 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:30.965 16:06:36 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:30.965 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:30.965 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:30.965 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:30.965 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:30.965 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:30.965 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:30.965 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:30.965 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:30.965 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:30.965 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:31.223 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:31.223 "name": "raid_bdev1", 00:27:31.223 "uuid": "2faa8070-4ae6-4da9-8765-5f5ed5b8e97b", 00:27:31.223 "strip_size_kb": 0, 00:27:31.223 "state": "online", 00:27:31.223 "raid_level": "raid1", 00:27:31.223 "superblock": true, 00:27:31.223 "num_base_bdevs": 2, 00:27:31.223 "num_base_bdevs_discovered": 1, 00:27:31.223 "num_base_bdevs_operational": 1, 00:27:31.223 "base_bdevs_list": [ 00:27:31.224 { 00:27:31.224 "name": null, 00:27:31.224 
"uuid": "00000000-0000-0000-0000-000000000000", 00:27:31.224 "is_configured": false, 00:27:31.224 "data_offset": 256, 00:27:31.224 "data_size": 7936 00:27:31.224 }, 00:27:31.224 { 00:27:31.224 "name": "pt2", 00:27:31.224 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:31.224 "is_configured": true, 00:27:31.224 "data_offset": 256, 00:27:31.224 "data_size": 7936 00:27:31.224 } 00:27:31.224 ] 00:27:31.224 }' 00:27:31.224 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:31.224 16:06:36 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:31.791 16:06:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@525 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:32.049 [2024-06-10 16:06:37.494497] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:32.049 [2024-06-10 16:06:37.494521] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:32.049 [2024-06-10 16:06:37.494574] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:32.049 [2024-06-10 16:06:37.494613] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:32.050 [2024-06-10 16:06:37.494622] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc1a310 name raid_bdev1, state offline 00:27:32.050 16:06:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:32.050 16:06:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # jq -r '.[]' 00:27:32.309 16:06:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@526 -- # raid_bdev= 00:27:32.309 16:06:37 
bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@527 -- # '[' -n '' ']' 00:27:32.309 16:06:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@531 -- # '[' 2 -gt 2 ']' 00:27:32.309 16:06:37 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@539 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b malloc1 -p pt1 -u 00000000-0000-0000-0000-000000000001 00:27:32.569 [2024-06-10 16:06:38.007850] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc1 00:27:32.569 [2024-06-10 16:06:38.007895] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:32.569 [2024-06-10 16:06:38.007911] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0xa952b0 00:27:32.569 [2024-06-10 16:06:38.007921] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:32.569 [2024-06-10 16:06:38.009429] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:32.569 [2024-06-10 16:06:38.009455] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt1 00:27:32.569 [2024-06-10 16:06:38.009497] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev pt1 00:27:32.569 [2024-06-10 16:06:38.009519] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt1 is claimed 00:27:32.569 [2024-06-10 16:06:38.009600] bdev_raid.c:3547:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev pt2 (4) greater than existing raid bdev raid_bdev1 (2) 00:27:32.569 [2024-06-10 16:06:38.009610] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: delete raid bdev: raid_bdev1 00:27:32.569 [2024-06-10 16:06:38.009622] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc1a990 name raid_bdev1, state configuring 00:27:32.569 [2024-06-10 16:06:38.009643] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev pt2 is claimed 
00:27:32.569 [2024-06-10 16:06:38.009699] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xc1a990 00:27:32.569 [2024-06-10 16:06:38.009708] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:27:32.569 [2024-06-10 16:06:38.009762] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xc197f0 00:27:32.569 [2024-06-10 16:06:38.009836] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xc1a990 00:27:32.569 [2024-06-10 16:06:38.009843] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xc1a990 00:27:32.569 [2024-06-10 16:06:38.009903] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:32.569 pt1 00:27:32.569 16:06:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@541 -- # '[' 2 -gt 2 ']' 00:27:32.569 16:06:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@553 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:32.569 16:06:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:32.569 16:06:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:32.569 16:06:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:32.569 16:06:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:32.569 16:06:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:32.569 16:06:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:32.569 16:06:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:32.569 16:06:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@123 -- # 
local num_base_bdevs_discovered 00:27:32.569 16:06:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:32.569 16:06:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:32.569 16:06:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:32.828 16:06:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:32.828 "name": "raid_bdev1", 00:27:32.828 "uuid": "2faa8070-4ae6-4da9-8765-5f5ed5b8e97b", 00:27:32.828 "strip_size_kb": 0, 00:27:32.828 "state": "online", 00:27:32.828 "raid_level": "raid1", 00:27:32.828 "superblock": true, 00:27:32.828 "num_base_bdevs": 2, 00:27:32.828 "num_base_bdevs_discovered": 1, 00:27:32.828 "num_base_bdevs_operational": 1, 00:27:32.828 "base_bdevs_list": [ 00:27:32.828 { 00:27:32.828 "name": null, 00:27:32.828 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:32.828 "is_configured": false, 00:27:32.828 "data_offset": 256, 00:27:32.828 "data_size": 7936 00:27:32.828 }, 00:27:32.828 { 00:27:32.828 "name": "pt2", 00:27:32.828 "uuid": "00000000-0000-0000-0000-000000000002", 00:27:32.828 "is_configured": true, 00:27:32.828 "data_offset": 256, 00:27:32.828 "data_size": 7936 00:27:32.828 } 00:27:32.828 ] 00:27:32.828 }' 00:27:32.828 16:06:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:32.828 16:06:38 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:33.765 16:06:38 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs online 00:27:33.765 16:06:38 bdev_raid.raid_superblock_test_md_interleaved -- 
bdev/bdev_raid.sh@554 -- # jq -r '.[].base_bdevs_list[0].is_configured' 00:27:33.765 16:06:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@554 -- # [[ false == \f\a\l\s\e ]] 00:27:33.765 16:06:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # jq -r '.[] | .uuid' 00:27:33.765 16:06:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:34.024 [2024-06-10 16:06:39.395792] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:34.024 16:06:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@557 -- # '[' 2faa8070-4ae6-4da9-8765-5f5ed5b8e97b '!=' 2faa8070-4ae6-4da9-8765-5f5ed5b8e97b ']' 00:27:34.024 16:06:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@562 -- # killprocess 2825458 00:27:34.024 16:06:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@949 -- # '[' -z 2825458 ']' 00:27:34.024 16:06:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@953 -- # kill -0 2825458 00:27:34.024 16:06:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # uname 00:27:34.024 16:06:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:27:34.024 16:06:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2825458 00:27:34.024 16:06:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:27:34.024 16:06:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:27:34.024 16:06:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2825458' 00:27:34.024 
killing process with pid 2825458 00:27:34.024 16:06:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@968 -- # kill 2825458 00:27:34.024 [2024-06-10 16:06:39.463857] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:27:34.024 [2024-06-10 16:06:39.463913] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:34.024 [2024-06-10 16:06:39.463953] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:34.024 [2024-06-10 16:06:39.463970] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xc1a990 name raid_bdev1, state offline 00:27:34.025 16:06:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@973 -- # wait 2825458 00:27:34.025 [2024-06-10 16:06:39.480565] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:27:34.284 16:06:39 bdev_raid.raid_superblock_test_md_interleaved -- bdev/bdev_raid.sh@564 -- # return 0 00:27:34.284 00:27:34.284 real 0m16.347s 00:27:34.284 user 0m30.302s 00:27:34.284 sys 0m2.403s 00:27:34.284 16:06:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@1125 -- # xtrace_disable 00:27:34.284 16:06:39 bdev_raid.raid_superblock_test_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:34.284 ************************************ 00:27:34.284 END TEST raid_superblock_test_md_interleaved 00:27:34.284 ************************************ 00:27:34.284 16:06:39 bdev_raid -- bdev/bdev_raid.sh@914 -- # run_test raid_rebuild_test_sb_md_interleaved raid_rebuild_test raid1 2 true false false 00:27:34.284 16:06:39 bdev_raid -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:27:34.284 16:06:39 bdev_raid -- common/autotest_common.sh@1106 -- # xtrace_disable 00:27:34.284 16:06:39 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:27:34.284 ************************************ 00:27:34.284 START TEST raid_rebuild_test_sb_md_interleaved 
00:27:34.284 ************************************ 00:27:34.284 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1124 -- # raid_rebuild_test raid1 2 true false false 00:27:34.284 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@568 -- # local raid_level=raid1 00:27:34.284 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@569 -- # local num_base_bdevs=2 00:27:34.284 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@570 -- # local superblock=true 00:27:34.284 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@571 -- # local background_io=false 00:27:34.284 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@572 -- # local verify=false 00:27:34.284 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i = 1 )) 00:27:34.284 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:34.284 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev1 00:27:34.284 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:34.284 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:34.284 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # echo BaseBdev2 00:27:34.284 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i++ )) 00:27:34.284 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # (( i <= num_base_bdevs )) 00:27:34.284 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # base_bdevs=('BaseBdev1' 'BaseBdev2') 00:27:34.285 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@573 -- # local base_bdevs 00:27:34.285 16:06:39 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@574 -- # local raid_bdev_name=raid_bdev1 00:27:34.285 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@575 -- # local strip_size 00:27:34.285 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@576 -- # local create_arg 00:27:34.285 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@577 -- # local raid_bdev_size 00:27:34.285 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@578 -- # local data_offset 00:27:34.285 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@580 -- # '[' raid1 '!=' raid1 ']' 00:27:34.285 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@588 -- # strip_size=0 00:27:34.285 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@591 -- # '[' true = true ']' 00:27:34.285 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@592 -- # create_arg+=' -s' 00:27:34.285 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@596 -- # raid_pid=2828271 00:27:34.285 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@597 -- # waitforlisten 2828271 /var/tmp/spdk-raid.sock 00:27:34.285 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@595 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/spdk-raid.sock -T raid_bdev1 -t 60 -w randrw -M 50 -o 3M -q 2 -U -z -L bdev_raid 00:27:34.285 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@830 -- # '[' -z 2828271 ']' 00:27:34.285 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-raid.sock 00:27:34.285 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@835 -- # local max_retries=100 00:27:34.285 16:06:39 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock...' 00:27:34.285 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-raid.sock... 00:27:34.285 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@839 -- # xtrace_disable 00:27:34.285 16:06:39 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:34.544 [2024-06-10 16:06:39.817303] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:27:34.544 [2024-06-10 16:06:39.817361] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2828271 ] 00:27:34.544 I/O size of 3145728 is greater than zero copy threshold (65536). 00:27:34.544 Zero copy mechanism will not be used. 
00:27:34.544 [2024-06-10 16:06:39.918383] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:27:34.544 [2024-06-10 16:06:40.013717] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:27:34.802 [2024-06-10 16:06:40.077744] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:34.802 [2024-06-10 16:06:40.077782] bdev_raid.c:1416:raid_bdev_get_ctx_size: *DEBUG*: raid_bdev_get_ctx_size 00:27:35.370 16:06:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:27:35.370 16:06:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@863 -- # return 0 00:27:35.370 16:06:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:35.370 16:06:40 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev1_malloc 00:27:35.630 BaseBdev1_malloc 00:27:35.630 16:06:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:35.889 [2024-06-10 16:06:41.263109] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:35.889 [2024-06-10 16:06:41.263151] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:35.889 [2024-06-10 16:06:41.263174] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1019d60 00:27:35.889 [2024-06-10 16:06:41.263183] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:35.889 [2024-06-10 16:06:41.264710] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:35.889 [2024-06-10 16:06:41.264735] vbdev_passthru.c: 
709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:35.889 BaseBdev1 00:27:35.889 16:06:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@600 -- # for bdev in "${base_bdevs[@]}" 00:27:35.889 16:06:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@601 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b BaseBdev2_malloc 00:27:36.148 BaseBdev2_malloc 00:27:36.148 16:06:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@602 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev2_malloc -p BaseBdev2 00:27:36.407 [2024-06-10 16:06:41.781380] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev2_malloc 00:27:36.407 [2024-06-10 16:06:41.781422] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:36.407 [2024-06-10 16:06:41.781442] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1011580 00:27:36.407 [2024-06-10 16:06:41.781452] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:36.407 [2024-06-10 16:06:41.783160] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:36.407 [2024-06-10 16:06:41.783190] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev2 00:27:36.407 BaseBdev2 00:27:36.407 16:06:41 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@606 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_malloc_create 32 4096 -m 32 -i -b spare_malloc 00:27:36.666 spare_malloc 00:27:36.666 16:06:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@607 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_delay_create -b spare_malloc -d spare_delay 
-r 0 -t 0 -w 100000 -n 100000 00:27:36.925 spare_delay 00:27:36.925 16:06:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@608 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:37.183 [2024-06-10 16:06:42.552393] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:37.183 [2024-06-10 16:06:42.552433] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:37.183 [2024-06-10 16:06:42.552454] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1014630 00:27:37.183 [2024-06-10 16:06:42.552464] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:37.183 [2024-06-10 16:06:42.553867] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:37.183 [2024-06-10 16:06:42.553892] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:37.183 spare 00:27:37.183 16:06:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@611 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_create -s -r raid1 -b 'BaseBdev1 BaseBdev2' -n raid_bdev1 00:27:37.442 [2024-06-10 16:06:42.805097] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:27:37.442 [2024-06-10 16:06:42.806449] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:37.442 [2024-06-10 16:06:42.806614] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0x1014d10 00:27:37.442 [2024-06-10 16:06:42.806627] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:27:37.442 [2024-06-10 16:06:42.806695] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xe7ca40 00:27:37.442 [2024-06-10 16:06:42.806781] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: 
raid bdev generic 0x1014d10 00:27:37.442 [2024-06-10 16:06:42.806790] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0x1014d10 00:27:37.442 [2024-06-10 16:06:42.806849] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:37.442 16:06:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@612 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:37.442 16:06:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:37.442 16:06:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:37.442 16:06:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:37.442 16:06:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:37.442 16:06:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:37.442 16:06:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:37.442 16:06:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:37.442 16:06:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:37.442 16:06:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:37.442 16:06:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:37.442 16:06:42 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:37.701 16:06:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # 
raid_bdev_info='{ 00:27:37.701 "name": "raid_bdev1", 00:27:37.701 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:37.701 "strip_size_kb": 0, 00:27:37.701 "state": "online", 00:27:37.701 "raid_level": "raid1", 00:27:37.701 "superblock": true, 00:27:37.701 "num_base_bdevs": 2, 00:27:37.701 "num_base_bdevs_discovered": 2, 00:27:37.701 "num_base_bdevs_operational": 2, 00:27:37.701 "base_bdevs_list": [ 00:27:37.701 { 00:27:37.701 "name": "BaseBdev1", 00:27:37.701 "uuid": "e0505182-1a6a-5d90-ab56-9b72dcab8704", 00:27:37.701 "is_configured": true, 00:27:37.701 "data_offset": 256, 00:27:37.701 "data_size": 7936 00:27:37.701 }, 00:27:37.701 { 00:27:37.701 "name": "BaseBdev2", 00:27:37.701 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:37.701 "is_configured": true, 00:27:37.701 "data_offset": 256, 00:27:37.701 "data_size": 7936 00:27:37.701 } 00:27:37.701 ] 00:27:37.701 }' 00:27:37.701 16:06:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:37.701 16:06:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:38.269 16:06:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_get_bdevs -b raid_bdev1 00:27:38.269 16:06:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # jq -r '.[].num_blocks' 00:27:38.527 [2024-06-10 16:06:43.932330] bdev_raid.c:1107:raid_bdev_dump_info_json: *DEBUG*: raid_bdev_dump_config_json 00:27:38.527 16:06:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@615 -- # raid_bdev_size=7936 00:27:38.527 16:06:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:38.527 16:06:43 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- 
# jq -r '.[].base_bdevs_list[0].data_offset' 00:27:38.785 16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@618 -- # data_offset=256 00:27:38.785 16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@620 -- # '[' false = true ']' 00:27:38.785 16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@623 -- # '[' false = true ']' 00:27:38.785 16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@639 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev BaseBdev1 00:27:39.044 [2024-06-10 16:06:44.441450] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: BaseBdev1 00:27:39.044 16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@642 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:39.044 16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:39.044 16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:39.044 16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:39.044 16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:39.044 16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:39.044 16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:39.044 16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:39.044 16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:39.044 16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:39.044 
16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:39.044 16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:39.304 16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:39.304 "name": "raid_bdev1", 00:27:39.304 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:39.304 "strip_size_kb": 0, 00:27:39.304 "state": "online", 00:27:39.304 "raid_level": "raid1", 00:27:39.304 "superblock": true, 00:27:39.304 "num_base_bdevs": 2, 00:27:39.304 "num_base_bdevs_discovered": 1, 00:27:39.304 "num_base_bdevs_operational": 1, 00:27:39.304 "base_bdevs_list": [ 00:27:39.304 { 00:27:39.304 "name": null, 00:27:39.304 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:39.304 "is_configured": false, 00:27:39.304 "data_offset": 256, 00:27:39.304 "data_size": 7936 00:27:39.304 }, 00:27:39.304 { 00:27:39.304 "name": "BaseBdev2", 00:27:39.304 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:39.304 "is_configured": true, 00:27:39.304 "data_offset": 256, 00:27:39.304 "data_size": 7936 00:27:39.304 } 00:27:39.304 ] 00:27:39.304 }' 00:27:39.304 16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:39.304 16:06:44 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:39.872 16:06:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@645 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:40.130 [2024-06-10 16:06:45.564474] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:40.130 [2024-06-10 16:06:45.568188] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: 
raid_bdev_create_cb, 0x10166d0 00:27:40.130 [2024-06-10 16:06:45.570009] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:40.130 16:06:45 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@646 -- # sleep 1 00:27:41.508 16:06:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@649 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:41.508 16:06:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:41.509 16:06:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:41.509 16:06:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:41.509 16:06:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:41.509 16:06:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:41.509 16:06:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:41.509 16:06:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:41.509 "name": "raid_bdev1", 00:27:41.509 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:41.509 "strip_size_kb": 0, 00:27:41.509 "state": "online", 00:27:41.509 "raid_level": "raid1", 00:27:41.509 "superblock": true, 00:27:41.509 "num_base_bdevs": 2, 00:27:41.509 "num_base_bdevs_discovered": 2, 00:27:41.509 "num_base_bdevs_operational": 2, 00:27:41.509 "process": { 00:27:41.509 "type": "rebuild", 00:27:41.509 "target": "spare", 00:27:41.509 "progress": { 00:27:41.509 "blocks": 3072, 00:27:41.509 "percent": 38 00:27:41.509 } 00:27:41.509 }, 00:27:41.509 "base_bdevs_list": [ 00:27:41.509 { 
00:27:41.509 "name": "spare", 00:27:41.509 "uuid": "09c6dcb6-e3b6-52de-b6ef-c78d53ea0abb", 00:27:41.509 "is_configured": true, 00:27:41.509 "data_offset": 256, 00:27:41.509 "data_size": 7936 00:27:41.509 }, 00:27:41.509 { 00:27:41.509 "name": "BaseBdev2", 00:27:41.509 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:41.509 "is_configured": true, 00:27:41.509 "data_offset": 256, 00:27:41.509 "data_size": 7936 00:27:41.509 } 00:27:41.509 ] 00:27:41.509 }' 00:27:41.509 16:06:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:41.509 16:06:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:41.509 16:06:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:41.509 16:06:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:41.509 16:06:46 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:41.766 [2024-06-10 16:06:47.191131] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:42.023 [2024-06-10 16:06:47.282983] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:42.023 [2024-06-10 16:06:47.283030] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:42.023 [2024-06-10 16:06:47.283044] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:42.023 [2024-06-10 16:06:47.283050] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:42.023 16:06:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@655 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:42.023 16:06:47 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:42.023 16:06:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:42.023 16:06:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:42.024 16:06:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:42.024 16:06:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:42.024 16:06:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:42.024 16:06:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:42.024 16:06:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:42.024 16:06:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:42.024 16:06:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.024 16:06:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:42.281 16:06:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:42.281 "name": "raid_bdev1", 00:27:42.281 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:42.281 "strip_size_kb": 0, 00:27:42.281 "state": "online", 00:27:42.281 "raid_level": "raid1", 00:27:42.281 "superblock": true, 00:27:42.281 "num_base_bdevs": 2, 00:27:42.281 "num_base_bdevs_discovered": 1, 00:27:42.281 "num_base_bdevs_operational": 1, 00:27:42.281 "base_bdevs_list": [ 00:27:42.281 { 00:27:42.281 "name": null, 00:27:42.281 
"uuid": "00000000-0000-0000-0000-000000000000", 00:27:42.281 "is_configured": false, 00:27:42.281 "data_offset": 256, 00:27:42.281 "data_size": 7936 00:27:42.281 }, 00:27:42.281 { 00:27:42.281 "name": "BaseBdev2", 00:27:42.281 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:42.281 "is_configured": true, 00:27:42.281 "data_offset": 256, 00:27:42.281 "data_size": 7936 00:27:42.281 } 00:27:42.281 ] 00:27:42.281 }' 00:27:42.281 16:06:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:42.281 16:06:47 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:42.908 16:06:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@658 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:42.908 16:06:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:42.908 16:06:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:42.908 16:06:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:42.908 16:06:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:42.908 16:06:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:42.908 16:06:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:43.167 16:06:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:43.167 "name": "raid_bdev1", 00:27:43.167 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:43.167 "strip_size_kb": 0, 00:27:43.167 "state": "online", 00:27:43.167 "raid_level": "raid1", 00:27:43.167 "superblock": true, 00:27:43.167 
"num_base_bdevs": 2, 00:27:43.167 "num_base_bdevs_discovered": 1, 00:27:43.167 "num_base_bdevs_operational": 1, 00:27:43.167 "base_bdevs_list": [ 00:27:43.167 { 00:27:43.167 "name": null, 00:27:43.167 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:43.167 "is_configured": false, 00:27:43.167 "data_offset": 256, 00:27:43.167 "data_size": 7936 00:27:43.167 }, 00:27:43.167 { 00:27:43.167 "name": "BaseBdev2", 00:27:43.167 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:43.167 "is_configured": true, 00:27:43.167 "data_offset": 256, 00:27:43.167 "data_size": 7936 00:27:43.167 } 00:27:43.167 ] 00:27:43.167 }' 00:27:43.167 16:06:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:43.167 16:06:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:43.167 16:06:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:43.167 16:06:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:43.167 16:06:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@661 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:43.426 [2024-06-10 16:06:48.774615] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:43.426 [2024-06-10 16:06:48.778099] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x10150f0 00:27:43.426 [2024-06-10 16:06:48.779614] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:43.426 16:06:48 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@662 -- # sleep 1 00:27:44.362 16:06:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@663 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 
00:27:44.362 16:06:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:44.362 16:06:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:44.362 16:06:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:44.362 16:06:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:44.362 16:06:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:44.362 16:06:49 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:44.621 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:44.621 "name": "raid_bdev1", 00:27:44.621 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:44.621 "strip_size_kb": 0, 00:27:44.621 "state": "online", 00:27:44.621 "raid_level": "raid1", 00:27:44.621 "superblock": true, 00:27:44.621 "num_base_bdevs": 2, 00:27:44.621 "num_base_bdevs_discovered": 2, 00:27:44.621 "num_base_bdevs_operational": 2, 00:27:44.621 "process": { 00:27:44.621 "type": "rebuild", 00:27:44.621 "target": "spare", 00:27:44.621 "progress": { 00:27:44.621 "blocks": 3072, 00:27:44.621 "percent": 38 00:27:44.621 } 00:27:44.621 }, 00:27:44.621 "base_bdevs_list": [ 00:27:44.621 { 00:27:44.621 "name": "spare", 00:27:44.621 "uuid": "09c6dcb6-e3b6-52de-b6ef-c78d53ea0abb", 00:27:44.621 "is_configured": true, 00:27:44.621 "data_offset": 256, 00:27:44.621 "data_size": 7936 00:27:44.621 }, 00:27:44.621 { 00:27:44.621 "name": "BaseBdev2", 00:27:44.621 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:44.621 "is_configured": true, 00:27:44.621 "data_offset": 256, 00:27:44.621 "data_size": 7936 00:27:44.621 
} 00:27:44.621 ] 00:27:44.621 }' 00:27:44.621 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:44.621 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:44.621 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:44.880 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:44.880 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' true = true ']' 00:27:44.880 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@665 -- # '[' = false ']' 00:27:44.880 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev_raid.sh: line 665: [: =: unary operator expected 00:27:44.880 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@690 -- # local num_base_bdevs_operational=2 00:27:44.880 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' raid1 = raid1 ']' 00:27:44.880 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@692 -- # '[' 2 -gt 2 ']' 00:27:44.880 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@705 -- # local timeout=1132 00:27:44.880 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:44.880 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:44.880 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:44.880 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:44.880 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@184 -- # local target=spare 00:27:44.880 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:44.880 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:44.880 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:45.139 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:45.139 "name": "raid_bdev1", 00:27:45.139 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:45.139 "strip_size_kb": 0, 00:27:45.139 "state": "online", 00:27:45.139 "raid_level": "raid1", 00:27:45.139 "superblock": true, 00:27:45.139 "num_base_bdevs": 2, 00:27:45.139 "num_base_bdevs_discovered": 2, 00:27:45.139 "num_base_bdevs_operational": 2, 00:27:45.139 "process": { 00:27:45.139 "type": "rebuild", 00:27:45.139 "target": "spare", 00:27:45.139 "progress": { 00:27:45.139 "blocks": 3840, 00:27:45.139 "percent": 48 00:27:45.139 } 00:27:45.139 }, 00:27:45.139 "base_bdevs_list": [ 00:27:45.139 { 00:27:45.139 "name": "spare", 00:27:45.139 "uuid": "09c6dcb6-e3b6-52de-b6ef-c78d53ea0abb", 00:27:45.139 "is_configured": true, 00:27:45.139 "data_offset": 256, 00:27:45.139 "data_size": 7936 00:27:45.139 }, 00:27:45.139 { 00:27:45.139 "name": "BaseBdev2", 00:27:45.139 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:45.139 "is_configured": true, 00:27:45.139 "data_offset": 256, 00:27:45.139 "data_size": 7936 00:27:45.139 } 00:27:45.139 ] 00:27:45.139 }' 00:27:45.139 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:45.139 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:45.139 16:06:50 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:45.139 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:45.139 16:06:50 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:46.077 16:06:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:46.077 16:06:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:46.077 16:06:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:46.077 16:06:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:46.077 16:06:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:46.077 16:06:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:46.077 16:06:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:46.077 16:06:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:46.336 16:06:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:46.336 "name": "raid_bdev1", 00:27:46.336 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:46.336 "strip_size_kb": 0, 00:27:46.336 "state": "online", 00:27:46.336 "raid_level": "raid1", 00:27:46.336 "superblock": true, 00:27:46.336 "num_base_bdevs": 2, 00:27:46.336 "num_base_bdevs_discovered": 2, 00:27:46.336 "num_base_bdevs_operational": 2, 00:27:46.336 "process": { 00:27:46.336 "type": "rebuild", 00:27:46.336 
"target": "spare", 00:27:46.336 "progress": { 00:27:46.336 "blocks": 7424, 00:27:46.336 "percent": 93 00:27:46.336 } 00:27:46.336 }, 00:27:46.336 "base_bdevs_list": [ 00:27:46.336 { 00:27:46.336 "name": "spare", 00:27:46.336 "uuid": "09c6dcb6-e3b6-52de-b6ef-c78d53ea0abb", 00:27:46.336 "is_configured": true, 00:27:46.336 "data_offset": 256, 00:27:46.336 "data_size": 7936 00:27:46.336 }, 00:27:46.336 { 00:27:46.336 "name": "BaseBdev2", 00:27:46.336 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:46.336 "is_configured": true, 00:27:46.336 "data_offset": 256, 00:27:46.336 "data_size": 7936 00:27:46.336 } 00:27:46.336 ] 00:27:46.336 }' 00:27:46.336 16:06:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:46.336 16:06:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:46.336 16:06:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:46.595 16:06:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:46.595 16:06:51 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@710 -- # sleep 1 00:27:46.595 [2024-06-10 16:06:51.903304] bdev_raid.c:2789:raid_bdev_process_thread_run: *DEBUG*: process completed on raid_bdev1 00:27:46.595 [2024-06-10 16:06:51.903367] bdev_raid.c:2504:raid_bdev_process_finish_done: *NOTICE*: Finished rebuild on raid bdev raid_bdev1 00:27:46.595 [2024-06-10 16:06:51.903448] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:47.530 16:06:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@706 -- # (( SECONDS < timeout )) 00:27:47.530 16:06:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@707 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:47.530 16:06:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:47.530 16:06:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:47.530 16:06:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:47.530 16:06:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:47.530 16:06:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:47.530 16:06:52 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:47.789 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:47.789 "name": "raid_bdev1", 00:27:47.789 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:47.789 "strip_size_kb": 0, 00:27:47.789 "state": "online", 00:27:47.789 "raid_level": "raid1", 00:27:47.789 "superblock": true, 00:27:47.789 "num_base_bdevs": 2, 00:27:47.789 "num_base_bdevs_discovered": 2, 00:27:47.789 "num_base_bdevs_operational": 2, 00:27:47.789 "base_bdevs_list": [ 00:27:47.789 { 00:27:47.789 "name": "spare", 00:27:47.789 "uuid": "09c6dcb6-e3b6-52de-b6ef-c78d53ea0abb", 00:27:47.789 "is_configured": true, 00:27:47.789 "data_offset": 256, 00:27:47.789 "data_size": 7936 00:27:47.789 }, 00:27:47.789 { 00:27:47.789 "name": "BaseBdev2", 00:27:47.789 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:47.789 "is_configured": true, 00:27:47.789 "data_offset": 256, 00:27:47.789 "data_size": 7936 00:27:47.789 } 00:27:47.789 ] 00:27:47.789 }' 00:27:47.789 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:47.789 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == 
\r\e\b\u\i\l\d ]] 00:27:47.789 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:47.789 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \s\p\a\r\e ]] 00:27:47.789 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@708 -- # break 00:27:47.789 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@714 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:47.789 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:47.789 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:47.789 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:47.789 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:47.789 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:47.789 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:48.048 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:48.048 "name": "raid_bdev1", 00:27:48.048 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:48.048 "strip_size_kb": 0, 00:27:48.048 "state": "online", 00:27:48.048 "raid_level": "raid1", 00:27:48.048 "superblock": true, 00:27:48.048 "num_base_bdevs": 2, 00:27:48.048 "num_base_bdevs_discovered": 2, 00:27:48.048 "num_base_bdevs_operational": 2, 00:27:48.048 "base_bdevs_list": [ 00:27:48.048 { 00:27:48.048 "name": "spare", 00:27:48.048 "uuid": "09c6dcb6-e3b6-52de-b6ef-c78d53ea0abb", 00:27:48.048 
"is_configured": true, 00:27:48.048 "data_offset": 256, 00:27:48.048 "data_size": 7936 00:27:48.048 }, 00:27:48.048 { 00:27:48.048 "name": "BaseBdev2", 00:27:48.048 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:48.048 "is_configured": true, 00:27:48.048 "data_offset": 256, 00:27:48.048 "data_size": 7936 00:27:48.048 } 00:27:48.048 ] 00:27:48.048 }' 00:27:48.048 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:48.048 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:48.048 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:48.048 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:48.306 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@715 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:48.306 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:48.306 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:48.306 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:48.306 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:48.306 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:48.306 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:48.306 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:48.306 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local 
num_base_bdevs_discovered 00:27:48.306 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:48.306 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:48.306 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:48.306 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:48.306 "name": "raid_bdev1", 00:27:48.306 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:48.306 "strip_size_kb": 0, 00:27:48.306 "state": "online", 00:27:48.306 "raid_level": "raid1", 00:27:48.306 "superblock": true, 00:27:48.306 "num_base_bdevs": 2, 00:27:48.306 "num_base_bdevs_discovered": 2, 00:27:48.306 "num_base_bdevs_operational": 2, 00:27:48.306 "base_bdevs_list": [ 00:27:48.306 { 00:27:48.306 "name": "spare", 00:27:48.306 "uuid": "09c6dcb6-e3b6-52de-b6ef-c78d53ea0abb", 00:27:48.306 "is_configured": true, 00:27:48.306 "data_offset": 256, 00:27:48.306 "data_size": 7936 00:27:48.306 }, 00:27:48.306 { 00:27:48.306 "name": "BaseBdev2", 00:27:48.306 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:48.306 "is_configured": true, 00:27:48.306 "data_offset": 256, 00:27:48.306 "data_size": 7936 00:27:48.306 } 00:27:48.306 ] 00:27:48.306 }' 00:27:48.306 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:48.306 16:06:53 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:49.243 16:06:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@718 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_delete raid_bdev1 00:27:49.243 [2024-06-10 16:06:54.671376] bdev_raid.c:2356:raid_bdev_delete: *DEBUG*: 
delete raid bdev: raid_bdev1 00:27:49.243 [2024-06-10 16:06:54.671402] bdev_raid.c:1844:raid_bdev_deconfigure: *DEBUG*: raid bdev state changing from online to offline 00:27:49.243 [2024-06-10 16:06:54.671457] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:27:49.243 [2024-06-10 16:06:54.671511] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:27:49.243 [2024-06-10 16:06:54.671520] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0x1014d10 name raid_bdev1, state offline 00:27:49.243 16:06:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:49.243 16:06:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # jq length 00:27:49.502 16:06:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@719 -- # [[ 0 == 0 ]] 00:27:49.502 16:06:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@721 -- # '[' false = true ']' 00:27:49.502 16:06:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@742 -- # '[' true = true ']' 00:27:49.502 16:06:54 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@744 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:49.761 16:06:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@745 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:50.019 [2024-06-10 16:06:55.441382] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:50.019 [2024-06-10 16:06:55.441421] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:50.019 [2024-06-10 16:06:55.441441] vbdev_passthru.c: 
680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x1014860 00:27:50.019 [2024-06-10 16:06:55.441452] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:50.019 [2024-06-10 16:06:55.442992] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:50.019 [2024-06-10 16:06:55.443018] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:50.019 [2024-06-10 16:06:55.443071] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:50.019 [2024-06-10 16:06:55.443094] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:50.019 [2024-06-10 16:06:55.443180] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev2 is claimed 00:27:50.019 spare 00:27:50.019 16:06:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@747 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 2 00:27:50.019 16:06:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:50.019 16:06:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:50.019 16:06:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:50.019 16:06:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:50.019 16:06:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=2 00:27:50.019 16:06:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:50.019 16:06:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:50.019 16:06:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:50.019 16:06:55 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:50.019 16:06:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:50.019 16:06:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:50.278 [2024-06-10 16:06:55.543492] bdev_raid.c:1694:raid_bdev_configure_cont: *DEBUG*: io device register 0xe7d2a0 00:27:50.278 [2024-06-10 16:06:55.543506] bdev_raid.c:1695:raid_bdev_configure_cont: *DEBUG*: blockcnt 7936, blocklen 4128 00:27:50.278 [2024-06-10 16:06:55.543571] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1018560 00:27:50.278 [2024-06-10 16:06:55.543659] bdev_raid.c:1724:raid_bdev_configure_cont: *DEBUG*: raid bdev generic 0xe7d2a0 00:27:50.278 [2024-06-10 16:06:55.543668] bdev_raid.c:1725:raid_bdev_configure_cont: *DEBUG*: raid bdev is created with name raid_bdev1, raid_bdev 0xe7d2a0 00:27:50.278 [2024-06-10 16:06:55.543730] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:50.278 16:06:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:50.278 "name": "raid_bdev1", 00:27:50.278 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:50.278 "strip_size_kb": 0, 00:27:50.278 "state": "online", 00:27:50.278 "raid_level": "raid1", 00:27:50.278 "superblock": true, 00:27:50.278 "num_base_bdevs": 2, 00:27:50.278 "num_base_bdevs_discovered": 2, 00:27:50.278 "num_base_bdevs_operational": 2, 00:27:50.278 "base_bdevs_list": [ 00:27:50.278 { 00:27:50.278 "name": "spare", 00:27:50.278 "uuid": "09c6dcb6-e3b6-52de-b6ef-c78d53ea0abb", 00:27:50.278 "is_configured": true, 00:27:50.278 "data_offset": 256, 00:27:50.278 "data_size": 7936 00:27:50.278 }, 00:27:50.278 { 00:27:50.278 "name": "BaseBdev2", 00:27:50.278 "uuid": 
"1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:50.278 "is_configured": true, 00:27:50.278 "data_offset": 256, 00:27:50.278 "data_size": 7936 00:27:50.278 } 00:27:50.278 ] 00:27:50.278 }' 00:27:50.278 16:06:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:50.278 16:06:55 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:50.845 16:06:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@748 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:50.845 16:06:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:50.845 16:06:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:50.845 16:06:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:50.845 16:06:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:50.845 16:06:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:50.845 16:06:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:51.103 16:06:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:51.103 "name": "raid_bdev1", 00:27:51.103 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:51.103 "strip_size_kb": 0, 00:27:51.103 "state": "online", 00:27:51.103 "raid_level": "raid1", 00:27:51.103 "superblock": true, 00:27:51.103 "num_base_bdevs": 2, 00:27:51.103 "num_base_bdevs_discovered": 2, 00:27:51.103 "num_base_bdevs_operational": 2, 00:27:51.103 "base_bdevs_list": [ 00:27:51.103 { 00:27:51.103 "name": "spare", 00:27:51.103 "uuid": 
"09c6dcb6-e3b6-52de-b6ef-c78d53ea0abb", 00:27:51.103 "is_configured": true, 00:27:51.103 "data_offset": 256, 00:27:51.103 "data_size": 7936 00:27:51.103 }, 00:27:51.103 { 00:27:51.103 "name": "BaseBdev2", 00:27:51.103 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:51.103 "is_configured": true, 00:27:51.103 "data_offset": 256, 00:27:51.103 "data_size": 7936 00:27:51.103 } 00:27:51.103 ] 00:27:51.104 }' 00:27:51.104 16:06:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:51.362 16:06:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:51.362 16:06:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:51.362 16:06:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:51.362 16:06:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:51.362 16:06:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # jq -r '.[].base_bdevs_list[0].name' 00:27:51.621 16:06:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@749 -- # [[ spare == \s\p\a\r\e ]] 00:27:51.621 16:06:56 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@752 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_remove_base_bdev spare 00:27:51.879 [2024-06-10 16:06:57.178157] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:51.879 16:06:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@753 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:51.879 16:06:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 
00:27:51.879 16:06:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:51.879 16:06:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:51.879 16:06:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:51.879 16:06:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:51.879 16:06:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:51.879 16:06:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:51.879 16:06:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:51.879 16:06:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:51.879 16:06:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:51.879 16:06:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:52.138 16:06:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:52.138 "name": "raid_bdev1", 00:27:52.138 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:52.138 "strip_size_kb": 0, 00:27:52.138 "state": "online", 00:27:52.138 "raid_level": "raid1", 00:27:52.138 "superblock": true, 00:27:52.138 "num_base_bdevs": 2, 00:27:52.138 "num_base_bdevs_discovered": 1, 00:27:52.138 "num_base_bdevs_operational": 1, 00:27:52.138 "base_bdevs_list": [ 00:27:52.138 { 00:27:52.138 "name": null, 00:27:52.138 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:52.138 "is_configured": false, 00:27:52.138 "data_offset": 
256, 00:27:52.138 "data_size": 7936 00:27:52.138 }, 00:27:52.138 { 00:27:52.138 "name": "BaseBdev2", 00:27:52.138 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:52.138 "is_configured": true, 00:27:52.138 "data_offset": 256, 00:27:52.138 "data_size": 7936 00:27:52.138 } 00:27:52.138 ] 00:27:52.138 }' 00:27:52.138 16:06:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:52.138 16:06:57 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:52.705 16:06:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@754 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 spare 00:27:52.964 [2024-06-10 16:06:58.305185] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:52.964 [2024-06-10 16:06:58.305326] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:52.964 [2024-06-10 16:06:58.305340] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 
00:27:52.964 [2024-06-10 16:06:58.305363] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:52.964 [2024-06-10 16:06:58.308726] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0x1017970 00:27:52.964 [2024-06-10 16:06:58.310202] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:52.964 16:06:58 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@755 -- # sleep 1 00:27:53.898 16:06:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@756 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:53.898 16:06:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:53.898 16:06:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:53.898 16:06:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:53.898 16:06:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:53.898 16:06:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:53.898 16:06:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:54.156 16:06:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:54.156 "name": "raid_bdev1", 00:27:54.156 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:54.156 "strip_size_kb": 0, 00:27:54.156 "state": "online", 00:27:54.156 "raid_level": "raid1", 00:27:54.156 "superblock": true, 00:27:54.156 "num_base_bdevs": 2, 00:27:54.156 "num_base_bdevs_discovered": 2, 00:27:54.156 "num_base_bdevs_operational": 2, 00:27:54.156 "process": { 00:27:54.156 "type": 
"rebuild", 00:27:54.156 "target": "spare", 00:27:54.156 "progress": { 00:27:54.156 "blocks": 3072, 00:27:54.156 "percent": 38 00:27:54.156 } 00:27:54.156 }, 00:27:54.156 "base_bdevs_list": [ 00:27:54.156 { 00:27:54.156 "name": "spare", 00:27:54.156 "uuid": "09c6dcb6-e3b6-52de-b6ef-c78d53ea0abb", 00:27:54.156 "is_configured": true, 00:27:54.156 "data_offset": 256, 00:27:54.156 "data_size": 7936 00:27:54.156 }, 00:27:54.156 { 00:27:54.156 "name": "BaseBdev2", 00:27:54.156 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:54.156 "is_configured": true, 00:27:54.156 "data_offset": 256, 00:27:54.156 "data_size": 7936 00:27:54.156 } 00:27:54.156 ] 00:27:54.156 }' 00:27:54.156 16:06:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:54.156 16:06:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:54.156 16:06:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:54.415 16:06:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:54.415 16:06:59 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@759 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:54.674 [2024-06-10 16:06:59.931490] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:54.674 [2024-06-10 16:07:00.023270] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:54.674 [2024-06-10 16:07:00.023314] bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:54.674 [2024-06-10 16:07:00.023329] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:54.674 [2024-06-10 16:07:00.023337] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to 
remove target bdev: No such device 00:27:54.674 16:07:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@760 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:54.674 16:07:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:54.674 16:07:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:54.674 16:07:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:54.674 16:07:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:54.674 16:07:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:54.674 16:07:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:54.674 16:07:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:54.674 16:07:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:54.674 16:07:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:54.674 16:07:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:54.674 16:07:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:54.933 16:07:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:54.933 "name": "raid_bdev1", 00:27:54.933 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:54.933 "strip_size_kb": 0, 00:27:54.933 "state": "online", 00:27:54.933 "raid_level": "raid1", 00:27:54.933 "superblock": true, 00:27:54.933 
"num_base_bdevs": 2, 00:27:54.933 "num_base_bdevs_discovered": 1, 00:27:54.933 "num_base_bdevs_operational": 1, 00:27:54.933 "base_bdevs_list": [ 00:27:54.933 { 00:27:54.933 "name": null, 00:27:54.933 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:54.933 "is_configured": false, 00:27:54.933 "data_offset": 256, 00:27:54.933 "data_size": 7936 00:27:54.933 }, 00:27:54.933 { 00:27:54.933 "name": "BaseBdev2", 00:27:54.933 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:54.933 "is_configured": true, 00:27:54.933 "data_offset": 256, 00:27:54.933 "data_size": 7936 00:27:54.933 } 00:27:54.933 ] 00:27:54.933 }' 00:27:54.933 16:07:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:54.933 16:07:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:55.500 16:07:00 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@761 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_create -b spare_delay -p spare 00:27:55.759 [2024-06-10 16:07:01.150059] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on spare_delay 00:27:55.759 [2024-06-10 16:07:01.150104] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:55.759 [2024-06-10 16:07:01.150124] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10172d0 00:27:55.759 [2024-06-10 16:07:01.150133] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:55.759 [2024-06-10 16:07:01.150317] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:55.759 [2024-06-10 16:07:01.150331] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: spare 00:27:55.759 [2024-06-10 16:07:01.150384] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev spare 00:27:55.759 [2024-06-10 16:07:01.150394] 
bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev spare (4) smaller than existing raid bdev raid_bdev1 (5) 00:27:55.759 [2024-06-10 16:07:01.150402] bdev_raid.c:3620:raid_bdev_examine_sb: *NOTICE*: Re-adding bdev spare to raid bdev raid_bdev1. 00:27:55.759 [2024-06-10 16:07:01.150418] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev spare is claimed 00:27:55.759 [2024-06-10 16:07:01.153774] bdev_raid.c: 251:raid_bdev_create_cb: *DEBUG*: raid_bdev_create_cb, 0xffec50 00:27:55.759 [2024-06-10 16:07:01.155195] bdev_raid.c:2824:raid_bdev_process_thread_init: *NOTICE*: Started rebuild on raid bdev raid_bdev1 00:27:55.759 spare 00:27:55.759 16:07:01 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@762 -- # sleep 1 00:27:56.695 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@763 -- # verify_raid_bdev_process raid_bdev1 rebuild spare 00:27:56.695 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:56.695 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=rebuild 00:27:56.695 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=spare 00:27:56.695 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:56.695 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:56.695 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:56.953 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:56.953 "name": "raid_bdev1", 00:27:56.953 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:56.953 
"strip_size_kb": 0, 00:27:56.953 "state": "online", 00:27:56.953 "raid_level": "raid1", 00:27:56.953 "superblock": true, 00:27:56.953 "num_base_bdevs": 2, 00:27:56.953 "num_base_bdevs_discovered": 2, 00:27:56.953 "num_base_bdevs_operational": 2, 00:27:56.953 "process": { 00:27:56.953 "type": "rebuild", 00:27:56.953 "target": "spare", 00:27:56.953 "progress": { 00:27:56.953 "blocks": 3072, 00:27:56.953 "percent": 38 00:27:56.953 } 00:27:56.953 }, 00:27:56.953 "base_bdevs_list": [ 00:27:56.953 { 00:27:56.953 "name": "spare", 00:27:56.953 "uuid": "09c6dcb6-e3b6-52de-b6ef-c78d53ea0abb", 00:27:56.953 "is_configured": true, 00:27:56.953 "data_offset": 256, 00:27:56.953 "data_size": 7936 00:27:56.953 }, 00:27:56.953 { 00:27:56.953 "name": "BaseBdev2", 00:27:56.953 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:56.953 "is_configured": true, 00:27:56.953 "data_offset": 256, 00:27:56.953 "data_size": 7936 00:27:56.953 } 00:27:56.953 ] 00:27:56.953 }' 00:27:56.953 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:57.210 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ rebuild == \r\e\b\u\i\l\d ]] 00:27:57.210 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:57.210 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ spare == \s\p\a\r\e ]] 00:27:57.211 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@766 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete spare 00:27:57.469 [2024-06-10 16:07:02.776344] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:57.469 [2024-06-10 16:07:02.868121] bdev_raid.c:2513:raid_bdev_process_finish_done: *WARNING*: Finished rebuild on raid bdev raid_bdev1: No such device 00:27:57.469 [2024-06-10 16:07:02.868165] 
bdev_raid.c: 331:raid_bdev_destroy_cb: *DEBUG*: raid_bdev_destroy_cb 00:27:57.469 [2024-06-10 16:07:02.868178] bdev_raid.c:2120:_raid_bdev_remove_base_bdev: *DEBUG*: spare 00:27:57.469 [2024-06-10 16:07:02.868185] bdev_raid.c:2451:raid_bdev_process_finish_target_removed: *ERROR*: Failed to remove target bdev: No such device 00:27:57.469 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@767 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:27:57.469 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:27:57.469 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:27:57.469 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:27:57.469 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:27:57.469 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:27:57.469 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:27:57.469 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:27:57.469 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:27:57.469 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:27:57.469 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:57.469 16:07:02 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:57.762 16:07:03 bdev_raid.raid_rebuild_test_sb_md_interleaved 
-- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:27:57.762 "name": "raid_bdev1", 00:27:57.762 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:57.762 "strip_size_kb": 0, 00:27:57.762 "state": "online", 00:27:57.762 "raid_level": "raid1", 00:27:57.762 "superblock": true, 00:27:57.762 "num_base_bdevs": 2, 00:27:57.762 "num_base_bdevs_discovered": 1, 00:27:57.762 "num_base_bdevs_operational": 1, 00:27:57.762 "base_bdevs_list": [ 00:27:57.762 { 00:27:57.762 "name": null, 00:27:57.762 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:57.762 "is_configured": false, 00:27:57.762 "data_offset": 256, 00:27:57.762 "data_size": 7936 00:27:57.762 }, 00:27:57.762 { 00:27:57.762 "name": "BaseBdev2", 00:27:57.762 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:57.762 "is_configured": true, 00:27:57.762 "data_offset": 256, 00:27:57.762 "data_size": 7936 00:27:57.762 } 00:27:57.762 ] 00:27:57.762 }' 00:27:57.762 16:07:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:27:57.762 16:07:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:27:58.329 16:07:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@768 -- # verify_raid_bdev_process raid_bdev1 none none 00:27:58.329 16:07:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:27:58.329 16:07:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:27:58.329 16:07:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:27:58.330 16:07:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:27:58.330 16:07:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:27:58.330 
16:07:03 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:27:58.588 16:07:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:27:58.588 "name": "raid_bdev1", 00:27:58.588 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:27:58.588 "strip_size_kb": 0, 00:27:58.588 "state": "online", 00:27:58.588 "raid_level": "raid1", 00:27:58.588 "superblock": true, 00:27:58.588 "num_base_bdevs": 2, 00:27:58.588 "num_base_bdevs_discovered": 1, 00:27:58.588 "num_base_bdevs_operational": 1, 00:27:58.588 "base_bdevs_list": [ 00:27:58.588 { 00:27:58.588 "name": null, 00:27:58.588 "uuid": "00000000-0000-0000-0000-000000000000", 00:27:58.588 "is_configured": false, 00:27:58.588 "data_offset": 256, 00:27:58.588 "data_size": 7936 00:27:58.588 }, 00:27:58.588 { 00:27:58.588 "name": "BaseBdev2", 00:27:58.588 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:27:58.588 "is_configured": true, 00:27:58.588 "data_offset": 256, 00:27:58.588 "data_size": 7936 00:27:58.588 } 00:27:58.588 ] 00:27:58.588 }' 00:27:58.588 16:07:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:27:58.588 16:07:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:27:58.588 16:07:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:27:58.846 16:07:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:27:58.846 16:07:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@771 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_passthru_delete BaseBdev1 00:27:59.105 16:07:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@772 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s 
/var/tmp/spdk-raid.sock bdev_passthru_create -b BaseBdev1_malloc -p BaseBdev1 00:27:59.364 [2024-06-10 16:07:04.620411] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on BaseBdev1_malloc 00:27:59.364 [2024-06-10 16:07:04.620451] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:27:59.364 [2024-06-10 16:07:04.620471] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x10037b0 00:27:59.364 [2024-06-10 16:07:04.620481] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:27:59.364 [2024-06-10 16:07:04.620635] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:27:59.364 [2024-06-10 16:07:04.620648] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: BaseBdev1 00:27:59.364 [2024-06-10 16:07:04.620691] bdev_raid.c:3752:raid_bdev_examine_cont: *DEBUG*: raid superblock found on bdev BaseBdev1 00:27:59.364 [2024-06-10 16:07:04.620700] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:27:59.364 [2024-06-10 16:07:04.620708] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:27:59.364 BaseBdev1 00:27:59.364 16:07:04 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@773 -- # sleep 1 00:28:00.301 16:07:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@774 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:00.301 16:07:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:00.301 16:07:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:00.301 16:07:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:00.301 16:07:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:00.301 16:07:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:00.301 16:07:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:00.301 16:07:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:00.301 16:07:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:00.301 16:07:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:00.301 16:07:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:00.301 16:07:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:00.560 16:07:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 00:28:00.560 "name": "raid_bdev1", 00:28:00.560 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:28:00.560 "strip_size_kb": 0, 00:28:00.560 "state": "online", 00:28:00.560 "raid_level": "raid1", 00:28:00.561 "superblock": true, 00:28:00.561 "num_base_bdevs": 2, 00:28:00.561 "num_base_bdevs_discovered": 1, 00:28:00.561 "num_base_bdevs_operational": 1, 00:28:00.561 "base_bdevs_list": [ 00:28:00.561 { 00:28:00.561 "name": null, 00:28:00.561 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:00.561 "is_configured": false, 00:28:00.561 "data_offset": 256, 00:28:00.561 "data_size": 7936 00:28:00.561 }, 00:28:00.561 { 00:28:00.561 "name": "BaseBdev2", 00:28:00.561 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:28:00.561 "is_configured": true, 00:28:00.561 "data_offset": 256, 00:28:00.561 "data_size": 7936 00:28:00.561 } 00:28:00.561 ] 00:28:00.561 }' 
00:28:00.561 16:07:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:00.561 16:07:05 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:01.129 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@775 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:01.129 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:01.129 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:01.129 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:01.129 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:01.129 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:01.129 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:01.388 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:01.388 "name": "raid_bdev1", 00:28:01.388 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:28:01.388 "strip_size_kb": 0, 00:28:01.388 "state": "online", 00:28:01.388 "raid_level": "raid1", 00:28:01.388 "superblock": true, 00:28:01.388 "num_base_bdevs": 2, 00:28:01.388 "num_base_bdevs_discovered": 1, 00:28:01.388 "num_base_bdevs_operational": 1, 00:28:01.388 "base_bdevs_list": [ 00:28:01.388 { 00:28:01.388 "name": null, 00:28:01.388 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:01.388 "is_configured": false, 00:28:01.388 "data_offset": 256, 00:28:01.388 "data_size": 7936 00:28:01.388 }, 00:28:01.388 { 00:28:01.388 "name": "BaseBdev2", 00:28:01.388 
"uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:28:01.388 "is_configured": true, 00:28:01.388 "data_offset": 256, 00:28:01.388 "data_size": 7936 00:28:01.388 } 00:28:01.388 ] 00:28:01.388 }' 00:28:01.388 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:01.388 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:01.388 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:01.388 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:01.388 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@776 -- # NOT /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:01.388 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@649 -- # local es=0 00:28:01.388 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@651 -- # valid_exec_arg /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:01.388 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@637 -- # local arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:01.388 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:28:01.388 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # type -t /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:01.388 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:28:01.389 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved 
-- common/autotest_common.sh@643 -- # type -P /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:01.389 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@641 -- # case "$(type -t "$arg")" in 00:28:01.389 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@643 -- # arg=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:01.389 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@643 -- # [[ -x /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py ]] 00:28:01.389 16:07:06 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@652 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_add_base_bdev raid_bdev1 BaseBdev1 00:28:01.648 [2024-06-10 16:07:07.123159] bdev_raid.c:3198:raid_bdev_configure_base_bdev: *DEBUG*: bdev BaseBdev1 is claimed 00:28:01.648 [2024-06-10 16:07:07.123280] bdev_raid.c:3562:raid_bdev_examine_sb: *DEBUG*: raid superblock seq_number on bdev BaseBdev1 (1) smaller than existing raid bdev raid_bdev1 (5) 00:28:01.648 [2024-06-10 16:07:07.123294] bdev_raid.c:3581:raid_bdev_examine_sb: *DEBUG*: raid superblock does not contain this bdev's uuid 00:28:01.648 request: 00:28:01.648 { 00:28:01.648 "raid_bdev": "raid_bdev1", 00:28:01.648 "base_bdev": "BaseBdev1", 00:28:01.648 "method": "bdev_raid_add_base_bdev", 00:28:01.648 "req_id": 1 00:28:01.648 } 00:28:01.648 Got JSON-RPC error response 00:28:01.648 response: 00:28:01.648 { 00:28:01.648 "code": -22, 00:28:01.648 "message": "Failed to add base bdev to RAID bdev: Invalid argument" 00:28:01.648 } 00:28:01.648 16:07:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@652 -- # es=1 00:28:01.648 16:07:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@660 -- # (( es > 128 )) 00:28:01.648 16:07:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- 
common/autotest_common.sh@671 -- # [[ -n '' ]] 00:28:01.648 16:07:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@676 -- # (( !es == 0 )) 00:28:01.648 16:07:07 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@777 -- # sleep 1 00:28:03.026 16:07:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@778 -- # verify_raid_bdev_state raid_bdev1 online raid1 0 1 00:28:03.026 16:07:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@116 -- # local raid_bdev_name=raid_bdev1 00:28:03.026 16:07:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@117 -- # local expected_state=online 00:28:03.026 16:07:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@118 -- # local raid_level=raid1 00:28:03.026 16:07:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@119 -- # local strip_size=0 00:28:03.026 16:07:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@120 -- # local num_base_bdevs_operational=1 00:28:03.026 16:07:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@121 -- # local raid_bdev_info 00:28:03.026 16:07:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@122 -- # local num_base_bdevs 00:28:03.026 16:07:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@123 -- # local num_base_bdevs_discovered 00:28:03.026 16:07:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@124 -- # local tmp 00:28:03.026 16:07:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:03.026 16:07:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:03.026 16:07:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@126 -- # raid_bdev_info='{ 
00:28:03.026 "name": "raid_bdev1", 00:28:03.026 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:28:03.026 "strip_size_kb": 0, 00:28:03.026 "state": "online", 00:28:03.026 "raid_level": "raid1", 00:28:03.026 "superblock": true, 00:28:03.026 "num_base_bdevs": 2, 00:28:03.026 "num_base_bdevs_discovered": 1, 00:28:03.026 "num_base_bdevs_operational": 1, 00:28:03.026 "base_bdevs_list": [ 00:28:03.026 { 00:28:03.026 "name": null, 00:28:03.026 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:03.026 "is_configured": false, 00:28:03.026 "data_offset": 256, 00:28:03.026 "data_size": 7936 00:28:03.026 }, 00:28:03.026 { 00:28:03.026 "name": "BaseBdev2", 00:28:03.026 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:28:03.026 "is_configured": true, 00:28:03.026 "data_offset": 256, 00:28:03.026 "data_size": 7936 00:28:03.026 } 00:28:03.026 ] 00:28:03.026 }' 00:28:03.026 16:07:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@128 -- # xtrace_disable 00:28:03.026 16:07:08 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:03.594 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@779 -- # verify_raid_bdev_process raid_bdev1 none none 00:28:03.594 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@182 -- # local raid_bdev_name=raid_bdev1 00:28:03.594 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@183 -- # local process_type=none 00:28:03.594 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@184 -- # local target=none 00:28:03.594 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@185 -- # local raid_bdev_info 00:28:03.594 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-raid.sock bdev_raid_get_bdevs all 00:28:03.594 16:07:09 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # jq -r '.[] | select(.name == "raid_bdev1")' 00:28:03.853 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@187 -- # raid_bdev_info='{ 00:28:03.853 "name": "raid_bdev1", 00:28:03.853 "uuid": "fdfecd1b-fa9f-4787-bbc9-8ce31d3b5814", 00:28:03.853 "strip_size_kb": 0, 00:28:03.853 "state": "online", 00:28:03.853 "raid_level": "raid1", 00:28:03.853 "superblock": true, 00:28:03.853 "num_base_bdevs": 2, 00:28:03.853 "num_base_bdevs_discovered": 1, 00:28:03.853 "num_base_bdevs_operational": 1, 00:28:03.853 "base_bdevs_list": [ 00:28:03.853 { 00:28:03.853 "name": null, 00:28:03.853 "uuid": "00000000-0000-0000-0000-000000000000", 00:28:03.853 "is_configured": false, 00:28:03.853 "data_offset": 256, 00:28:03.853 "data_size": 7936 00:28:03.853 }, 00:28:03.853 { 00:28:03.853 "name": "BaseBdev2", 00:28:03.853 "uuid": "1384e27d-add8-5e55-b4b1-8503165172f5", 00:28:03.853 "is_configured": true, 00:28:03.853 "data_offset": 256, 00:28:03.853 "data_size": 7936 00:28:03.853 } 00:28:03.853 ] 00:28:03.853 }' 00:28:03.853 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # jq -r '.process.type // "none"' 00:28:03.853 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@189 -- # [[ none == \n\o\n\e ]] 00:28:03.853 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # jq -r '.process.target // "none"' 00:28:04.112 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@190 -- # [[ none == \n\o\n\e ]] 00:28:04.112 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@782 -- # killprocess 2828271 00:28:04.112 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@949 -- # '[' -z 2828271 ']' 00:28:04.112 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@953 -- # kill -0 2828271 00:28:04.112 16:07:09 
bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # uname 00:28:04.112 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:04.112 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2828271 00:28:04.112 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:04.112 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:04.112 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2828271' 00:28:04.112 killing process with pid 2828271 00:28:04.112 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@968 -- # kill 2828271 00:28:04.112 Received shutdown signal, test time was about 60.000000 seconds 00:28:04.112 00:28:04.112 Latency(us) 00:28:04.112 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:04.112 =================================================================================================================== 00:28:04.112 Total : 0.00 0.00 0.00 0.00 0.00 18446744073709551616.00 0.00 00:28:04.113 [2024-06-10 16:07:09.436068] bdev_raid.c:1358:raid_bdev_fini_start: *DEBUG*: raid_bdev_fini_start 00:28:04.113 [2024-06-10 16:07:09.436156] bdev_raid.c: 474:_raid_bdev_destruct: *DEBUG*: raid_bdev_destruct 00:28:04.113 [2024-06-10 16:07:09.436199] bdev_raid.c: 451:raid_bdev_io_device_unregister_cb: *DEBUG*: raid bdev base bdevs is 0, going to free all in destruct 00:28:04.113 [2024-06-10 16:07:09.436209] bdev_raid.c: 366:raid_bdev_cleanup: *DEBUG*: raid_bdev_cleanup, 0xe7d2a0 name raid_bdev1, state offline 00:28:04.113 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@973 -- # wait 2828271 00:28:04.113 [2024-06-10 
16:07:09.462089] bdev_raid.c:1375:raid_bdev_exit: *DEBUG*: raid_bdev_exit 00:28:04.372 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- bdev/bdev_raid.sh@784 -- # return 0 00:28:04.372 00:28:04.372 real 0m29.908s 00:28:04.372 user 0m48.696s 00:28:04.372 sys 0m3.145s 00:28:04.372 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:04.372 16:07:09 bdev_raid.raid_rebuild_test_sb_md_interleaved -- common/autotest_common.sh@10 -- # set +x 00:28:04.372 ************************************ 00:28:04.372 END TEST raid_rebuild_test_sb_md_interleaved 00:28:04.372 ************************************ 00:28:04.372 16:07:09 bdev_raid -- bdev/bdev_raid.sh@916 -- # trap - EXIT 00:28:04.372 16:07:09 bdev_raid -- bdev/bdev_raid.sh@917 -- # cleanup 00:28:04.372 16:07:09 bdev_raid -- bdev/bdev_raid.sh@58 -- # '[' -n 2828271 ']' 00:28:04.372 16:07:09 bdev_raid -- bdev/bdev_raid.sh@58 -- # ps -p 2828271 00:28:04.372 16:07:09 bdev_raid -- bdev/bdev_raid.sh@62 -- # rm -rf /raidtest 00:28:04.372 00:28:04.372 real 18m40.785s 00:28:04.372 user 32m26.947s 00:28:04.372 sys 2m40.150s 00:28:04.372 16:07:09 bdev_raid -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:04.372 16:07:09 bdev_raid -- common/autotest_common.sh@10 -- # set +x 00:28:04.372 ************************************ 00:28:04.372 END TEST bdev_raid 00:28:04.372 ************************************ 00:28:04.372 16:07:09 -- spdk/autotest.sh@191 -- # run_test bdevperf_config /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:28:04.372 16:07:09 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:28:04.372 16:07:09 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:04.372 16:07:09 -- common/autotest_common.sh@10 -- # set +x 00:28:04.372 ************************************ 00:28:04.372 START TEST bdevperf_config 00:28:04.372 ************************************ 00:28:04.372 16:07:09 bdevperf_config -- 
common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test_config.sh 00:28:04.632 * Looking for test storage... 00:28:04.632 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/test_config.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/common.sh 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@5 -- # bdevperf=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/test_config.sh@12 -- # jsonconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/test_config.sh@13 -- # testconf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/test_config.sh@15 -- # trap 'cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/test_config.sh@17 -- # create_job global read Malloc0 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=read 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:04.632 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/test_config.sh@18 -- # create_job job0 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:28:04.632 16:07:09 bdevperf_config 
-- bdevperf/common.sh@9 -- # local rw= 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:04.632 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/test_config.sh@19 -- # create_job job1 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:04.632 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/test_config.sh@20 -- # create_job job2 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:04.632 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/test_config.sh@21 -- # create_job job3 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:28:04.632 
16:07:09 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:04.632 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:04.632 16:07:09 bdevperf_config -- bdevperf/test_config.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:07.168 16:07:12 bdevperf_config -- bdevperf/test_config.sh@22 -- # bdevperf_output='[2024-06-10 16:07:09.980943] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:28:07.168 [2024-06-10 16:07:09.981021] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2833618 ] 00:28:07.168 Using job config with 4 jobs 00:28:07.168 [2024-06-10 16:07:10.095366] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:07.168 [2024-06-10 16:07:10.208041] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:07.168 cpumask for '\''job0'\'' is too big 00:28:07.168 cpumask for '\''job1'\'' is too big 00:28:07.168 cpumask for '\''job2'\'' is too big 00:28:07.168 cpumask for '\''job3'\'' is too big 00:28:07.168 Running I/O for 2 seconds... 
00:28:07.168 00:28:07.168 Latency(us) 00:28:07.168 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:07.168 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:07.168 Malloc0 : 2.02 22214.15 21.69 0.00 0.00 11507.87 2012.89 17601.10 00:28:07.168 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:07.168 Malloc0 : 2.02 22192.16 21.67 0.00 0.00 11490.68 1989.49 15603.81 00:28:07.168 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:07.168 Malloc0 : 2.02 22170.13 21.65 0.00 0.00 11474.36 1997.29 13606.52 00:28:07.168 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:07.168 Malloc0 : 2.03 22242.75 21.72 0.00 0.00 11408.73 1006.45 11796.48 00:28:07.168 =================================================================================================================== 00:28:07.168 Total : 88819.18 86.74 0.00 0.00 11470.32 1006.45 17601.10' 00:28:07.168 16:07:12 bdevperf_config -- bdevperf/test_config.sh@23 -- # get_num_jobs '[2024-06-10 16:07:09.980943] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:28:07.168 [2024-06-10 16:07:09.981021] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2833618 ] 00:28:07.168 Using job config with 4 jobs 00:28:07.168 [2024-06-10 16:07:10.095366] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:07.168 [2024-06-10 16:07:10.208041] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:07.168 cpumask for '\''job0'\'' is too big 00:28:07.168 cpumask for '\''job1'\'' is too big 00:28:07.168 cpumask for '\''job2'\'' is too big 00:28:07.168 cpumask for '\''job3'\'' is too big 00:28:07.168 Running I/O for 2 seconds... 
00:28:07.168 00:28:07.168 Latency(us) 00:28:07.168 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:07.168 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:07.169 Malloc0 : 2.02 22214.15 21.69 0.00 0.00 11507.87 2012.89 17601.10 00:28:07.169 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:07.169 Malloc0 : 2.02 22192.16 21.67 0.00 0.00 11490.68 1989.49 15603.81 00:28:07.169 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:07.169 Malloc0 : 2.02 22170.13 21.65 0.00 0.00 11474.36 1997.29 13606.52 00:28:07.169 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:07.169 Malloc0 : 2.03 22242.75 21.72 0.00 0.00 11408.73 1006.45 11796.48 00:28:07.169 =================================================================================================================== 00:28:07.169 Total : 88819.18 86.74 0.00 0.00 11470.32 1006.45 17601.10' 00:28:07.169 16:07:12 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:28:07.169 16:07:12 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-06-10 16:07:09.980943] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:28:07.169 [2024-06-10 16:07:09.981021] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2833618 ] 00:28:07.169 Using job config with 4 jobs 00:28:07.169 [2024-06-10 16:07:10.095366] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:07.169 [2024-06-10 16:07:10.208041] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:07.169 cpumask for '\''job0'\'' is too big 00:28:07.169 cpumask for '\''job1'\'' is too big 00:28:07.169 cpumask for '\''job2'\'' is too big 00:28:07.169 cpumask for '\''job3'\'' is too big 00:28:07.169 Running I/O for 2 seconds... 00:28:07.169 00:28:07.169 Latency(us) 00:28:07.169 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:07.169 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:07.169 Malloc0 : 2.02 22214.15 21.69 0.00 0.00 11507.87 2012.89 17601.10 00:28:07.169 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:07.169 Malloc0 : 2.02 22192.16 21.67 0.00 0.00 11490.68 1989.49 15603.81 00:28:07.169 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:07.169 Malloc0 : 2.02 22170.13 21.65 0.00 0.00 11474.36 1997.29 13606.52 00:28:07.169 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:07.169 Malloc0 : 2.03 22242.75 21.72 0.00 0.00 11408.73 1006.45 11796.48 00:28:07.169 =================================================================================================================== 00:28:07.169 Total : 88819.18 86.74 0.00 0.00 11470.32 1006.45 17601.10' 00:28:07.169 16:07:12 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:28:07.169 16:07:12 bdevperf_config -- bdevperf/test_config.sh@23 -- # [[ 4 == \4 ]] 00:28:07.169 16:07:12 bdevperf_config -- bdevperf/test_config.sh@25 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -C -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:07.169 [2024-06-10 16:07:12.668109] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:28:07.169 [2024-06-10 16:07:12.668169] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2834001 ] 00:28:07.428 [2024-06-10 16:07:12.784200] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:07.428 [2024-06-10 16:07:12.894332] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:07.687 cpumask for 'job0' is too big 00:28:07.687 cpumask for 'job1' is too big 00:28:07.687 cpumask for 'job2' is too big 00:28:07.687 cpumask for 'job3' is too big 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/test_config.sh@25 -- # bdevperf_output='Using job config with 4 jobs 00:28:10.222 Running I/O for 2 seconds... 
00:28:10.222 00:28:10.222 Latency(us) 00:28:10.222 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:10.222 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:10.222 Malloc0 : 2.02 22301.35 21.78 0.00 0.00 11460.37 1997.29 17601.10 00:28:10.222 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:10.222 Malloc0 : 2.02 22279.40 21.76 0.00 0.00 11442.63 1989.49 15541.39 00:28:10.222 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:10.222 Malloc0 : 2.02 22257.41 21.74 0.00 0.00 11426.89 1997.29 13544.11 00:28:10.222 Job: Malloc0 (Core Mask 0xff, workload: read, depth: 256, IO size: 1024) 00:28:10.222 Malloc0 : 2.03 22235.38 21.71 0.00 0.00 11410.95 1989.49 11796.48 00:28:10.222 =================================================================================================================== 00:28:10.222 Total : 89073.53 86.99 0.00 0.00 11435.21 1989.49 17601.10' 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/test_config.sh@27 -- # cleanup 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/test_config.sh@29 -- # create_job job0 write Malloc0 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:10.222 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/test_config.sh@30 -- # create_job 
job1 write Malloc0 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:10.222 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/test_config.sh@31 -- # create_job job2 write Malloc0 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=write 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:10.222 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:10.222 16:07:15 bdevperf_config -- bdevperf/test_config.sh@32 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:12.758 16:07:17 bdevperf_config -- bdevperf/test_config.sh@32 -- # bdevperf_output='[2024-06-10 16:07:15.372602] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:28:12.758 [2024-06-10 16:07:15.372662] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2834420 ] 00:28:12.758 Using job config with 3 jobs 00:28:12.758 [2024-06-10 16:07:15.490502] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:12.758 [2024-06-10 16:07:15.606837] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:12.758 cpumask for '\''job0'\'' is too big 00:28:12.758 cpumask for '\''job1'\'' is too big 00:28:12.758 cpumask for '\''job2'\'' is too big 00:28:12.758 Running I/O for 2 seconds... 00:28:12.758 00:28:12.758 Latency(us) 00:28:12.758 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:12.758 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:12.758 Malloc0 : 2.01 30127.10 29.42 0.00 0.00 8490.46 1958.28 12483.05 00:28:12.758 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:12.758 Malloc0 : 2.02 30096.94 29.39 0.00 0.00 8478.90 1950.48 10548.18 00:28:12.758 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:12.758 Malloc0 : 2.02 30067.02 29.36 0.00 0.00 8466.42 1927.07 8800.55 00:28:12.758 =================================================================================================================== 00:28:12.758 Total : 90291.06 88.17 0.00 0.00 8478.59 1927.07 12483.05' 00:28:12.758 16:07:17 bdevperf_config -- bdevperf/test_config.sh@33 -- # get_num_jobs '[2024-06-10 16:07:15.372602] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:28:12.759 [2024-06-10 16:07:15.372662] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2834420 ] 00:28:12.759 Using job config with 3 jobs 00:28:12.759 [2024-06-10 16:07:15.490502] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:12.759 [2024-06-10 16:07:15.606837] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:12.759 cpumask for '\''job0'\'' is too big 00:28:12.759 cpumask for '\''job1'\'' is too big 00:28:12.759 cpumask for '\''job2'\'' is too big 00:28:12.759 Running I/O for 2 seconds... 00:28:12.759 00:28:12.759 Latency(us) 00:28:12.759 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:12.759 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:12.759 Malloc0 : 2.01 30127.10 29.42 0.00 0.00 8490.46 1958.28 12483.05 00:28:12.759 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:12.759 Malloc0 : 2.02 30096.94 29.39 0.00 0.00 8478.90 1950.48 10548.18 00:28:12.759 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:12.759 Malloc0 : 2.02 30067.02 29.36 0.00 0.00 8466.42 1927.07 8800.55 00:28:12.759 =================================================================================================================== 00:28:12.759 Total : 90291.06 88.17 0.00 0.00 8478.59 1927.07 12483.05' 00:28:12.759 16:07:17 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-06-10 16:07:15.372602] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:28:12.759 [2024-06-10 16:07:15.372662] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2834420 ] 00:28:12.759 Using job config with 3 jobs 00:28:12.759 [2024-06-10 16:07:15.490502] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:12.759 [2024-06-10 16:07:15.606837] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:12.759 cpumask for '\''job0'\'' is too big 00:28:12.759 cpumask for '\''job1'\'' is too big 00:28:12.759 cpumask for '\''job2'\'' is too big 00:28:12.759 Running I/O for 2 seconds... 00:28:12.759 00:28:12.759 Latency(us) 00:28:12.759 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:12.759 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:12.759 Malloc0 : 2.01 30127.10 29.42 0.00 0.00 8490.46 1958.28 12483.05 00:28:12.759 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:12.759 Malloc0 : 2.02 30096.94 29.39 0.00 0.00 8478.90 1950.48 10548.18 00:28:12.759 Job: Malloc0 (Core Mask 0xff, workload: write, depth: 256, IO size: 1024) 00:28:12.759 Malloc0 : 2.02 30067.02 29.36 0.00 0.00 8466.42 1927.07 8800.55 00:28:12.759 =================================================================================================================== 00:28:12.759 Total : 90291.06 88.17 0.00 0.00 8478.59 1927.07 12483.05' 00:28:12.759 16:07:17 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:28:12.759 16:07:17 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/test_config.sh@33 -- # [[ 3 == \3 ]] 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/test_config.sh@35 -- # cleanup 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/test_config.sh@37 -- # create_job global rw Malloc0:Malloc1 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=global 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@9 -- # local rw=rw 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@10 -- # local filename=Malloc0:Malloc1 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@12 -- # [[ global == \g\l\o\b\a\l ]] 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@13 -- # cat 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@18 -- # job='[global]' 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:12.759 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/test_config.sh@38 -- # create_job job0 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job0 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job0 == \g\l\o\b\a\l ]] 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job0]' 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:12.759 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/test_config.sh@39 -- # create_job job1 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job1 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job1 == \g\l\o\b\a\l ]] 00:28:12.759 16:07:18 
bdevperf_config -- bdevperf/common.sh@18 -- # job='[job1]' 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:12.759 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/test_config.sh@40 -- # create_job job2 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job2 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job2 == \g\l\o\b\a\l ]] 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job2]' 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:12.759 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/test_config.sh@41 -- # create_job job3 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@8 -- # local job_section=job3 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@9 -- # local rw= 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@10 -- # local filename= 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@12 -- # [[ job3 == \g\l\o\b\a\l ]] 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@18 -- # job='[job3]' 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@19 -- # echo 00:28:12.759 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/common.sh@20 -- # cat 00:28:12.759 16:07:18 bdevperf_config -- bdevperf/test_config.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 2 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/conf.json -j /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:15.294 16:07:20 bdevperf_config -- bdevperf/test_config.sh@42 -- # bdevperf_output='[2024-06-10 16:07:18.079527] 
Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:28:15.294 [2024-06-10 16:07:18.079586] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2834875 ] 00:28:15.294 Using job config with 4 jobs 00:28:15.294 [2024-06-10 16:07:18.190378] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:15.294 [2024-06-10 16:07:18.306758] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:15.294 cpumask for '\''job0'\'' is too big 00:28:15.294 cpumask for '\''job1'\'' is too big 00:28:15.294 cpumask for '\''job2'\'' is too big 00:28:15.294 cpumask for '\''job3'\'' is too big 00:28:15.294 Running I/O for 2 seconds... 00:28:15.294 00:28:15.294 Latency(us) 00:28:15.295 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:15.295 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc0 : 2.03 11112.84 10.85 0.00 0.00 23009.06 4088.20 35701.52 00:28:15.295 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc1 : 2.03 11101.21 10.84 0.00 0.00 23008.99 4993.22 35701.52 00:28:15.295 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc0 : 2.05 11120.94 10.86 0.00 0.00 22881.35 4088.20 31457.28 00:28:15.295 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc1 : 2.05 11109.48 10.85 0.00 0.00 22880.57 4993.22 31457.28 00:28:15.295 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc0 : 2.05 11098.28 10.84 0.00 0.00 22815.00 4056.99 27337.87 00:28:15.295 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc1 : 2.06 11086.79 
10.83 0.00 0.00 22814.64 4962.01 27337.87 00:28:15.295 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc0 : 2.06 11075.67 10.82 0.00 0.00 22751.87 4088.20 23468.13 00:28:15.295 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc1 : 2.06 11064.29 10.80 0.00 0.00 22751.30 4962.01 23468.13 00:28:15.295 =================================================================================================================== 00:28:15.295 Total : 88769.50 86.69 0.00 0.00 22863.69 4056.99 35701.52' 00:28:15.295 16:07:20 bdevperf_config -- bdevperf/test_config.sh@43 -- # get_num_jobs '[2024-06-10 16:07:18.079527] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:28:15.295 [2024-06-10 16:07:18.079586] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2834875 ] 00:28:15.295 Using job config with 4 jobs 00:28:15.295 [2024-06-10 16:07:18.190378] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:15.295 [2024-06-10 16:07:18.306758] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:15.295 cpumask for '\''job0'\'' is too big 00:28:15.295 cpumask for '\''job1'\'' is too big 00:28:15.295 cpumask for '\''job2'\'' is too big 00:28:15.295 cpumask for '\''job3'\'' is too big 00:28:15.295 Running I/O for 2 seconds... 
00:28:15.295 00:28:15.295 Latency(us) 00:28:15.295 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:15.295 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc0 : 2.03 11112.84 10.85 0.00 0.00 23009.06 4088.20 35701.52 00:28:15.295 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc1 : 2.03 11101.21 10.84 0.00 0.00 23008.99 4993.22 35701.52 00:28:15.295 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc0 : 2.05 11120.94 10.86 0.00 0.00 22881.35 4088.20 31457.28 00:28:15.295 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc1 : 2.05 11109.48 10.85 0.00 0.00 22880.57 4993.22 31457.28 00:28:15.295 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc0 : 2.05 11098.28 10.84 0.00 0.00 22815.00 4056.99 27337.87 00:28:15.295 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc1 : 2.06 11086.79 10.83 0.00 0.00 22814.64 4962.01 27337.87 00:28:15.295 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc0 : 2.06 11075.67 10.82 0.00 0.00 22751.87 4088.20 23468.13 00:28:15.295 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc1 : 2.06 11064.29 10.80 0.00 0.00 22751.30 4962.01 23468.13 00:28:15.295 =================================================================================================================== 00:28:15.295 Total : 88769.50 86.69 0.00 0.00 22863.69 4056.99 35701.52' 00:28:15.295 16:07:20 bdevperf_config -- bdevperf/common.sh@32 -- # echo '[2024-06-10 16:07:18.079527] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:28:15.295 [2024-06-10 16:07:18.079586] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2834875 ] 00:28:15.295 Using job config with 4 jobs 00:28:15.295 [2024-06-10 16:07:18.190378] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:28:15.295 [2024-06-10 16:07:18.306758] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:15.295 cpumask for '\''job0'\'' is too big 00:28:15.295 cpumask for '\''job1'\'' is too big 00:28:15.295 cpumask for '\''job2'\'' is too big 00:28:15.295 cpumask for '\''job3'\'' is too big 00:28:15.295 Running I/O for 2 seconds... 00:28:15.295 00:28:15.295 Latency(us) 00:28:15.295 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:15.295 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc0 : 2.03 11112.84 10.85 0.00 0.00 23009.06 4088.20 35701.52 00:28:15.295 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc1 : 2.03 11101.21 10.84 0.00 0.00 23008.99 4993.22 35701.52 00:28:15.295 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc0 : 2.05 11120.94 10.86 0.00 0.00 22881.35 4088.20 31457.28 00:28:15.295 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc1 : 2.05 11109.48 10.85 0.00 0.00 22880.57 4993.22 31457.28 00:28:15.295 Job: Malloc0 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc0 : 2.05 11098.28 10.84 0.00 0.00 22815.00 4056.99 27337.87 00:28:15.295 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc1 : 2.06 11086.79 10.83 0.00 0.00 22814.64 4962.01 27337.87 00:28:15.295 Job: Malloc0 (Core Mask 
0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc0 : 2.06 11075.67 10.82 0.00 0.00 22751.87 4088.20 23468.13 00:28:15.295 Job: Malloc1 (Core Mask 0xff, workload: rw, percentage: 70, depth: 256, IO size: 1024) 00:28:15.295 Malloc1 : 2.06 11064.29 10.80 0.00 0.00 22751.30 4962.01 23468.13 00:28:15.295 =================================================================================================================== 00:28:15.295 Total : 88769.50 86.69 0.00 0.00 22863.69 4056.99 35701.52' 00:28:15.295 16:07:20 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE 'Using job config with [0-9]+ jobs' 00:28:15.295 16:07:20 bdevperf_config -- bdevperf/common.sh@32 -- # grep -oE '[0-9]+' 00:28:15.295 16:07:20 bdevperf_config -- bdevperf/test_config.sh@43 -- # [[ 4 == \4 ]] 00:28:15.295 16:07:20 bdevperf_config -- bdevperf/test_config.sh@44 -- # cleanup 00:28:15.295 16:07:20 bdevperf_config -- bdevperf/common.sh@36 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevperf/test.conf 00:28:15.295 16:07:20 bdevperf_config -- bdevperf/test_config.sh@45 -- # trap - SIGINT SIGTERM EXIT 00:28:15.295 00:28:15.295 real 0m10.961s 00:28:15.295 user 0m9.819s 00:28:15.295 sys 0m0.970s 00:28:15.295 16:07:20 bdevperf_config -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:15.295 16:07:20 bdevperf_config -- common/autotest_common.sh@10 -- # set +x 00:28:15.295 ************************************ 00:28:15.295 END TEST bdevperf_config 00:28:15.295 ************************************ 00:28:15.295 16:07:20 -- spdk/autotest.sh@192 -- # uname -s 00:28:15.558 16:07:20 -- spdk/autotest.sh@192 -- # [[ Linux == Linux ]] 00:28:15.558 16:07:20 -- spdk/autotest.sh@193 -- # run_test reactor_set_interrupt /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:28:15.558 16:07:20 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:28:15.558 16:07:20 -- common/autotest_common.sh@1106 -- # 
xtrace_disable 00:28:15.558 16:07:20 -- common/autotest_common.sh@10 -- # set +x 00:28:15.558 ************************************ 00:28:15.558 START TEST reactor_set_interrupt 00:28:15.558 ************************************ 00:28:15.558 16:07:20 reactor_set_interrupt -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:28:15.559 * Looking for test storage... 00:28:15.559 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:15.559 16:07:20 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:28:15.559 16:07:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reactor_set_interrupt.sh 00:28:15.559 16:07:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:15.559 16:07:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:15.559 16:07:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 
00:28:15.559 16:07:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:15.559 16:07:20 reactor_set_interrupt -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:28:15.559 16:07:20 reactor_set_interrupt -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:28:15.559 16:07:20 reactor_set_interrupt -- common/autotest_common.sh@34 -- # set -e 00:28:15.559 16:07:20 reactor_set_interrupt -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:28:15.559 16:07:20 reactor_set_interrupt -- common/autotest_common.sh@36 -- # shopt -s extglob 00:28:15.559 16:07:20 reactor_set_interrupt -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:28:15.559 16:07:20 reactor_set_interrupt -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:28:15.559 16:07:20 reactor_set_interrupt -- common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:28:15.559 16:07:20 reactor_set_interrupt -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:28:15.559 16:07:20 reactor_set_interrupt -- 
common/build_config.sh@8 -- # CONFIG_RBD=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@22 -- # CONFIG_CET=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:28:15.559 16:07:20 
reactor_set_interrupt -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@36 -- # CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:28:15.559 16:07:20 reactor_set_interrupt -- 
common/build_config.sh@48 -- # CONFIG_RDMA=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:28:15.559 16:07:20 
reactor_set_interrupt -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@70 -- # CONFIG_FC=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@75 -- # CONFIG_TESTS=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES= 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:28:15.559 16:07:20 reactor_set_interrupt -- common/build_config.sh@83 -- # CONFIG_URING=n 00:28:15.559 16:07:20 reactor_set_interrupt -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:28:15.559 16:07:20 reactor_set_interrupt -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:28:15.559 16:07:20 reactor_set_interrupt -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 
00:28:15.559 16:07:20 reactor_set_interrupt -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:28:15.559 16:07:20 reactor_set_interrupt -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:15.559 16:07:20 reactor_set_interrupt -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:15.559 16:07:20 reactor_set_interrupt -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:28:15.559 16:07:20 reactor_set_interrupt -- common/applications.sh@12 -- # _examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:15.559 16:07:20 reactor_set_interrupt -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:28:15.560 16:07:20 reactor_set_interrupt -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:28:15.560 16:07:20 reactor_set_interrupt -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:28:15.560 16:07:20 reactor_set_interrupt -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:28:15.560 16:07:20 reactor_set_interrupt -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:28:15.560 16:07:20 reactor_set_interrupt -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:28:15.560 16:07:20 reactor_set_interrupt -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:28:15.560 16:07:20 reactor_set_interrupt -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:28:15.560 #define SPDK_CONFIG_H 00:28:15.560 #define SPDK_CONFIG_APPS 1 00:28:15.560 #define SPDK_CONFIG_ARCH native 00:28:15.560 #undef SPDK_CONFIG_ASAN 00:28:15.560 #undef SPDK_CONFIG_AVAHI 00:28:15.560 #undef SPDK_CONFIG_CET 00:28:15.560 #define SPDK_CONFIG_COVERAGE 1 00:28:15.560 #define SPDK_CONFIG_CROSS_PREFIX 
00:28:15.560 #define SPDK_CONFIG_CRYPTO 1 00:28:15.560 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:28:15.560 #undef SPDK_CONFIG_CUSTOMOCF 00:28:15.560 #undef SPDK_CONFIG_DAOS 00:28:15.560 #define SPDK_CONFIG_DAOS_DIR 00:28:15.560 #define SPDK_CONFIG_DEBUG 1 00:28:15.560 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:28:15.560 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:28:15.560 #define SPDK_CONFIG_DPDK_INC_DIR 00:28:15.560 #define SPDK_CONFIG_DPDK_LIB_DIR 00:28:15.560 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:28:15.560 #undef SPDK_CONFIG_DPDK_UADK 00:28:15.560 #define SPDK_CONFIG_ENV /var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:28:15.560 #define SPDK_CONFIG_EXAMPLES 1 00:28:15.560 #undef SPDK_CONFIG_FC 00:28:15.560 #define SPDK_CONFIG_FC_PATH 00:28:15.560 #define SPDK_CONFIG_FIO_PLUGIN 1 00:28:15.560 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:28:15.560 #undef SPDK_CONFIG_FUSE 00:28:15.560 #undef SPDK_CONFIG_FUZZER 00:28:15.560 #define SPDK_CONFIG_FUZZER_LIB 00:28:15.560 #undef SPDK_CONFIG_GOLANG 00:28:15.560 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:28:15.560 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:28:15.560 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:28:15.560 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:28:15.560 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:28:15.560 #undef SPDK_CONFIG_HAVE_LIBBSD 00:28:15.560 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:28:15.560 #define SPDK_CONFIG_IDXD 1 00:28:15.560 #define SPDK_CONFIG_IDXD_KERNEL 1 00:28:15.560 #define SPDK_CONFIG_IPSEC_MB 1 00:28:15.560 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:28:15.560 #define SPDK_CONFIG_ISAL 1 00:28:15.560 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:28:15.560 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:28:15.560 #define SPDK_CONFIG_LIBDIR 00:28:15.560 #undef SPDK_CONFIG_LTO 00:28:15.560 #define SPDK_CONFIG_MAX_LCORES 00:28:15.560 #define SPDK_CONFIG_NVME_CUSE 1 00:28:15.560 #undef 
SPDK_CONFIG_OCF 00:28:15.560 #define SPDK_CONFIG_OCF_PATH 00:28:15.560 #define SPDK_CONFIG_OPENSSL_PATH 00:28:15.560 #undef SPDK_CONFIG_PGO_CAPTURE 00:28:15.560 #define SPDK_CONFIG_PGO_DIR 00:28:15.560 #undef SPDK_CONFIG_PGO_USE 00:28:15.560 #define SPDK_CONFIG_PREFIX /usr/local 00:28:15.560 #undef SPDK_CONFIG_RAID5F 00:28:15.560 #undef SPDK_CONFIG_RBD 00:28:15.560 #define SPDK_CONFIG_RDMA 1 00:28:15.560 #define SPDK_CONFIG_RDMA_PROV verbs 00:28:15.560 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:28:15.560 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:28:15.560 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:28:15.560 #define SPDK_CONFIG_SHARED 1 00:28:15.560 #undef SPDK_CONFIG_SMA 00:28:15.560 #define SPDK_CONFIG_TESTS 1 00:28:15.560 #undef SPDK_CONFIG_TSAN 00:28:15.560 #define SPDK_CONFIG_UBLK 1 00:28:15.560 #define SPDK_CONFIG_UBSAN 1 00:28:15.560 #undef SPDK_CONFIG_UNIT_TESTS 00:28:15.560 #undef SPDK_CONFIG_URING 00:28:15.560 #define SPDK_CONFIG_URING_PATH 00:28:15.560 #undef SPDK_CONFIG_URING_ZNS 00:28:15.560 #undef SPDK_CONFIG_USDT 00:28:15.560 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:28:15.560 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:28:15.560 #undef SPDK_CONFIG_VFIO_USER 00:28:15.560 #define SPDK_CONFIG_VFIO_USER_DIR 00:28:15.560 #define SPDK_CONFIG_VHOST 1 00:28:15.560 #define SPDK_CONFIG_VIRTIO 1 00:28:15.560 #undef SPDK_CONFIG_VTUNE 00:28:15.560 #define SPDK_CONFIG_VTUNE_DIR 00:28:15.560 #define SPDK_CONFIG_WERROR 1 00:28:15.560 #define SPDK_CONFIG_WPDK_DIR 00:28:15.560 #undef SPDK_CONFIG_XNVME 00:28:15.560 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:28:15.560 16:07:20 reactor_set_interrupt -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:28:15.560 16:07:20 reactor_set_interrupt -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:28:15.560 16:07:20 reactor_set_interrupt -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 
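The `applications.sh@23` trace above tests whether the generated `spdk/include/spdk/config.h` enables debug by glob-matching the file's entire contents against an escaped `#define SPDK_CONFIG_DEBUG` pattern inside `[[ ... ]]`. A minimal standalone sketch of that same bash technique (using a temporary file rather than SPDK's real header, so the names here are illustrative only):

```shell
#!/usr/bin/env bash
# Sketch: check whether a config header defines a flag by glob-matching
# the whole file's contents, as applications.sh does for SPDK_CONFIG_DEBUG.
tmp=$(mktemp)
printf '#ifndef DEMO_CONFIG_H\n#define DEMO_CONFIG_DEBUG 1\n#endif\n' > "$tmp"

# $(<file) reads the file; == *pattern* does unanchored glob matching.
if [[ $(<"$tmp") == *"#define DEMO_CONFIG_DEBUG"* ]]; then
  debug_enabled=1
else
  debug_enabled=0
fi
rm -f "$tmp"
echo "$debug_enabled"
```

In the log the pattern appears with every character backslash-escaped (`*\#\d\e\f\i\n\e\ ...*`) because xtrace re-quotes the word; quoting the literal part, as above, is the equivalent readable form.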
00:28:15.560 16:07:20 reactor_set_interrupt -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:28:15.560 16:07:20 reactor_set_interrupt -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:28:15.560 16:07:20 reactor_set_interrupt -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:15.560 16:07:20 reactor_set_interrupt -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:15.560 16:07:20 reactor_set_interrupt -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:15.560 16:07:20 reactor_set_interrupt -- paths/export.sh@5 -- # export PATH 00:28:15.560 16:07:20 reactor_set_interrupt -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:15.560 16:07:20 reactor_set_interrupt -- common/autotest_common.sh@56 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:28:15.560 16:07:20 reactor_set_interrupt -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:28:15.560 16:07:20 reactor_set_interrupt -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:28:15.560 16:07:20 reactor_set_interrupt -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:28:15.560 16:07:20 reactor_set_interrupt -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:28:15.560 16:07:20 reactor_set_interrupt -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:15.560 16:07:20 reactor_set_interrupt -- pm/common@64 -- # TEST_TAG=N/A 00:28:15.560 16:07:20 reactor_set_interrupt -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:28:15.560 16:07:20 reactor_set_interrupt -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:28:15.560 16:07:20 reactor_set_interrupt -- pm/common@68 -- # uname -s 00:28:15.560 16:07:20 reactor_set_interrupt -- pm/common@68 -- # PM_OS=Linux 00:28:15.560 16:07:20 reactor_set_interrupt -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:28:15.560 16:07:21 reactor_set_interrupt -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:28:15.560 16:07:21 
reactor_set_interrupt -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:28:15.560 16:07:21 reactor_set_interrupt -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:28:15.560 16:07:21 reactor_set_interrupt -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:28:15.560 16:07:21 reactor_set_interrupt -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:28:15.560 16:07:21 reactor_set_interrupt -- pm/common@76 -- # SUDO[0]= 00:28:15.560 16:07:21 reactor_set_interrupt -- pm/common@76 -- # SUDO[1]='sudo -E' 00:28:15.560 16:07:21 reactor_set_interrupt -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:28:15.560 16:07:21 reactor_set_interrupt -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:28:15.560 16:07:21 reactor_set_interrupt -- pm/common@81 -- # [[ Linux == Linux ]] 00:28:15.560 16:07:21 reactor_set_interrupt -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:28:15.560 16:07:21 reactor_set_interrupt -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:28:15.560 16:07:21 reactor_set_interrupt -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:28:15.560 16:07:21 reactor_set_interrupt -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:28:15.560 16:07:21 reactor_set_interrupt -- pm/common@88 -- # [[ ! 
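The `pm/common@70`–`@85` trace above declares a bash associative array mapping each resource monitor to whether it needs sudo, then conditionally appends extra monitors when the host is bare-metal Linux (not QEMU, not Docker). A self-contained sketch of that pattern, with illustrative monitor names standing in for SPDK's:

```shell
#!/usr/bin/env bash
# Sketch of pm/common's monitor bookkeeping: an associative array records
# which monitors require sudo; the active list starts small and grows
# when host checks pass (here: running on Linux).
declare -A MONITOR_NEEDS_SUDO=(
  ["collect-cpu-load"]=0
  ["collect-vmstat"]=0
  ["collect-bmc-pm"]=1   # BMC power readings need elevated privileges
)
MONITORS=(collect-cpu-load collect-vmstat)

# pm/common additionally checks the DMI product name against QEMU and
# tests for /.dockerenv; this sketch keeps only the uname check.
if [[ $(uname -s) == Linux ]]; then
  MONITORS+=(collect-bmc-pm)
fi
echo "${#MONITORS[@]} monitors active"
```

The `SUDO[0]=` / `SUDO[1]='sudo -E'` pair seen in the log then lets callers index into a prefix with the array's 0/1 value, avoiding an `if` at each launch site.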
-d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:28:15.560 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@58 -- # : 0 00:28:15.560 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:28:15.560 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@62 -- # : 0 00:28:15.560 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:28:15.560 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@64 -- # : 0 00:28:15.560 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:28:15.560 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@66 -- # : 1 00:28:15.560 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:28:15.560 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@68 -- # : 0 00:28:15.560 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:28:15.560 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@70 -- # : 00:28:15.560 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@72 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@74 -- # : 1 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@76 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@78 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 
00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@80 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@82 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@84 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@86 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@88 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@90 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@92 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@94 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@96 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@98 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@100 -- # : 
0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@102 -- # : rdma 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@104 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@106 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@108 -- # : 1 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@110 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@112 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@114 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@116 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@118 -- # : 1 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@120 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- 
common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@122 -- # : 1 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@124 -- # : 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@126 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@128 -- # : 1 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@130 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@132 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@134 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@136 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@138 -- # : 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@140 -- # : true 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:28:15.561 16:07:21 
reactor_set_interrupt -- common/autotest_common.sh@142 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@144 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@146 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@148 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@150 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@152 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@154 -- # : 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@156 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@158 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@160 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@162 -- # : 0 00:28:15.561 
16:07:21 reactor_set_interrupt -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@164 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@167 -- # : 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@169 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@171 -- # : 0 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:15.561 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@178 -- # export 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@178 -- # LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:28:15.562 16:07:21 reactor_set_interrupt -- 
common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@193 -- # export ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@200 -- # cat 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@238 -- # 
LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@249 -- # export SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:28:15.562 16:07:21 reactor_set_interrupt -- 
common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@263 -- # export valgrind= 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@263 -- # valgrind= 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@269 -- # uname -s 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@279 -- # MAKE=make 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j96 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@299 -- # 
TEST_MODE= 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@318 -- # [[ -z 2835367 ]] 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@318 -- # kill -0 2835367 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@1679 -- # set_test_storage 2147483648 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@330 -- # local requested_size=2147483648 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@331 -- # local mount target_dir 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.4bfXK7 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.4bfXK7/tests/interrupt /tmp/spdk.4bfXK7 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs 
size use avail _ mount 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@327 -- # df -T 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=900243456 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4384186368 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=89242169344 00:28:15.562 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=95562715136 00:28:15.855 16:07:21 
reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=6320545792 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47777980416 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47781355520 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=3375104 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=19102953472 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=19112546304 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=9592832 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=47780691968 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=47781359616 00:28:15.855 16:07:21 reactor_set_interrupt -- 
common/autotest_common.sh@363 -- # uses["$mount"]=667648 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # avails["$mount"]=9556267008 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@362 -- # sizes["$mount"]=9556271104 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:28:15.855 * Looking for test storage... 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@368 -- # local target_space new_size 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@372 -- # mount=/ 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@374 -- # target_space=89242169344 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:28:15.855 16:07:21 reactor_set_interrupt 
-- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@381 -- # new_size=8535138304 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:15.855 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@389 -- # return 0 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@1681 -- # set -o errtrace 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@1682 -- # shopt -s extdebug 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@1683 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@1685 -- # PS4=' \t $test_domain -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@1686 -- # true 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@1688 -- # xtrace_fd 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:28:15.855 16:07:21 reactor_set_interrupt -- 
common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@27 -- # exec 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@29 -- # exec 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@31 -- # xtrace_restore 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:28:15.855 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@18 -- # set -x 00:28:15.855 16:07:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:28:15.855 16:07:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:15.855 16:07:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:28:15.855 16:07:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:28:15.855 16:07:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:28:15.855 16:07:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:28:15.855 16:07:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:28:15.856 16:07:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@11 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:28:15.856 16:07:21 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@11 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:28:15.856 16:07:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@86 -- # start_intr_tgt 00:28:15.856 16:07:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:15.856 16:07:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:28:15.856 16:07:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2835414 00:28:15.856 16:07:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:15.856 16:07:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:28:15.856 16:07:21 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2835414 /var/tmp/spdk.sock 00:28:15.856 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@830 -- # '[' -z 2835414 ']' 00:28:15.856 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:15.856 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:15.856 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:15.856 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:28:15.856 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:15.856 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:28:15.856 [2024-06-10 16:07:21.126052] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:28:15.856 [2024-06-10 16:07:21.126094] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2835414 ] 00:28:15.856 [2024-06-10 16:07:21.211139] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:15.856 [2024-06-10 16:07:21.306869] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:28:15.856 [2024-06-10 16:07:21.306974] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:28:15.856 [2024-06-10 16:07:21.306983] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:16.115 [2024-06-10 16:07:21.377389] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:28:16.115 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:16.115 16:07:21 reactor_set_interrupt -- common/autotest_common.sh@863 -- # return 0 00:28:16.115 16:07:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@87 -- # setup_bdev_mem 00:28:16.115 16:07:21 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:16.373 Malloc0 00:28:16.373 Malloc1 00:28:16.373 Malloc2 00:28:16.373 16:07:21 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@88 -- # setup_bdev_aio 00:28:16.373 16:07:21 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:28:16.373 16:07:21 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:28:16.374 16:07:21 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:28:16.374 5000+0 records in 00:28:16.374 5000+0 records out 00:28:16.374 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0171433 s, 597 MB/s 00:28:16.374 16:07:21 reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:28:16.632 AIO0 00:28:16.632 16:07:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@90 -- # reactor_set_mode_without_threads 2835414 00:28:16.632 16:07:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@76 -- # reactor_set_intr_mode 2835414 without_thd 00:28:16.632 16:07:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=2835414 00:28:16.632 16:07:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd=without_thd 00:28:16.632 16:07:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 
00:28:16.632 16:07:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:28:16.632 16:07:22 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:28:16.632 16:07:22 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:28:16.632 16:07:22 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:28:16.633 16:07:22 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:16.633 16:07:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:28:16.633 16:07:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:16.891 16:07:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:28:16.891 16:07:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:28:16.891 16:07:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # reactor_get_thread_ids 0x4 00:28:16.891 16:07:22 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:28:16.891 16:07:22 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:28:16.891 16:07:22 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:28:16.891 16:07:22 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:16.891 16:07:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:28:16.891 16:07:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:17.150 16:07:22 reactor_set_interrupt -- interrupt/common.sh@62 -- # 
echo '' 00:28:17.150 16:07:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:28:17.150 16:07:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:28:17.150 spdk_thread ids are 1 on reactor0. 00:28:17.150 16:07:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:17.150 16:07:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2835414 0 00:28:17.150 16:07:22 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2835414 0 idle 00:28:17.150 16:07:22 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2835414 00:28:17.150 16:07:22 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:17.150 16:07:22 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:17.150 16:07:22 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:17.150 16:07:22 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:17.150 16:07:22 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:17.150 16:07:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:17.150 16:07:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:17.150 16:07:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2835414 -w 256 00:28:17.150 16:07:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2835414 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.32 reactor_0' 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2835414 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.32 reactor_0 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:17.409 16:07:22 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2835414 1 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2835414 1 idle 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2835414 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2835414 -w 256 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2835443 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_1' 
00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2835443 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_1 00:28:17.409 16:07:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:17.410 16:07:22 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:17.410 16:07:22 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:17.410 16:07:22 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:17.410 16:07:22 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:17.410 16:07:22 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:17.410 16:07:22 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:17.410 16:07:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:17.410 16:07:22 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2835414 2 00:28:17.410 16:07:22 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2835414 2 idle 00:28:17.410 16:07:22 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2835414 00:28:17.410 16:07:22 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:17.410 16:07:22 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:17.410 16:07:22 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:17.410 16:07:22 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:17.410 16:07:22 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:17.410 16:07:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:17.410 16:07:22 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:17.410 16:07:22 reactor_set_interrupt -- 
interrupt/common.sh@24 -- # top -bHn 1 -p 2835414 -w 256 00:28:17.410 16:07:22 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:17.669 16:07:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2835444 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_2' 00:28:17.669 16:07:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2835444 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_2 00:28:17.669 16:07:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:17.669 16:07:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:17.669 16:07:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:17.669 16:07:23 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:17.669 16:07:23 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:17.669 16:07:23 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:17.669 16:07:23 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:17.669 16:07:23 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:17.669 16:07:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' without_thdx '!=' x ']' 00:28:17.669 16:07:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@35 -- # for i in "${thd0_ids[@]}" 00:28:17.669 16:07:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@36 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x2 00:28:17.928 [2024-06-10 16:07:23.287731] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:28:17.928 16:07:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:28:18.186 [2024-06-10 16:07:23.539513] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:28:18.186 [2024-06-10 16:07:23.539895] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:18.186 16:07:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:28:18.444 [2024-06-10 16:07:23.791428] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 00:28:18.444 [2024-06-10 16:07:23.791551] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:18.444 16:07:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:28:18.444 16:07:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2835414 0 00:28:18.444 16:07:23 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2835414 0 busy 00:28:18.444 16:07:23 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2835414 00:28:18.444 16:07:23 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:18.444 16:07:23 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:28:18.444 16:07:23 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:28:18.444 16:07:23 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:18.444 16:07:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:18.444 16:07:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:18.444 16:07:23 reactor_set_interrupt 
-- interrupt/common.sh@24 -- # top -bHn 1 -p 2835414 -w 256 00:28:18.444 16:07:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2835414 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.76 reactor_0' 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2835414 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.76 reactor_0 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2835414 2 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2835414 2 busy 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2835414 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:18.703 16:07:23 
reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2835414 -w 256 00:28:18.703 16:07:23 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:18.703 16:07:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2835444 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.36 reactor_2' 00:28:18.703 16:07:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2835444 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.36 reactor_2 00:28:18.703 16:07:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:18.703 16:07:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:18.703 16:07:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:28:18.703 16:07:24 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:28:18.703 16:07:24 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:28:18.703 16:07:24 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:28:18.703 16:07:24 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:28:18.703 16:07:24 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:18.703 16:07:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:28:18.962 [2024-06-10 16:07:24.403422] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 
00:28:18.962 [2024-06-10 16:07:24.403519] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:18.962 16:07:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' without_thdx '!=' x ']' 00:28:18.962 16:07:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 2835414 2 00:28:18.962 16:07:24 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2835414 2 idle 00:28:18.962 16:07:24 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2835414 00:28:18.962 16:07:24 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:18.962 16:07:24 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:18.962 16:07:24 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:18.962 16:07:24 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:18.962 16:07:24 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:18.962 16:07:24 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:18.962 16:07:24 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:18.962 16:07:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2835414 -w 256 00:28:18.962 16:07:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:19.221 16:07:24 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2835444 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.60 reactor_2' 00:28:19.221 16:07:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2835444 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.60 reactor_2 00:28:19.221 16:07:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:19.221 16:07:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:19.221 16:07:24 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:19.221 16:07:24 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:19.221 16:07:24 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:19.221 16:07:24 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:19.221 16:07:24 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:19.221 16:07:24 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:19.221 16:07:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:28:19.480 [2024-06-10 16:07:24.831416] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:28:19.480 [2024-06-10 16:07:24.831531] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:19.480 16:07:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' without_thdx '!=' x ']' 00:28:19.480 16:07:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@65 -- # for i in "${thd0_ids[@]}" 00:28:19.480 16:07:24 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@66 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_set_cpumask -i 1 -m 0x1 00:28:19.739 [2024-06-10 16:07:25.087724] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:28:19.739 16:07:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 2835414 0 00:28:19.739 16:07:25 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2835414 0 idle 00:28:19.739 16:07:25 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2835414 00:28:19.739 16:07:25 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:19.739 16:07:25 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:19.739 16:07:25 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:19.739 16:07:25 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:19.739 16:07:25 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:19.739 16:07:25 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:19.739 16:07:25 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:19.739 16:07:25 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2835414 -w 256 00:28:19.739 16:07:25 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:19.998 16:07:25 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2835414 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:01.61 reactor_0' 00:28:19.998 16:07:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:19.998 16:07:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2835414 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:01.61 reactor_0 00:28:19.998 16:07:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:19.998 16:07:25 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:19.998 16:07:25 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:19.998 16:07:25 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:19.998 16:07:25 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = 
\i\d\l\e ]] 00:28:19.998 16:07:25 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:19.998 16:07:25 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:19.998 16:07:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:28:19.998 16:07:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@77 -- # return 0 00:28:19.998 16:07:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@92 -- # trap - SIGINT SIGTERM EXIT 00:28:19.998 16:07:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@93 -- # killprocess 2835414 00:28:19.998 16:07:25 reactor_set_interrupt -- common/autotest_common.sh@949 -- # '[' -z 2835414 ']' 00:28:19.998 16:07:25 reactor_set_interrupt -- common/autotest_common.sh@953 -- # kill -0 2835414 00:28:19.998 16:07:25 reactor_set_interrupt -- common/autotest_common.sh@954 -- # uname 00:28:19.998 16:07:25 reactor_set_interrupt -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:19.998 16:07:25 reactor_set_interrupt -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2835414 00:28:19.998 16:07:25 reactor_set_interrupt -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:19.998 16:07:25 reactor_set_interrupt -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:19.998 16:07:25 reactor_set_interrupt -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2835414' 00:28:19.998 killing process with pid 2835414 00:28:19.998 16:07:25 reactor_set_interrupt -- common/autotest_common.sh@968 -- # kill 2835414 00:28:19.998 16:07:25 reactor_set_interrupt -- common/autotest_common.sh@973 -- # wait 2835414 00:28:20.257 16:07:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@94 -- # cleanup 00:28:20.257 16:07:25 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:28:20.257 16:07:25 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@97 -- # start_intr_tgt 00:28:20.257 16:07:25 reactor_set_interrupt -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:20.257 16:07:25 reactor_set_interrupt -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:28:20.257 16:07:25 reactor_set_interrupt -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:28:20.257 16:07:25 reactor_set_interrupt -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2836269 00:28:20.257 16:07:25 reactor_set_interrupt -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:20.257 16:07:25 reactor_set_interrupt -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2836269 /var/tmp/spdk.sock 00:28:20.257 16:07:25 reactor_set_interrupt -- common/autotest_common.sh@830 -- # '[' -z 2836269 ']' 00:28:20.257 16:07:25 reactor_set_interrupt -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:20.257 16:07:25 reactor_set_interrupt -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:20.257 16:07:25 reactor_set_interrupt -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:20.257 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:20.257 16:07:25 reactor_set_interrupt -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:20.257 16:07:25 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:28:20.257 [2024-06-10 16:07:25.583451] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:28:20.257 [2024-06-10 16:07:25.583508] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2836269 ] 00:28:20.257 [2024-06-10 16:07:25.683756] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:20.516 [2024-06-10 16:07:25.775797] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:28:20.516 [2024-06-10 16:07:25.775815] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:28:20.516 [2024-06-10 16:07:25.775819] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:20.516 [2024-06-10 16:07:25.847421] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 00:28:20.516 16:07:25 reactor_set_interrupt -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:20.516 16:07:25 reactor_set_interrupt -- common/autotest_common.sh@863 -- # return 0 00:28:20.516 16:07:25 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@98 -- # setup_bdev_mem 00:28:20.516 16:07:25 reactor_set_interrupt -- interrupt/common.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:20.775 Malloc0 00:28:20.775 Malloc1 00:28:20.775 Malloc2 00:28:20.775 16:07:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@99 -- # setup_bdev_aio 00:28:20.775 16:07:26 reactor_set_interrupt -- interrupt/common.sh@75 -- # uname -s 00:28:20.775 16:07:26 reactor_set_interrupt -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:28:20.775 16:07:26 reactor_set_interrupt -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:28:20.775 5000+0 records in 00:28:20.775 5000+0 records out 00:28:20.775 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0125168 s, 818 MB/s 00:28:20.775 16:07:26 
reactor_set_interrupt -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:28:21.035 AIO0 00:28:21.035 16:07:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@101 -- # reactor_set_mode_with_threads 2836269 00:28:21.035 16:07:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@81 -- # reactor_set_intr_mode 2836269 00:28:21.035 16:07:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@14 -- # local spdk_pid=2836269 00:28:21.035 16:07:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@15 -- # local without_thd= 00:28:21.035 16:07:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # thd0_ids=($(reactor_get_thread_ids $r0_mask)) 00:28:21.035 16:07:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@17 -- # reactor_get_thread_ids 0x1 00:28:21.035 16:07:26 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x1 00:28:21.035 16:07:26 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:28:21.035 16:07:26 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=1 00:28:21.035 16:07:26 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:21.035 16:07:26 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:28:21.035 16:07:26 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 1 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:21.294 16:07:26 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo 1 00:28:21.294 16:07:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # thd2_ids=($(reactor_get_thread_ids $r2_mask)) 00:28:21.294 16:07:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@18 -- # 
reactor_get_thread_ids 0x4 00:28:21.294 16:07:26 reactor_set_interrupt -- interrupt/common.sh@55 -- # local reactor_cpumask=0x4 00:28:21.294 16:07:26 reactor_set_interrupt -- interrupt/common.sh@56 -- # local grep_str 00:28:21.294 16:07:26 reactor_set_interrupt -- interrupt/common.sh@58 -- # reactor_cpumask=4 00:28:21.294 16:07:26 reactor_set_interrupt -- interrupt/common.sh@59 -- # jq_str='.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:21.294 16:07:26 reactor_set_interrupt -- interrupt/common.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py thread_get_stats 00:28:21.294 16:07:26 reactor_set_interrupt -- interrupt/common.sh@62 -- # jq --arg reactor_cpumask 4 '.threads|.[]|select(.cpumask == $reactor_cpumask)|.id' 00:28:21.553 16:07:26 reactor_set_interrupt -- interrupt/common.sh@62 -- # echo '' 00:28:21.553 16:07:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@21 -- # [[ 1 -eq 0 ]] 00:28:21.553 16:07:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@25 -- # echo 'spdk_thread ids are 1 on reactor0.' 00:28:21.553 spdk_thread ids are 1 on reactor0. 
00:28:21.553 16:07:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:21.554 16:07:26 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2836269 0 00:28:21.554 16:07:26 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2836269 0 idle 00:28:21.554 16:07:26 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2836269 00:28:21.554 16:07:26 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:21.554 16:07:26 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:21.554 16:07:26 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:21.554 16:07:26 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:21.554 16:07:26 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:21.554 16:07:26 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:21.554 16:07:26 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:21.554 16:07:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2836269 -w 256 00:28:21.554 16:07:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2836269 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.33 reactor_0' 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2836269 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.33 reactor_0 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle 
= \b\u\s\y ]] 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2836269 1 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2836269 1 idle 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2836269 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=1 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2836269 -w 256 00:28:21.813 16:07:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_1 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2836305 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_1' 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2836305 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_1 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:22.072 16:07:27 reactor_set_interrupt -- 
interrupt/common.sh@25 -- # awk '{print $9}' 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@29 -- # for i in {0..2} 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@30 -- # reactor_is_idle 2836269 2 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2836269 2 idle 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2836269 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2836269 -w 256 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2836306 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_2' 
00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2836306 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.00 reactor_2 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@33 -- # '[' x '!=' x ']' 00:28:22.072 16:07:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@43 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 -d 00:28:22.332 [2024-06-10 16:07:27.764387] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 0. 00:28:22.332 [2024-06-10 16:07:27.764547] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to poll mode from intr mode. 00:28:22.332 [2024-06-10 16:07:27.764744] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:22.332 16:07:27 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 -d 00:28:22.591 [2024-06-10 16:07:28.024966] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to disable interrupt mode on reactor 2. 
00:28:22.591 [2024-06-10 16:07:28.025130] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:22.591 16:07:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:28:22.591 16:07:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2836269 0 00:28:22.591 16:07:28 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2836269 0 busy 00:28:22.591 16:07:28 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2836269 00:28:22.591 16:07:28 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:22.591 16:07:28 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:28:22.591 16:07:28 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:28:22.591 16:07:28 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:22.591 16:07:28 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:22.591 16:07:28 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:22.591 16:07:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2836269 -w 256 00:28:22.591 16:07:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:22.850 16:07:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2836269 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.78 reactor_0' 00:28:22.850 16:07:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2836269 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.78 reactor_0 00:28:22.850 16:07:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:22.850 16:07:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:22.850 16:07:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:28:22.850 16:07:28 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:28:22.850 16:07:28 reactor_set_interrupt -- 
interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:28:22.850 16:07:28 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:28:22.850 16:07:28 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:28:22.850 16:07:28 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:22.850 16:07:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@46 -- # for i in 0 2 00:28:22.850 16:07:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@47 -- # reactor_is_busy 2836269 2 00:28:22.850 16:07:28 reactor_set_interrupt -- interrupt/common.sh@47 -- # reactor_is_busy_or_idle 2836269 2 busy 00:28:22.850 16:07:28 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2836269 00:28:22.851 16:07:28 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:22.851 16:07:28 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=busy 00:28:22.851 16:07:28 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ busy != \b\u\s\y ]] 00:28:22.851 16:07:28 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:22.851 16:07:28 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:22.851 16:07:28 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:22.851 16:07:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2836269 -w 256 00:28:22.851 16:07:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2836306 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.35 reactor_2' 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2836306 root 20 0 128.2g 36096 23040 R 99.9 0.0 0:00.35 reactor_2 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:23.109 16:07:28 
reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=99.9 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=99 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ busy = \b\u\s\y ]] 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ 99 -lt 70 ]] 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ busy = \i\d\l\e ]] 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 2 00:28:23.109 [2024-06-10 16:07:28.566542] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 2. 00:28:23.109 [2024-06-10 16:07:28.566862] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@52 -- # '[' x '!=' x ']' 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@59 -- # reactor_is_idle 2836269 2 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2836269 2 idle 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2836269 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=2 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 
00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2836269 -w 256 00:28:23.109 16:07:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_2 00:28:23.368 16:07:28 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2836306 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.54 reactor_2' 00:28:23.368 16:07:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2836306 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:00.54 reactor_2 00:28:23.368 16:07:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:23.368 16:07:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:23.368 16:07:28 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:23.368 16:07:28 reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:23.368 16:07:28 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:23.368 16:07:28 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:23.368 16:07:28 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:23.368 16:07:28 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:23.368 16:07:28 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@62 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py --plugin interrupt_plugin reactor_set_interrupt_mode 0 00:28:23.626 [2024-06-10 16:07:29.003670] interrupt_tgt.c: 99:rpc_reactor_set_interrupt_mode: *NOTICE*: RPC Start to enable interrupt mode on reactor 0. 00:28:23.626 [2024-06-10 16:07:29.003785] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from poll mode. 
00:28:23.626 [2024-06-10 16:07:29.003805] interrupt_tgt.c: 36:rpc_reactor_set_interrupt_mode_cb: *NOTICE*: complete reactor switch 00:28:23.626 16:07:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@63 -- # '[' x '!=' x ']' 00:28:23.626 16:07:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@70 -- # reactor_is_idle 2836269 0 00:28:23.626 16:07:29 reactor_set_interrupt -- interrupt/common.sh@51 -- # reactor_is_busy_or_idle 2836269 0 idle 00:28:23.626 16:07:29 reactor_set_interrupt -- interrupt/common.sh@10 -- # local pid=2836269 00:28:23.626 16:07:29 reactor_set_interrupt -- interrupt/common.sh@11 -- # local idx=0 00:28:23.626 16:07:29 reactor_set_interrupt -- interrupt/common.sh@12 -- # local state=idle 00:28:23.626 16:07:29 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \b\u\s\y ]] 00:28:23.626 16:07:29 reactor_set_interrupt -- interrupt/common.sh@14 -- # [[ idle != \i\d\l\e ]] 00:28:23.626 16:07:29 reactor_set_interrupt -- interrupt/common.sh@18 -- # hash top 00:28:23.626 16:07:29 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j = 10 )) 00:28:23.626 16:07:29 reactor_set_interrupt -- interrupt/common.sh@23 -- # (( j != 0 )) 00:28:23.626 16:07:29 reactor_set_interrupt -- interrupt/common.sh@24 -- # top -bHn 1 -p 2836269 -w 256 00:28:23.626 16:07:29 reactor_set_interrupt -- interrupt/common.sh@24 -- # grep reactor_0 00:28:23.884 16:07:29 reactor_set_interrupt -- interrupt/common.sh@24 -- # top_reactor='2836269 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:01.57 reactor_0' 00:28:23.884 16:07:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # echo 2836269 root 20 0 128.2g 36096 23040 S 0.0 0.0 0:01.57 reactor_0 00:28:23.884 16:07:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # sed -e 's/^\s*//g' 00:28:23.884 16:07:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # awk '{print $9}' 00:28:23.884 16:07:29 reactor_set_interrupt -- interrupt/common.sh@25 -- # cpu_rate=0.0 00:28:23.884 16:07:29 
reactor_set_interrupt -- interrupt/common.sh@26 -- # cpu_rate=0 00:28:23.884 16:07:29 reactor_set_interrupt -- interrupt/common.sh@28 -- # [[ idle = \b\u\s\y ]] 00:28:23.884 16:07:29 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ idle = \i\d\l\e ]] 00:28:23.884 16:07:29 reactor_set_interrupt -- interrupt/common.sh@30 -- # [[ 0 -gt 30 ]] 00:28:23.884 16:07:29 reactor_set_interrupt -- interrupt/common.sh@33 -- # return 0 00:28:23.884 16:07:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@72 -- # return 0 00:28:23.884 16:07:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@82 -- # return 0 00:28:23.884 16:07:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@103 -- # trap - SIGINT SIGTERM EXIT 00:28:23.884 16:07:29 reactor_set_interrupt -- interrupt/reactor_set_interrupt.sh@104 -- # killprocess 2836269 00:28:23.884 16:07:29 reactor_set_interrupt -- common/autotest_common.sh@949 -- # '[' -z 2836269 ']' 00:28:23.884 16:07:29 reactor_set_interrupt -- common/autotest_common.sh@953 -- # kill -0 2836269 00:28:23.884 16:07:29 reactor_set_interrupt -- common/autotest_common.sh@954 -- # uname 00:28:23.884 16:07:29 reactor_set_interrupt -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:23.884 16:07:29 reactor_set_interrupt -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2836269 00:28:23.884 16:07:29 reactor_set_interrupt -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:23.884 16:07:29 reactor_set_interrupt -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:23.884 16:07:29 reactor_set_interrupt -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2836269' 00:28:23.884 killing process with pid 2836269 00:28:23.884 16:07:29 reactor_set_interrupt -- common/autotest_common.sh@968 -- # kill 2836269 00:28:23.884 16:07:29 reactor_set_interrupt -- common/autotest_common.sh@973 -- # wait 2836269 00:28:24.143 16:07:29 reactor_set_interrupt -- 
interrupt/reactor_set_interrupt.sh@105 -- # cleanup 00:28:24.143 16:07:29 reactor_set_interrupt -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:28:24.143 00:28:24.143 real 0m8.627s 00:28:24.143 user 0m9.230s 00:28:24.143 sys 0m1.611s 00:28:24.143 16:07:29 reactor_set_interrupt -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:24.143 16:07:29 reactor_set_interrupt -- common/autotest_common.sh@10 -- # set +x 00:28:24.143 ************************************ 00:28:24.143 END TEST reactor_set_interrupt 00:28:24.143 ************************************ 00:28:24.143 16:07:29 -- spdk/autotest.sh@194 -- # run_test reap_unregistered_poller /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:28:24.143 16:07:29 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:28:24.143 16:07:29 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:24.143 16:07:29 -- common/autotest_common.sh@10 -- # set +x 00:28:24.143 ************************************ 00:28:24.143 START TEST reap_unregistered_poller 00:28:24.143 ************************************ 00:28:24.143 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:28:24.143 * Looking for test storage... 
00:28:24.143 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:24.143 16:07:29 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@9 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/interrupt_common.sh 00:28:24.143 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/reap_unregistered_poller.sh 00:28:24.143 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:24.143 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@5 -- # testdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:24.143 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/../.. 00:28:24.143 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@6 -- # rootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:24.143 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@7 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh 00:28:24.143 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@7 -- # rpc_py=rpc_cmd 00:28:24.143 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@34 -- # set -e 00:28:24.143 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@35 -- # shopt -s nullglob 00:28:24.143 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@36 -- # shopt -s extglob 00:28:24.143 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@37 -- # shopt -s inherit_errexit 00:28:24.143 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@39 -- # '[' -z /var/jenkins/workspace/crypto-phy-autotest/spdk/../output ']' 00:28:24.143 16:07:29 reap_unregistered_poller -- 
common/autotest_common.sh@44 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh ]] 00:28:24.143 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/build_config.sh 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@1 -- # CONFIG_WPDK_DIR= 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@2 -- # CONFIG_ASAN=n 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@3 -- # CONFIG_VBDEV_COMPRESS=y 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@4 -- # CONFIG_HAVE_EXECINFO_H=y 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@5 -- # CONFIG_USDT=n 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@6 -- # CONFIG_CUSTOMOCF=n 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@7 -- # CONFIG_PREFIX=/usr/local 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@8 -- # CONFIG_RBD=n 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@9 -- # CONFIG_LIBDIR= 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@10 -- # CONFIG_IDXD=y 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@11 -- # CONFIG_NVME_CUSE=y 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@12 -- # CONFIG_SMA=n 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@13 -- # CONFIG_VTUNE=n 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@14 -- # CONFIG_TSAN=n 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@15 -- # CONFIG_RDMA_SEND_WITH_INVAL=y 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@16 -- # CONFIG_VFIO_USER_DIR= 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@17 -- # CONFIG_PGO_CAPTURE=n 00:28:24.143 16:07:29 
reap_unregistered_poller -- common/build_config.sh@18 -- # CONFIG_HAVE_UUID_GENERATE_SHA1=y 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@19 -- # CONFIG_ENV=/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@20 -- # CONFIG_LTO=n 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@21 -- # CONFIG_ISCSI_INITIATOR=y 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@22 -- # CONFIG_CET=n 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@23 -- # CONFIG_VBDEV_COMPRESS_MLX5=y 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@24 -- # CONFIG_OCF_PATH= 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@25 -- # CONFIG_RDMA_SET_TOS=y 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@26 -- # CONFIG_HAVE_ARC4RANDOM=y 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@27 -- # CONFIG_HAVE_LIBARCHIVE=n 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@28 -- # CONFIG_UBLK=y 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@29 -- # CONFIG_ISAL_CRYPTO=y 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@30 -- # CONFIG_OPENSSL_PATH= 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@31 -- # CONFIG_OCF=n 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@32 -- # CONFIG_FUSE=n 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@33 -- # CONFIG_VTUNE_DIR= 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@34 -- # CONFIG_FUZZER_LIB= 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@35 -- # CONFIG_FUZZER=n 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@36 -- # 
CONFIG_DPDK_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@37 -- # CONFIG_CRYPTO=y 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@38 -- # CONFIG_PGO_USE=n 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@39 -- # CONFIG_VHOST=y 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@40 -- # CONFIG_DAOS=n 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@41 -- # CONFIG_DPDK_INC_DIR= 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@42 -- # CONFIG_DAOS_DIR= 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@43 -- # CONFIG_UNIT_TESTS=n 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@44 -- # CONFIG_RDMA_SET_ACK_TIMEOUT=y 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@45 -- # CONFIG_VIRTIO=y 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@46 -- # CONFIG_DPDK_UADK=n 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@47 -- # CONFIG_COVERAGE=y 00:28:24.143 16:07:29 reap_unregistered_poller -- common/build_config.sh@48 -- # CONFIG_RDMA=y 00:28:24.144 16:07:29 reap_unregistered_poller -- common/build_config.sh@49 -- # CONFIG_FIO_SOURCE_DIR=/usr/src/fio 00:28:24.144 16:07:29 reap_unregistered_poller -- common/build_config.sh@50 -- # CONFIG_URING_PATH= 00:28:24.144 16:07:29 reap_unregistered_poller -- common/build_config.sh@51 -- # CONFIG_XNVME=n 00:28:24.144 16:07:29 reap_unregistered_poller -- common/build_config.sh@52 -- # CONFIG_VFIO_USER=n 00:28:24.144 16:07:29 reap_unregistered_poller -- common/build_config.sh@53 -- # CONFIG_ARCH=native 00:28:24.144 16:07:29 reap_unregistered_poller -- common/build_config.sh@54 -- # CONFIG_HAVE_EVP_MAC=y 00:28:24.144 16:07:29 reap_unregistered_poller -- common/build_config.sh@55 -- # CONFIG_URING_ZNS=n 00:28:24.144 
16:07:29 reap_unregistered_poller -- common/build_config.sh@56 -- # CONFIG_WERROR=y 00:28:24.144 16:07:29 reap_unregistered_poller -- common/build_config.sh@57 -- # CONFIG_HAVE_LIBBSD=n 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@58 -- # CONFIG_UBSAN=y 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@59 -- # CONFIG_IPSEC_MB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@60 -- # CONFIG_GOLANG=n 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@61 -- # CONFIG_ISAL=y 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@62 -- # CONFIG_IDXD_KERNEL=y 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@63 -- # CONFIG_DPDK_LIB_DIR= 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@64 -- # CONFIG_RDMA_PROV=verbs 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@65 -- # CONFIG_APPS=y 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@66 -- # CONFIG_SHARED=y 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@67 -- # CONFIG_HAVE_KEYUTILS=y 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@68 -- # CONFIG_FC_PATH= 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@69 -- # CONFIG_DPDK_PKG_CONFIG=n 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@70 -- # CONFIG_FC=n 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@71 -- # CONFIG_AVAHI=n 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@72 -- # CONFIG_FIO_PLUGIN=y 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@73 -- # CONFIG_RAID5F=n 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@74 -- # CONFIG_EXAMPLES=y 00:28:24.404 16:07:29 reap_unregistered_poller -- 
common/build_config.sh@75 -- # CONFIG_TESTS=y 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@76 -- # CONFIG_CRYPTO_MLX5=y 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@77 -- # CONFIG_MAX_LCORES= 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@78 -- # CONFIG_IPSEC_MB=y 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@79 -- # CONFIG_PGO_DIR= 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@80 -- # CONFIG_DEBUG=y 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@81 -- # CONFIG_DPDK_COMPRESSDEV=y 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@82 -- # CONFIG_CROSS_PREFIX= 00:28:24.404 16:07:29 reap_unregistered_poller -- common/build_config.sh@83 -- # CONFIG_URING=n 00:28:24.404 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@54 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:28:24.404 16:07:29 reap_unregistered_poller -- common/applications.sh@8 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/applications.sh 00:28:24.404 16:07:29 reap_unregistered_poller -- common/applications.sh@8 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:28:24.404 16:07:29 reap_unregistered_poller -- common/applications.sh@8 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/common 00:28:24.404 16:07:29 reap_unregistered_poller -- common/applications.sh@9 -- # _root=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:24.404 16:07:29 reap_unregistered_poller -- common/applications.sh@10 -- # _app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:24.404 16:07:29 reap_unregistered_poller -- common/applications.sh@11 -- # _test_app_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app 00:28:24.404 16:07:29 reap_unregistered_poller -- common/applications.sh@12 -- # 
_examples_dir=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:24.404 16:07:29 reap_unregistered_poller -- common/applications.sh@14 -- # VHOST_FUZZ_APP=("$_test_app_dir/fuzz/vhost_fuzz/vhost_fuzz") 00:28:24.404 16:07:29 reap_unregistered_poller -- common/applications.sh@15 -- # ISCSI_APP=("$_app_dir/iscsi_tgt") 00:28:24.404 16:07:29 reap_unregistered_poller -- common/applications.sh@16 -- # NVMF_APP=("$_app_dir/nvmf_tgt") 00:28:24.404 16:07:29 reap_unregistered_poller -- common/applications.sh@17 -- # VHOST_APP=("$_app_dir/vhost") 00:28:24.404 16:07:29 reap_unregistered_poller -- common/applications.sh@18 -- # DD_APP=("$_app_dir/spdk_dd") 00:28:24.404 16:07:29 reap_unregistered_poller -- common/applications.sh@19 -- # SPDK_APP=("$_app_dir/spdk_tgt") 00:28:24.404 16:07:29 reap_unregistered_poller -- common/applications.sh@22 -- # [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/include/spdk/config.h ]] 00:28:24.404 16:07:29 reap_unregistered_poller -- common/applications.sh@23 -- # [[ #ifndef SPDK_CONFIG_H 00:28:24.404 #define SPDK_CONFIG_H 00:28:24.404 #define SPDK_CONFIG_APPS 1 00:28:24.404 #define SPDK_CONFIG_ARCH native 00:28:24.404 #undef SPDK_CONFIG_ASAN 00:28:24.404 #undef SPDK_CONFIG_AVAHI 00:28:24.404 #undef SPDK_CONFIG_CET 00:28:24.404 #define SPDK_CONFIG_COVERAGE 1 00:28:24.404 #define SPDK_CONFIG_CROSS_PREFIX 00:28:24.404 #define SPDK_CONFIG_CRYPTO 1 00:28:24.404 #define SPDK_CONFIG_CRYPTO_MLX5 1 00:28:24.404 #undef SPDK_CONFIG_CUSTOMOCF 00:28:24.404 #undef SPDK_CONFIG_DAOS 00:28:24.404 #define SPDK_CONFIG_DAOS_DIR 00:28:24.404 #define SPDK_CONFIG_DEBUG 1 00:28:24.404 #define SPDK_CONFIG_DPDK_COMPRESSDEV 1 00:28:24.404 #define SPDK_CONFIG_DPDK_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build 00:28:24.404 #define SPDK_CONFIG_DPDK_INC_DIR 00:28:24.404 #define SPDK_CONFIG_DPDK_LIB_DIR 00:28:24.404 #undef SPDK_CONFIG_DPDK_PKG_CONFIG 00:28:24.404 #undef SPDK_CONFIG_DPDK_UADK 00:28:24.404 #define SPDK_CONFIG_ENV 
/var/jenkins/workspace/crypto-phy-autotest/spdk/lib/env_dpdk 00:28:24.404 #define SPDK_CONFIG_EXAMPLES 1 00:28:24.404 #undef SPDK_CONFIG_FC 00:28:24.404 #define SPDK_CONFIG_FC_PATH 00:28:24.404 #define SPDK_CONFIG_FIO_PLUGIN 1 00:28:24.404 #define SPDK_CONFIG_FIO_SOURCE_DIR /usr/src/fio 00:28:24.404 #undef SPDK_CONFIG_FUSE 00:28:24.404 #undef SPDK_CONFIG_FUZZER 00:28:24.404 #define SPDK_CONFIG_FUZZER_LIB 00:28:24.404 #undef SPDK_CONFIG_GOLANG 00:28:24.404 #define SPDK_CONFIG_HAVE_ARC4RANDOM 1 00:28:24.404 #define SPDK_CONFIG_HAVE_EVP_MAC 1 00:28:24.404 #define SPDK_CONFIG_HAVE_EXECINFO_H 1 00:28:24.404 #define SPDK_CONFIG_HAVE_KEYUTILS 1 00:28:24.404 #undef SPDK_CONFIG_HAVE_LIBARCHIVE 00:28:24.404 #undef SPDK_CONFIG_HAVE_LIBBSD 00:28:24.404 #define SPDK_CONFIG_HAVE_UUID_GENERATE_SHA1 1 00:28:24.404 #define SPDK_CONFIG_IDXD 1 00:28:24.404 #define SPDK_CONFIG_IDXD_KERNEL 1 00:28:24.404 #define SPDK_CONFIG_IPSEC_MB 1 00:28:24.404 #define SPDK_CONFIG_IPSEC_MB_DIR /var/jenkins/workspace/crypto-phy-autotest/spdk/intel-ipsec-mb/lib 00:28:24.404 #define SPDK_CONFIG_ISAL 1 00:28:24.404 #define SPDK_CONFIG_ISAL_CRYPTO 1 00:28:24.404 #define SPDK_CONFIG_ISCSI_INITIATOR 1 00:28:24.404 #define SPDK_CONFIG_LIBDIR 00:28:24.404 #undef SPDK_CONFIG_LTO 00:28:24.404 #define SPDK_CONFIG_MAX_LCORES 00:28:24.404 #define SPDK_CONFIG_NVME_CUSE 1 00:28:24.404 #undef SPDK_CONFIG_OCF 00:28:24.404 #define SPDK_CONFIG_OCF_PATH 00:28:24.404 #define SPDK_CONFIG_OPENSSL_PATH 00:28:24.404 #undef SPDK_CONFIG_PGO_CAPTURE 00:28:24.404 #define SPDK_CONFIG_PGO_DIR 00:28:24.404 #undef SPDK_CONFIG_PGO_USE 00:28:24.404 #define SPDK_CONFIG_PREFIX /usr/local 00:28:24.404 #undef SPDK_CONFIG_RAID5F 00:28:24.404 #undef SPDK_CONFIG_RBD 00:28:24.404 #define SPDK_CONFIG_RDMA 1 00:28:24.404 #define SPDK_CONFIG_RDMA_PROV verbs 00:28:24.404 #define SPDK_CONFIG_RDMA_SEND_WITH_INVAL 1 00:28:24.404 #define SPDK_CONFIG_RDMA_SET_ACK_TIMEOUT 1 00:28:24.404 #define SPDK_CONFIG_RDMA_SET_TOS 1 00:28:24.404 #define 
SPDK_CONFIG_SHARED 1 00:28:24.404 #undef SPDK_CONFIG_SMA 00:28:24.404 #define SPDK_CONFIG_TESTS 1 00:28:24.404 #undef SPDK_CONFIG_TSAN 00:28:24.404 #define SPDK_CONFIG_UBLK 1 00:28:24.404 #define SPDK_CONFIG_UBSAN 1 00:28:24.404 #undef SPDK_CONFIG_UNIT_TESTS 00:28:24.404 #undef SPDK_CONFIG_URING 00:28:24.404 #define SPDK_CONFIG_URING_PATH 00:28:24.404 #undef SPDK_CONFIG_URING_ZNS 00:28:24.404 #undef SPDK_CONFIG_USDT 00:28:24.404 #define SPDK_CONFIG_VBDEV_COMPRESS 1 00:28:24.404 #define SPDK_CONFIG_VBDEV_COMPRESS_MLX5 1 00:28:24.404 #undef SPDK_CONFIG_VFIO_USER 00:28:24.404 #define SPDK_CONFIG_VFIO_USER_DIR 00:28:24.404 #define SPDK_CONFIG_VHOST 1 00:28:24.404 #define SPDK_CONFIG_VIRTIO 1 00:28:24.404 #undef SPDK_CONFIG_VTUNE 00:28:24.404 #define SPDK_CONFIG_VTUNE_DIR 00:28:24.404 #define SPDK_CONFIG_WERROR 1 00:28:24.404 #define SPDK_CONFIG_WPDK_DIR 00:28:24.404 #undef SPDK_CONFIG_XNVME 00:28:24.404 #endif /* SPDK_CONFIG_H */ == *\#\d\e\f\i\n\e\ \S\P\D\K\_\C\O\N\F\I\G\_\D\E\B\U\G* ]] 00:28:24.404 16:07:29 reap_unregistered_poller -- common/applications.sh@24 -- # (( SPDK_AUTOTEST_DEBUG_APPS )) 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@55 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:28:24.405 16:07:29 reap_unregistered_poller -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:28:24.405 16:07:29 reap_unregistered_poller -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:28:24.405 16:07:29 reap_unregistered_poller -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:28:24.405 16:07:29 reap_unregistered_poller -- paths/export.sh@2 -- # 
PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:24.405 16:07:29 reap_unregistered_poller -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:24.405 16:07:29 reap_unregistered_poller -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:24.405 16:07:29 reap_unregistered_poller -- paths/export.sh@5 -- # export PATH 00:28:24.405 16:07:29 reap_unregistered_poller -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@56 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@6 -- # dirname /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/common 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@6 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@6 -- # _pmdir=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@7 -- # readlink -f /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/../../../ 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@7 -- # _pmrootdir=/var/jenkins/workspace/crypto-phy-autotest/spdk 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@64 -- # TEST_TAG=N/A 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@65 -- # TEST_TAG_FILE=/var/jenkins/workspace/crypto-phy-autotest/spdk/.run_test_name 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@67 -- # PM_OUTPUTDIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@68 -- # uname -s 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@68 -- # PM_OS=Linux 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@70 -- # MONITOR_RESOURCES_SUDO=() 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@70 -- # declare -A MONITOR_RESOURCES_SUDO 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@71 -- # MONITOR_RESOURCES_SUDO["collect-bmc-pm"]=1 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@72 -- # MONITOR_RESOURCES_SUDO["collect-cpu-load"]=0 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@73 -- # MONITOR_RESOURCES_SUDO["collect-cpu-temp"]=0 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@74 -- # MONITOR_RESOURCES_SUDO["collect-vmstat"]=0 00:28:24.405 16:07:29 reap_unregistered_poller -- 
pm/common@76 -- # SUDO[0]= 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@76 -- # SUDO[1]='sudo -E' 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@78 -- # MONITOR_RESOURCES=(collect-cpu-load collect-vmstat) 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@79 -- # [[ Linux == FreeBSD ]] 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@81 -- # [[ Linux == Linux ]] 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@81 -- # [[ ............................... != QEMU ]] 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@81 -- # [[ ! -e /.dockerenv ]] 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@84 -- # MONITOR_RESOURCES+=(collect-cpu-temp) 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@85 -- # MONITOR_RESOURCES+=(collect-bmc-pm) 00:28:24.405 16:07:29 reap_unregistered_poller -- pm/common@88 -- # [[ ! -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power ]] 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@58 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@59 -- # export RUN_NIGHTLY 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@62 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@63 -- # export SPDK_AUTOTEST_DEBUG_APPS 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@64 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@65 -- # export SPDK_RUN_VALGRIND 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@66 -- # : 1 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@67 -- # export SPDK_RUN_FUNCTIONAL_TEST 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@68 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@69 -- # export SPDK_TEST_UNITTEST 00:28:24.405 16:07:29 
reap_unregistered_poller -- common/autotest_common.sh@70 -- # : 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@71 -- # export SPDK_TEST_AUTOBUILD 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@72 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@73 -- # export SPDK_TEST_RELEASE_BUILD 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@74 -- # : 1 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@75 -- # export SPDK_TEST_ISAL 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@76 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@77 -- # export SPDK_TEST_ISCSI 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@78 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@79 -- # export SPDK_TEST_ISCSI_INITIATOR 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@80 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@81 -- # export SPDK_TEST_NVME 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@82 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@83 -- # export SPDK_TEST_NVME_PMR 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@84 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@85 -- # export SPDK_TEST_NVME_BP 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@86 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@87 -- # export SPDK_TEST_NVME_CLI 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@88 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@89 -- # export SPDK_TEST_NVME_CUSE 00:28:24.405 16:07:29 
reap_unregistered_poller -- common/autotest_common.sh@90 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@91 -- # export SPDK_TEST_NVME_FDP 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@92 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@93 -- # export SPDK_TEST_NVMF 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@94 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@95 -- # export SPDK_TEST_VFIOUSER 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@96 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@97 -- # export SPDK_TEST_VFIOUSER_QEMU 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@98 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@99 -- # export SPDK_TEST_FUZZER 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@100 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@101 -- # export SPDK_TEST_FUZZER_SHORT 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@102 -- # : rdma 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@103 -- # export SPDK_TEST_NVMF_TRANSPORT 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@104 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@105 -- # export SPDK_TEST_RBD 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@106 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@107 -- # export SPDK_TEST_VHOST 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@108 -- # : 1 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@109 -- # export SPDK_TEST_BLOCKDEV 00:28:24.405 16:07:29 
reap_unregistered_poller -- common/autotest_common.sh@110 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@111 -- # export SPDK_TEST_IOAT 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@112 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@113 -- # export SPDK_TEST_BLOBFS 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@114 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@115 -- # export SPDK_TEST_VHOST_INIT 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@116 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@117 -- # export SPDK_TEST_LVOL 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@118 -- # : 1 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@119 -- # export SPDK_TEST_VBDEV_COMPRESS 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@120 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@121 -- # export SPDK_RUN_ASAN 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@122 -- # : 1 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@123 -- # export SPDK_RUN_UBSAN 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@124 -- # : 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@125 -- # export SPDK_RUN_EXTERNAL_DPDK 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@126 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@127 -- # export SPDK_RUN_NON_ROOT 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@128 -- # : 1 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@129 -- # export SPDK_TEST_CRYPTO 00:28:24.405 16:07:29 
reap_unregistered_poller -- common/autotest_common.sh@130 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@131 -- # export SPDK_TEST_FTL 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@132 -- # : 0 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@133 -- # export SPDK_TEST_OCF 00:28:24.405 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@134 -- # : 0 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@135 -- # export SPDK_TEST_VMD 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@136 -- # : 0 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@137 -- # export SPDK_TEST_OPAL 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@138 -- # : 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@139 -- # export SPDK_TEST_NATIVE_DPDK 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@140 -- # : true 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@141 -- # export SPDK_AUTOTEST_X 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@142 -- # : 0 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@143 -- # export SPDK_TEST_RAID5 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@144 -- # : 0 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@145 -- # export SPDK_TEST_URING 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@146 -- # : 0 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@147 -- # export SPDK_TEST_USDT 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@148 -- # : 0 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@149 -- # export SPDK_TEST_USE_IGB_UIO 00:28:24.406 16:07:29 reap_unregistered_poller 
-- common/autotest_common.sh@150 -- # : 0 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@151 -- # export SPDK_TEST_SCHEDULER 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@152 -- # : 0 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@153 -- # export SPDK_TEST_SCANBUILD 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@154 -- # : 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@155 -- # export SPDK_TEST_NVMF_NICS 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@156 -- # : 0 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@157 -- # export SPDK_TEST_SMA 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@158 -- # : 0 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@159 -- # export SPDK_TEST_DAOS 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@160 -- # : 0 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@161 -- # export SPDK_TEST_XNVME 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@162 -- # : 0 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@163 -- # export SPDK_TEST_ACCEL_DSA 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@164 -- # : 0 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@165 -- # export SPDK_TEST_ACCEL_IAA 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@167 -- # : 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@168 -- # export SPDK_TEST_FUZZER_TARGET 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@169 -- # : 0 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@170 -- # export SPDK_TEST_NVMF_MDNS 00:28:24.406 16:07:29 reap_unregistered_poller -- 
common/autotest_common.sh@171 -- # : 0 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@172 -- # export SPDK_JSONRPC_GO_CLIENT 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@175 -- # export SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@175 -- # SPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@176 -- # export DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@176 -- # DPDK_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@177 -- # export VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@177 -- # VFIO_LIB_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@178 -- # export LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@178 -- # 
LD_LIBRARY_PATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/build/lib:/var/jenkins/workspace/crypto-phy-autotest/spdk/build/libvfio-user/usr/local/lib 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@181 -- # export PCI_BLOCK_SYNC_ON_RESET=yes 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@181 -- # PCI_BLOCK_SYNC_ON_RESET=yes 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@185 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@185 -- # PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@189 -- # export PYTHONDONTWRITEBYTECODE=1 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@189 -- # PYTHONDONTWRITEBYTECODE=1 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@193 -- # export 
ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@193 -- # ASAN_OPTIONS=new_delete_type_mismatch=0:disable_coredump=0:abort_on_error=1:use_sigaltstack=0 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@194 -- # export UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@194 -- # UBSAN_OPTIONS=halt_on_error=1:print_stacktrace=1:abort_on_error=1:disable_coredump=0:exitcode=134 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@198 -- # asan_suppression_file=/var/tmp/asan_suppression_file 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@199 -- # rm -rf /var/tmp/asan_suppression_file 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@200 -- # cat 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@236 -- # echo leak:libfuse3.so 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@238 -- # export LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@238 -- # LSAN_OPTIONS=suppressions=/var/tmp/asan_suppression_file 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@240 -- # export DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@240 -- # DEFAULT_RPC_ADDR=/var/tmp/spdk.sock 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@242 -- # '[' -z /var/spdk/dependencies ']' 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@245 -- # export DEPENDENCY_DIR 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@249 -- # export 
SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@249 -- # SPDK_BIN_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@250 -- # export SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@250 -- # SPDK_EXAMPLE_DIR=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@253 -- # export QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@253 -- # QEMU_BIN=/usr/local/qemu/vanilla-latest/bin/qemu-system-x86_64 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@254 -- # export VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@254 -- # VFIO_QEMU_BIN=/usr/local/qemu/vfio-user-latest/bin/qemu-system-x86_64 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@256 -- # export AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@256 -- # AR_TOOL=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/ar-xnvme-fixer 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@259 -- # export UNBIND_ENTIRE_IOMMU_GROUP=yes 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@259 -- # UNBIND_ENTIRE_IOMMU_GROUP=yes 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@262 -- # '[' 0 -eq 0 ']' 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@263 -- # export valgrind= 00:28:24.406 16:07:29 
reap_unregistered_poller -- common/autotest_common.sh@263 -- # valgrind= 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@269 -- # uname -s 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@269 -- # '[' Linux = Linux ']' 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@270 -- # HUGEMEM=4096 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@271 -- # export CLEAR_HUGE=yes 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@271 -- # CLEAR_HUGE=yes 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@272 -- # [[ 1 -eq 1 ]] 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@276 -- # export HUGE_EVEN_ALLOC=yes 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@276 -- # HUGE_EVEN_ALLOC=yes 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@279 -- # MAKE=make 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@280 -- # MAKEFLAGS=-j96 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@296 -- # export HUGEMEM=4096 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@296 -- # HUGEMEM=4096 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@298 -- # NO_HUGE=() 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@299 -- # TEST_MODE= 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@318 -- # [[ -z 2836950 ]] 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@318 -- # kill -0 2836950 00:28:24.406 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@1679 -- # set_test_storage 2147483648 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@328 -- # [[ -v testdir ]] 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@330 -- # local 
requested_size=2147483648 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@331 -- # local mount target_dir 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@333 -- # local -A mounts fss sizes avails uses 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@334 -- # local source fs size avail mount use 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@336 -- # local storage_fallback storage_candidates 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@338 -- # mktemp -udt spdk.XXXXXX 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@338 -- # storage_fallback=/tmp/spdk.5TjuoR 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@343 -- # storage_candidates=("$testdir" "$storage_fallback/tests/${testdir##*/}" "$storage_fallback") 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@345 -- # [[ -n '' ]] 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@350 -- # [[ -n '' ]] 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@355 -- # mkdir -p /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt /tmp/spdk.5TjuoR/tests/interrupt /tmp/spdk.5TjuoR 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@358 -- # requested_size=2214592512 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@327 -- # df -T 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@327 -- # grep -v Filesystem 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_devtmpfs 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=devtmpfs 00:28:24.407 16:07:29 
reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=67108864 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=67108864 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=0 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=/dev/pmem0 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=ext2 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=900243456 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=5284429824 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4384186368 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=spdk_root 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=overlay 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=89242013696 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=95562715136 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=6320701440 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@361 -- # 
fss["$mount"]=tmpfs 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47777980416 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47781355520 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=3375104 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=19102957568 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=19112546304 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=9588736 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=47780691968 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=47781359616 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=667648 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@361 -- # mounts["$mount"]=tmpfs 00:28:24.407 16:07:29 reap_unregistered_poller -- 
common/autotest_common.sh@361 -- # fss["$mount"]=tmpfs 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@362 -- # avails["$mount"]=9556267008 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@362 -- # sizes["$mount"]=9556271104 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@363 -- # uses["$mount"]=4096 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@360 -- # read -r source fs size use avail _ mount 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@366 -- # printf '* Looking for test storage...\n' 00:28:24.407 * Looking for test storage... 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@368 -- # local target_space new_size 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@369 -- # for target_dir in "${storage_candidates[@]}" 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@372 -- # df /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@372 -- # awk '$1 !~ /Filesystem/{print $6}' 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@372 -- # mount=/ 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@374 -- # target_space=89242013696 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@375 -- # (( target_space == 0 || target_space < requested_size )) 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@378 -- # (( target_space >= requested_size )) 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == tmpfs ]] 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ overlay == ramfs ]] 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@380 -- # [[ / == / ]] 00:28:24.407 16:07:29 
reap_unregistered_poller -- common/autotest_common.sh@381 -- # new_size=8535293952 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@382 -- # (( new_size * 100 / sizes[/] > 95 )) 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@387 -- # export SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@387 -- # SPDK_TEST_STORAGE=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@388 -- # printf '* Found test storage at %s\n' /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:24.407 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@389 -- # return 0 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@1681 -- # set -o errtrace 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@1682 -- # shopt -s extdebug 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@1683 -- # trap 'trap - ERR; print_backtrace >&2' ERR 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@1685 -- # PS4=' \t $test_domain -- ${BASH_SOURCE#${BASH_SOURCE%/*/*}/}@${LINENO} -- \$ ' 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@1686 -- # true 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@1688 -- # xtrace_fd 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -n 13 ]] 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@25 -- # [[ -e /proc/self/fd/13 ]] 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@27 -- # exec 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@29 -- # exec 
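The `set_test_storage` sizing decision traced above can be reproduced from the figures in the log: the overlay root reports a size of 95562715136 bytes with 6320701440 in use, the test requests 2214592512 bytes, and `new_size` appears to be the projected usage (current use plus the request), rejected only if it would exceed 95% of the filesystem. A minimal sketch, with the relationship `new_size = uses + requested_size` inferred from the logged values rather than stated in the log:

```shell
# Sketch of the set_test_storage arithmetic, using values from the log.
size=95562715136        # sizes[/] for the overlay root (spdk_root)
used=6320701440         # uses[/]
requested_size=2214592512
new_size=$((used + requested_size))
echo "$new_size"        # 8535293952, matching new_size in the log
# Reject the candidate only if projected usage exceeds 95% of the fs
if ((new_size * 100 / size > 95)); then
  echo "insufficient space"
else
  echo "ok"             # 8535293952 is ~9% of the root fs, so this branch runs
fi
```

With these numbers the check passes, which is why the trace continues to "Found test storage at .../spdk/test/interrupt" and returns 0.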
00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@31 -- # xtrace_restore 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@16 -- # unset -v 'X_STACK[0 - 1 < 0 ? 0 : 0 - 1]' 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@17 -- # (( 0 == 0 )) 00:28:24.407 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@18 -- # set -x 00:28:24.407 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@8 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/common.sh 00:28:24.407 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@10 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:24.407 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@12 -- # r0_mask=0x1 00:28:24.407 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@13 -- # r1_mask=0x2 00:28:24.407 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@14 -- # r2_mask=0x4 00:28:24.407 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@16 -- # cpu_server_mask=0x07 00:28:24.407 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@17 -- # rpc_server_addr=/var/tmp/spdk.sock 00:28:24.407 16:07:29 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # export PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:28:24.407 16:07:29 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@14 -- # 
PYTHONPATH=:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/test/rpc_plugins:/var/jenkins/workspace/crypto-phy-autotest/spdk/python:/var/jenkins/workspace/crypto-phy-autotest/spdk/examples/interrupt_tgt 00:28:24.407 16:07:29 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@17 -- # start_intr_tgt 00:28:24.407 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@20 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:24.407 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@21 -- # local cpu_mask=0x07 00:28:24.407 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@24 -- # intr_tgt_pid=2837070 00:28:24.408 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@25 -- # trap 'killprocess "$intr_tgt_pid"; cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:24.408 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@23 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/interrupt_tgt -m 0x07 -r /var/tmp/spdk.sock -E -g 00:28:24.408 16:07:29 reap_unregistered_poller -- interrupt/interrupt_common.sh@26 -- # waitforlisten 2837070 /var/tmp/spdk.sock 00:28:24.408 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@830 -- # '[' -z 2837070 ']' 00:28:24.408 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:24.408 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:24.408 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:24.408 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:28:24.408 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:24.408 16:07:29 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:28:24.408 [2024-06-10 16:07:29.827629] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:28:24.408 [2024-06-10 16:07:29.827687] [ DPDK EAL parameters: interrupt_tgt --no-shconf -c 0x07 --single-file-segments --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2837070 ] 00:28:24.667 [2024-06-10 16:07:29.927883] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:28:24.667 [2024-06-10 16:07:30.025436] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:28:24.667 [2024-06-10 16:07:30.025534] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:28:24.667 [2024-06-10 16:07:30.025538] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:28:24.667 [2024-06-10 16:07:30.096169] thread.c:2095:spdk_thread_set_interrupt_mode: *NOTICE*: Set spdk_thread (app_thread) to intr mode from intr mode. 
00:28:24.667 16:07:30 reap_unregistered_poller -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:24.667 16:07:30 reap_unregistered_poller -- common/autotest_common.sh@863 -- # return 0 00:28:24.667 16:07:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # jq -r '.threads[0]' 00:28:24.667 16:07:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # rpc_cmd thread_get_pollers 00:28:24.667 16:07:30 reap_unregistered_poller -- common/autotest_common.sh@560 -- # xtrace_disable 00:28:24.667 16:07:30 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:28:24.667 16:07:30 reap_unregistered_poller -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:28:24.667 16:07:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@20 -- # app_thread='{ 00:28:24.667 "name": "app_thread", 00:28:24.667 "id": 1, 00:28:24.667 "active_pollers": [], 00:28:24.667 "timed_pollers": [ 00:28:24.667 { 00:28:24.667 "name": "rpc_subsystem_poll_servers", 00:28:24.667 "id": 1, 00:28:24.667 "state": "waiting", 00:28:24.667 "run_count": 0, 00:28:24.667 "busy_count": 0, 00:28:24.667 "period_ticks": 8400000 00:28:24.667 } 00:28:24.667 ], 00:28:24.667 "paused_pollers": [] 00:28:24.667 }' 00:28:24.667 16:07:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # jq -r '.active_pollers[].name' 00:28:24.925 16:07:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@21 -- # native_pollers= 00:28:24.925 16:07:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@22 -- # native_pollers+=' ' 00:28:24.925 16:07:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # jq -r '.timed_pollers[].name' 00:28:24.925 16:07:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@23 -- # native_pollers+=rpc_subsystem_poll_servers 00:28:24.925 16:07:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@28 -- # setup_bdev_aio 00:28:24.925 
16:07:30 reap_unregistered_poller -- interrupt/common.sh@75 -- # uname -s 00:28:24.925 16:07:30 reap_unregistered_poller -- interrupt/common.sh@75 -- # [[ Linux != \F\r\e\e\B\S\D ]] 00:28:24.925 16:07:30 reap_unregistered_poller -- interrupt/common.sh@76 -- # dd if=/dev/zero of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile bs=2048 count=5000 00:28:24.925 5000+0 records in 00:28:24.925 5000+0 records out 00:28:24.925 10240000 bytes (10 MB, 9.8 MiB) copied, 0.0173444 s, 590 MB/s 00:28:24.925 16:07:30 reap_unregistered_poller -- interrupt/common.sh@77 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_aio_create /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile AIO0 2048 00:28:25.184 AIO0 00:28:25.184 16:07:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:25.442 16:07:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@34 -- # sleep 0.1 00:28:25.442 16:07:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # rpc_cmd thread_get_pollers 00:28:25.442 16:07:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # jq -r '.threads[0]' 00:28:25.442 16:07:30 reap_unregistered_poller -- common/autotest_common.sh@560 -- # xtrace_disable 00:28:25.442 16:07:30 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:28:25.443 16:07:30 reap_unregistered_poller -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:28:25.443 16:07:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@37 -- # app_thread='{ 00:28:25.443 "name": "app_thread", 00:28:25.443 "id": 1, 00:28:25.443 "active_pollers": [], 00:28:25.443 "timed_pollers": [ 00:28:25.443 { 00:28:25.443 "name": "rpc_subsystem_poll_servers", 00:28:25.443 "id": 1, 00:28:25.443 "state": "waiting", 00:28:25.443 "run_count": 0, 00:28:25.443 "busy_count": 0, 
00:28:25.443 "period_ticks": 8400000 00:28:25.443 } 00:28:25.443 ], 00:28:25.443 "paused_pollers": [] 00:28:25.443 }' 00:28:25.443 16:07:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # jq -r '.active_pollers[].name' 00:28:25.702 16:07:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@38 -- # remaining_pollers= 00:28:25.702 16:07:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@39 -- # remaining_pollers+=' ' 00:28:25.702 16:07:30 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # jq -r '.timed_pollers[].name' 00:28:25.702 16:07:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@40 -- # remaining_pollers+=rpc_subsystem_poll_servers 00:28:25.702 16:07:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@44 -- # [[ rpc_subsystem_poll_servers == \ \r\p\c\_\s\u\b\s\y\s\t\e\m\_\p\o\l\l\_\s\e\r\v\e\r\s ]] 00:28:25.702 16:07:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@46 -- # trap - SIGINT SIGTERM EXIT 00:28:25.702 16:07:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@47 -- # killprocess 2837070 00:28:25.702 16:07:31 reap_unregistered_poller -- common/autotest_common.sh@949 -- # '[' -z 2837070 ']' 00:28:25.702 16:07:31 reap_unregistered_poller -- common/autotest_common.sh@953 -- # kill -0 2837070 00:28:25.702 16:07:31 reap_unregistered_poller -- common/autotest_common.sh@954 -- # uname 00:28:25.702 16:07:31 reap_unregistered_poller -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:25.702 16:07:31 reap_unregistered_poller -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2837070 00:28:25.702 16:07:31 reap_unregistered_poller -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:28:25.702 16:07:31 reap_unregistered_poller -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:28:25.702 16:07:31 reap_unregistered_poller -- common/autotest_common.sh@967 -- # 
echo 'killing process with pid 2837070' 00:28:25.702 killing process with pid 2837070 00:28:25.702 16:07:31 reap_unregistered_poller -- common/autotest_common.sh@968 -- # kill 2837070 00:28:25.702 16:07:31 reap_unregistered_poller -- common/autotest_common.sh@973 -- # wait 2837070 00:28:25.961 16:07:31 reap_unregistered_poller -- interrupt/reap_unregistered_poller.sh@48 -- # cleanup 00:28:25.961 16:07:31 reap_unregistered_poller -- interrupt/common.sh@6 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/interrupt/aiofile 00:28:25.961 00:28:25.961 real 0m1.742s 00:28:25.961 user 0m1.399s 00:28:25.961 sys 0m0.482s 00:28:25.961 16:07:31 reap_unregistered_poller -- common/autotest_common.sh@1125 -- # xtrace_disable 00:28:25.961 16:07:31 reap_unregistered_poller -- common/autotest_common.sh@10 -- # set +x 00:28:25.961 ************************************ 00:28:25.961 END TEST reap_unregistered_poller 00:28:25.961 ************************************ 00:28:25.961 16:07:31 -- spdk/autotest.sh@198 -- # uname -s 00:28:25.961 16:07:31 -- spdk/autotest.sh@198 -- # [[ Linux == Linux ]] 00:28:25.961 16:07:31 -- spdk/autotest.sh@199 -- # [[ 1 -eq 1 ]] 00:28:25.961 16:07:31 -- spdk/autotest.sh@205 -- # [[ 1 -eq 0 ]] 00:28:25.961 16:07:31 -- spdk/autotest.sh@211 -- # '[' 0 -eq 1 ']' 00:28:25.961 16:07:31 -- spdk/autotest.sh@256 -- # '[' 0 -eq 1 ']' 00:28:25.961 16:07:31 -- spdk/autotest.sh@260 -- # timing_exit lib 00:28:25.961 16:07:31 -- common/autotest_common.sh@729 -- # xtrace_disable 00:28:25.961 16:07:31 -- common/autotest_common.sh@10 -- # set +x 00:28:25.961 16:07:31 -- spdk/autotest.sh@262 -- # '[' 0 -eq 1 ']' 00:28:25.961 16:07:31 -- spdk/autotest.sh@270 -- # '[' 0 -eq 1 ']' 00:28:25.961 16:07:31 -- spdk/autotest.sh@279 -- # '[' 0 -eq 1 ']' 00:28:25.961 16:07:31 -- spdk/autotest.sh@308 -- # '[' 0 -eq 1 ']' 00:28:25.961 16:07:31 -- spdk/autotest.sh@312 -- # '[' 0 -eq 1 ']' 00:28:25.961 16:07:31 -- spdk/autotest.sh@316 -- # '[' 0 -eq 1 ']' 00:28:25.961 16:07:31 
-- spdk/autotest.sh@321 -- # '[' 0 -eq 1 ']' 00:28:25.961 16:07:31 -- spdk/autotest.sh@330 -- # '[' 0 -eq 1 ']' 00:28:25.961 16:07:31 -- spdk/autotest.sh@335 -- # '[' 0 -eq 1 ']' 00:28:25.961 16:07:31 -- spdk/autotest.sh@339 -- # '[' 0 -eq 1 ']' 00:28:25.961 16:07:31 -- spdk/autotest.sh@343 -- # '[' 0 -eq 1 ']' 00:28:25.961 16:07:31 -- spdk/autotest.sh@347 -- # '[' 1 -eq 1 ']' 00:28:25.961 16:07:31 -- spdk/autotest.sh@348 -- # run_test compress_compdev /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:28:25.961 16:07:31 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:28:25.961 16:07:31 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:28:25.961 16:07:31 -- common/autotest_common.sh@10 -- # set +x 00:28:25.961 ************************************ 00:28:25.961 START TEST compress_compdev 00:28:25.961 ************************************ 00:28:25.961 16:07:31 compress_compdev -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh compdev 00:28:26.220 * Looking for test storage... 
00:28:26.220 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:28:26.220 16:07:31 compress_compdev -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:28:26.220 16:07:31 compress_compdev -- nvmf/common.sh@7 -- # uname -s 00:28:26.220 16:07:31 compress_compdev -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:28:26.220 16:07:31 compress_compdev -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:28:26.220 16:07:31 compress_compdev -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:28:26.220 16:07:31 compress_compdev -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:28:26.220 16:07:31 compress_compdev -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:28:26.220 16:07:31 compress_compdev -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:28:26.220 16:07:31 compress_compdev -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:28:26.220 16:07:31 compress_compdev -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:28:26.220 16:07:31 compress_compdev -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:28:26.221 16:07:31 compress_compdev -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:28:26.221 16:07:31 compress_compdev -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:28:26.221 16:07:31 compress_compdev -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:28:26.221 16:07:31 compress_compdev -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:28:26.221 16:07:31 compress_compdev -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:28:26.221 16:07:31 compress_compdev -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:28:26.221 16:07:31 compress_compdev -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:28:26.221 16:07:31 compress_compdev -- nvmf/common.sh@45 -- # source 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:28:26.221 16:07:31 compress_compdev -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:28:26.221 16:07:31 compress_compdev -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:28:26.221 16:07:31 compress_compdev -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:28:26.221 16:07:31 compress_compdev -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:26.221 16:07:31 compress_compdev -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:26.221 16:07:31 compress_compdev -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:26.221 16:07:31 compress_compdev -- 
paths/export.sh@5 -- # export PATH 00:28:26.221 16:07:31 compress_compdev -- paths/export.sh@6 -- # echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:28:26.221 16:07:31 compress_compdev -- nvmf/common.sh@47 -- # : 0 00:28:26.221 16:07:31 compress_compdev -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:28:26.221 16:07:31 compress_compdev -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:28:26.221 16:07:31 compress_compdev -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:28:26.221 16:07:31 compress_compdev -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:28:26.221 16:07:31 compress_compdev -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:28:26.221 16:07:31 compress_compdev -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:28:26.221 16:07:31 compress_compdev -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:28:26.221 16:07:31 compress_compdev -- nvmf/common.sh@51 -- # have_pci_nics=0 00:28:26.221 16:07:31 compress_compdev -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:28:26.221 16:07:31 compress_compdev -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:28:26.221 16:07:31 compress_compdev -- compress/compress.sh@82 -- # test_type=compdev 00:28:26.221 16:07:31 compress_compdev -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:28:26.221 16:07:31 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:28:26.221 16:07:31 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2837522 00:28:26.221 16:07:31 compress_compdev -- 
compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:26.221 16:07:31 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2837522 00:28:26.221 16:07:31 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:28:26.221 16:07:31 compress_compdev -- common/autotest_common.sh@830 -- # '[' -z 2837522 ']' 00:28:26.221 16:07:31 compress_compdev -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:26.221 16:07:31 compress_compdev -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:26.221 16:07:31 compress_compdev -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:26.221 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:26.221 16:07:31 compress_compdev -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:26.221 16:07:31 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:28:26.221 [2024-06-10 16:07:31.575291] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:28:26.221 [2024-06-10 16:07:31.575353] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2837522 ] 00:28:26.221 [2024-06-10 16:07:31.666665] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:26.479 [2024-06-10 16:07:31.761356] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:28:26.479 [2024-06-10 16:07:31.761363] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:28:27.046 [2024-06-10 16:07:32.316068] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:28:27.046 16:07:32 compress_compdev -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:27.046 16:07:32 compress_compdev -- common/autotest_common.sh@863 -- # return 0 00:28:27.046 16:07:32 compress_compdev -- compress/compress.sh@74 -- # create_vols 00:28:27.046 16:07:32 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:28:27.046 16:07:32 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:28:30.331 [2024-06-10 16:07:35.479017] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1cc7e80 PMD being used: compress_qat 00:28:30.331 16:07:35 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:28:30.331 16:07:35 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:28:30.331 16:07:35 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:30.331 16:07:35 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:28:30.331 16:07:35 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:30.331 16:07:35 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:30.331 16:07:35 
compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:30.331 16:07:35 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:28:30.625 [ 00:28:30.625 { 00:28:30.625 "name": "Nvme0n1", 00:28:30.625 "aliases": [ 00:28:30.625 "afdc7025-0aff-41ed-afe6-bb46086a88fd" 00:28:30.625 ], 00:28:30.625 "product_name": "NVMe disk", 00:28:30.625 "block_size": 512, 00:28:30.625 "num_blocks": 1953525168, 00:28:30.625 "uuid": "afdc7025-0aff-41ed-afe6-bb46086a88fd", 00:28:30.625 "assigned_rate_limits": { 00:28:30.625 "rw_ios_per_sec": 0, 00:28:30.625 "rw_mbytes_per_sec": 0, 00:28:30.625 "r_mbytes_per_sec": 0, 00:28:30.625 "w_mbytes_per_sec": 0 00:28:30.625 }, 00:28:30.625 "claimed": false, 00:28:30.625 "zoned": false, 00:28:30.625 "supported_io_types": { 00:28:30.625 "read": true, 00:28:30.625 "write": true, 00:28:30.625 "unmap": true, 00:28:30.625 "write_zeroes": true, 00:28:30.625 "flush": true, 00:28:30.625 "reset": true, 00:28:30.625 "compare": false, 00:28:30.625 "compare_and_write": false, 00:28:30.625 "abort": true, 00:28:30.625 "nvme_admin": true, 00:28:30.625 "nvme_io": true 00:28:30.625 }, 00:28:30.625 "driver_specific": { 00:28:30.625 "nvme": [ 00:28:30.625 { 00:28:30.625 "pci_address": "0000:5e:00.0", 00:28:30.625 "trid": { 00:28:30.625 "trtype": "PCIe", 00:28:30.625 "traddr": "0000:5e:00.0" 00:28:30.625 }, 00:28:30.625 "ctrlr_data": { 00:28:30.625 "cntlid": 0, 00:28:30.625 "vendor_id": "0x8086", 00:28:30.625 "model_number": "INTEL SSDPE2KX010T8", 00:28:30.625 "serial_number": "BTLJ807001JM1P0FGN", 00:28:30.625 "firmware_revision": "VDV10170", 00:28:30.625 "oacs": { 00:28:30.625 "security": 1, 00:28:30.625 "format": 1, 00:28:30.625 "firmware": 1, 00:28:30.625 "ns_manage": 1 00:28:30.625 }, 00:28:30.625 "multi_ctrlr": false, 00:28:30.625 "ana_reporting": false 00:28:30.625 }, 
00:28:30.625 "vs": { 00:28:30.625 "nvme_version": "1.2" 00:28:30.625 }, 00:28:30.625 "ns_data": { 00:28:30.625 "id": 1, 00:28:30.625 "can_share": false 00:28:30.625 }, 00:28:30.625 "security": { 00:28:30.625 "opal": true 00:28:30.625 } 00:28:30.625 } 00:28:30.625 ], 00:28:30.625 "mp_policy": "active_passive" 00:28:30.625 } 00:28:30.625 } 00:28:30.625 ] 00:28:30.625 16:07:36 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:28:30.625 16:07:36 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:28:30.884 [2024-06-10 16:07:36.176551] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1b16350 PMD being used: compress_qat 00:28:31.821 21610267-79aa-4c07-a620-a472861fb7c3 00:28:31.821 16:07:37 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:28:31.821 65cc5568-7403-4731-b9b5-053de92097e1 00:28:31.821 16:07:37 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:28:31.821 16:07:37 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:28:31.821 16:07:37 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:31.821 16:07:37 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:28:31.821 16:07:37 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:31.821 16:07:37 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:31.821 16:07:37 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:32.079 16:07:37 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:28:32.338 [ 00:28:32.338 { 00:28:32.338 "name": 
"65cc5568-7403-4731-b9b5-053de92097e1", 00:28:32.338 "aliases": [ 00:28:32.338 "lvs0/lv0" 00:28:32.338 ], 00:28:32.338 "product_name": "Logical Volume", 00:28:32.338 "block_size": 512, 00:28:32.338 "num_blocks": 204800, 00:28:32.338 "uuid": "65cc5568-7403-4731-b9b5-053de92097e1", 00:28:32.338 "assigned_rate_limits": { 00:28:32.338 "rw_ios_per_sec": 0, 00:28:32.338 "rw_mbytes_per_sec": 0, 00:28:32.338 "r_mbytes_per_sec": 0, 00:28:32.338 "w_mbytes_per_sec": 0 00:28:32.338 }, 00:28:32.338 "claimed": false, 00:28:32.338 "zoned": false, 00:28:32.338 "supported_io_types": { 00:28:32.338 "read": true, 00:28:32.338 "write": true, 00:28:32.338 "unmap": true, 00:28:32.338 "write_zeroes": true, 00:28:32.338 "flush": false, 00:28:32.338 "reset": true, 00:28:32.338 "compare": false, 00:28:32.338 "compare_and_write": false, 00:28:32.338 "abort": false, 00:28:32.338 "nvme_admin": false, 00:28:32.338 "nvme_io": false 00:28:32.338 }, 00:28:32.338 "driver_specific": { 00:28:32.338 "lvol": { 00:28:32.338 "lvol_store_uuid": "21610267-79aa-4c07-a620-a472861fb7c3", 00:28:32.338 "base_bdev": "Nvme0n1", 00:28:32.338 "thin_provision": true, 00:28:32.338 "num_allocated_clusters": 0, 00:28:32.338 "snapshot": false, 00:28:32.338 "clone": false, 00:28:32.338 "esnap_clone": false 00:28:32.338 } 00:28:32.338 } 00:28:32.338 } 00:28:32.338 ] 00:28:32.338 16:07:37 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:28:32.338 16:07:37 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:28:32.338 16:07:37 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:28:32.597 [2024-06-10 16:07:38.039372] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:28:32.597 COMP_lvs0/lv0 00:28:32.597 16:07:38 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:28:32.597 16:07:38 compress_compdev 
-- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:28:32.597 16:07:38 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:32.597 16:07:38 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:28:32.597 16:07:38 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:32.597 16:07:38 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:32.597 16:07:38 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:32.856 16:07:38 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:28:33.115 [ 00:28:33.115 { 00:28:33.115 "name": "COMP_lvs0/lv0", 00:28:33.115 "aliases": [ 00:28:33.115 "898f6127-cf9c-5740-ab05-38caae453e2f" 00:28:33.115 ], 00:28:33.115 "product_name": "compress", 00:28:33.115 "block_size": 512, 00:28:33.115 "num_blocks": 200704, 00:28:33.115 "uuid": "898f6127-cf9c-5740-ab05-38caae453e2f", 00:28:33.115 "assigned_rate_limits": { 00:28:33.115 "rw_ios_per_sec": 0, 00:28:33.115 "rw_mbytes_per_sec": 0, 00:28:33.115 "r_mbytes_per_sec": 0, 00:28:33.115 "w_mbytes_per_sec": 0 00:28:33.115 }, 00:28:33.115 "claimed": false, 00:28:33.115 "zoned": false, 00:28:33.115 "supported_io_types": { 00:28:33.115 "read": true, 00:28:33.115 "write": true, 00:28:33.115 "unmap": false, 00:28:33.115 "write_zeroes": true, 00:28:33.115 "flush": false, 00:28:33.115 "reset": false, 00:28:33.115 "compare": false, 00:28:33.115 "compare_and_write": false, 00:28:33.115 "abort": false, 00:28:33.115 "nvme_admin": false, 00:28:33.115 "nvme_io": false 00:28:33.115 }, 00:28:33.115 "driver_specific": { 00:28:33.115 "compress": { 00:28:33.115 "name": "COMP_lvs0/lv0", 00:28:33.115 "base_bdev_name": "65cc5568-7403-4731-b9b5-053de92097e1" 00:28:33.115 } 00:28:33.115 } 00:28:33.115 } 00:28:33.115 ] 
00:28:33.115 16:07:38 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:28:33.115 16:07:38 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:33.374 [2024-06-10 16:07:38.657553] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f5f441b15c0 PMD being used: compress_qat 00:28:33.374 [2024-06-10 16:07:38.659566] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1ebaa00 PMD being used: compress_qat 00:28:33.374 Running I/O for 3 seconds... 00:28:36.661 00:28:36.661 Latency(us) 00:28:36.661 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:36.661 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:28:36.661 Verification LBA range: start 0x0 length 0x3100 00:28:36.661 COMP_lvs0/lv0 : 3.01 3998.25 15.62 0.00 0.00 7948.31 129.71 13481.69 00:28:36.661 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:28:36.661 Verification LBA range: start 0x3100 length 0x3100 00:28:36.662 COMP_lvs0/lv0 : 3.00 4097.09 16.00 0.00 0.00 7774.40 118.98 14293.09 00:28:36.662 =================================================================================================================== 00:28:36.662 Total : 8095.34 31.62 0.00 0.00 7860.32 118.98 14293.09 00:28:36.662 0 00:28:36.662 16:07:41 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:28:36.662 16:07:41 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:28:36.662 16:07:41 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:28:36.662 16:07:42 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:28:36.662 16:07:42 compress_compdev -- compress/compress.sh@78 -- # killprocess 2837522 
00:28:36.662 16:07:42 compress_compdev -- common/autotest_common.sh@949 -- # '[' -z 2837522 ']' 00:28:36.662 16:07:42 compress_compdev -- common/autotest_common.sh@953 -- # kill -0 2837522 00:28:36.662 16:07:42 compress_compdev -- common/autotest_common.sh@954 -- # uname 00:28:36.662 16:07:42 compress_compdev -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:36.662 16:07:42 compress_compdev -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2837522 00:28:36.920 16:07:42 compress_compdev -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:28:36.920 16:07:42 compress_compdev -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:28:36.920 16:07:42 compress_compdev -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2837522' 00:28:36.920 killing process with pid 2837522 00:28:36.920 16:07:42 compress_compdev -- common/autotest_common.sh@968 -- # kill 2837522 00:28:36.920 Received shutdown signal, test time was about 3.000000 seconds 00:28:36.920 00:28:36.920 Latency(us) 00:28:36.920 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:36.921 =================================================================================================================== 00:28:36.921 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:36.921 16:07:42 compress_compdev -- common/autotest_common.sh@973 -- # wait 2837522 00:28:38.298 16:07:43 compress_compdev -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:28:38.298 16:07:43 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:28:38.298 16:07:43 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2839369 00:28:38.298 16:07:43 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:38.299 16:07:43 compress_compdev -- compress/compress.sh@67 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 
32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:28:38.299 16:07:43 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2839369 00:28:38.299 16:07:43 compress_compdev -- common/autotest_common.sh@830 -- # '[' -z 2839369 ']' 00:28:38.299 16:07:43 compress_compdev -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:38.299 16:07:43 compress_compdev -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:38.299 16:07:43 compress_compdev -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:38.299 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:38.299 16:07:43 compress_compdev -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:38.299 16:07:43 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:28:38.299 [2024-06-10 16:07:43.752695] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:28:38.299 [2024-06-10 16:07:43.752760] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2839369 ] 00:28:38.558 [2024-06-10 16:07:43.843531] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:38.558 [2024-06-10 16:07:43.938707] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:28:38.558 [2024-06-10 16:07:43.938713] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:28:39.126 [2024-06-10 16:07:44.498282] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:28:39.126 16:07:44 compress_compdev -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:39.126 16:07:44 compress_compdev -- common/autotest_common.sh@863 -- # return 0 00:28:39.126 16:07:44 compress_compdev -- compress/compress.sh@74 -- # create_vols 512 00:28:39.126 16:07:44 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:28:39.126 16:07:44 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:28:42.415 [2024-06-10 16:07:47.662944] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xf2ae80 PMD being used: compress_qat 00:28:42.415 16:07:47 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:28:42.415 16:07:47 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:28:42.415 16:07:47 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:42.415 16:07:47 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:28:42.415 16:07:47 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:42.415 16:07:47 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:42.415 16:07:47 
compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:42.674 16:07:47 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:28:42.932 [ 00:28:42.932 { 00:28:42.932 "name": "Nvme0n1", 00:28:42.932 "aliases": [ 00:28:42.932 "0a66f792-5b31-4c1e-a034-af503624e1b2" 00:28:42.932 ], 00:28:42.932 "product_name": "NVMe disk", 00:28:42.932 "block_size": 512, 00:28:42.932 "num_blocks": 1953525168, 00:28:42.932 "uuid": "0a66f792-5b31-4c1e-a034-af503624e1b2", 00:28:42.932 "assigned_rate_limits": { 00:28:42.932 "rw_ios_per_sec": 0, 00:28:42.932 "rw_mbytes_per_sec": 0, 00:28:42.932 "r_mbytes_per_sec": 0, 00:28:42.932 "w_mbytes_per_sec": 0 00:28:42.932 }, 00:28:42.932 "claimed": false, 00:28:42.932 "zoned": false, 00:28:42.932 "supported_io_types": { 00:28:42.932 "read": true, 00:28:42.932 "write": true, 00:28:42.932 "unmap": true, 00:28:42.932 "write_zeroes": true, 00:28:42.932 "flush": true, 00:28:42.932 "reset": true, 00:28:42.932 "compare": false, 00:28:42.932 "compare_and_write": false, 00:28:42.932 "abort": true, 00:28:42.932 "nvme_admin": true, 00:28:42.932 "nvme_io": true 00:28:42.932 }, 00:28:42.932 "driver_specific": { 00:28:42.932 "nvme": [ 00:28:42.932 { 00:28:42.932 "pci_address": "0000:5e:00.0", 00:28:42.932 "trid": { 00:28:42.932 "trtype": "PCIe", 00:28:42.932 "traddr": "0000:5e:00.0" 00:28:42.932 }, 00:28:42.932 "ctrlr_data": { 00:28:42.932 "cntlid": 0, 00:28:42.932 "vendor_id": "0x8086", 00:28:42.932 "model_number": "INTEL SSDPE2KX010T8", 00:28:42.932 "serial_number": "BTLJ807001JM1P0FGN", 00:28:42.932 "firmware_revision": "VDV10170", 00:28:42.932 "oacs": { 00:28:42.932 "security": 1, 00:28:42.932 "format": 1, 00:28:42.932 "firmware": 1, 00:28:42.932 "ns_manage": 1 00:28:42.932 }, 00:28:42.932 "multi_ctrlr": false, 00:28:42.932 "ana_reporting": false 00:28:42.932 }, 
00:28:42.932 "vs": { 00:28:42.932 "nvme_version": "1.2" 00:28:42.932 }, 00:28:42.932 "ns_data": { 00:28:42.932 "id": 1, 00:28:42.932 "can_share": false 00:28:42.932 }, 00:28:42.932 "security": { 00:28:42.932 "opal": true 00:28:42.932 } 00:28:42.932 } 00:28:42.932 ], 00:28:42.932 "mp_policy": "active_passive" 00:28:42.932 } 00:28:42.932 } 00:28:42.932 ] 00:28:42.932 16:07:48 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:28:42.932 16:07:48 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:28:42.932 [2024-06-10 16:07:48.436640] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xf2bdb0 PMD being used: compress_qat 00:28:43.868 c4cb10e3-b5ab-4d48-a0a4-351da7da59f1 00:28:43.868 16:07:49 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:28:44.127 6dd6ba39-9eab-4c95-b3e2-f544a3a57a8f 00:28:44.127 16:07:49 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:28:44.127 16:07:49 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:28:44.127 16:07:49 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:44.127 16:07:49 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:28:44.127 16:07:49 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:44.127 16:07:49 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:44.127 16:07:49 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:44.386 16:07:49 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:28:44.645 [ 00:28:44.645 { 00:28:44.645 "name": 
"6dd6ba39-9eab-4c95-b3e2-f544a3a57a8f", 00:28:44.645 "aliases": [ 00:28:44.645 "lvs0/lv0" 00:28:44.645 ], 00:28:44.645 "product_name": "Logical Volume", 00:28:44.645 "block_size": 512, 00:28:44.645 "num_blocks": 204800, 00:28:44.645 "uuid": "6dd6ba39-9eab-4c95-b3e2-f544a3a57a8f", 00:28:44.645 "assigned_rate_limits": { 00:28:44.645 "rw_ios_per_sec": 0, 00:28:44.645 "rw_mbytes_per_sec": 0, 00:28:44.645 "r_mbytes_per_sec": 0, 00:28:44.645 "w_mbytes_per_sec": 0 00:28:44.645 }, 00:28:44.645 "claimed": false, 00:28:44.645 "zoned": false, 00:28:44.645 "supported_io_types": { 00:28:44.645 "read": true, 00:28:44.645 "write": true, 00:28:44.645 "unmap": true, 00:28:44.645 "write_zeroes": true, 00:28:44.645 "flush": false, 00:28:44.645 "reset": true, 00:28:44.645 "compare": false, 00:28:44.645 "compare_and_write": false, 00:28:44.645 "abort": false, 00:28:44.645 "nvme_admin": false, 00:28:44.645 "nvme_io": false 00:28:44.645 }, 00:28:44.645 "driver_specific": { 00:28:44.645 "lvol": { 00:28:44.645 "lvol_store_uuid": "c4cb10e3-b5ab-4d48-a0a4-351da7da59f1", 00:28:44.645 "base_bdev": "Nvme0n1", 00:28:44.645 "thin_provision": true, 00:28:44.645 "num_allocated_clusters": 0, 00:28:44.645 "snapshot": false, 00:28:44.645 "clone": false, 00:28:44.645 "esnap_clone": false 00:28:44.645 } 00:28:44.645 } 00:28:44.645 } 00:28:44.645 ] 00:28:44.645 16:07:50 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:28:44.646 16:07:50 compress_compdev -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:28:44.646 16:07:50 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:28:44.904 [2024-06-10 16:07:50.250675] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:28:44.904 COMP_lvs0/lv0 00:28:44.904 16:07:50 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:28:44.904 16:07:50 
compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:28:44.904 16:07:50 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:44.904 16:07:50 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:28:44.904 16:07:50 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:44.904 16:07:50 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:44.904 16:07:50 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:45.164 16:07:50 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:28:45.422 [ 00:28:45.422 { 00:28:45.422 "name": "COMP_lvs0/lv0", 00:28:45.422 "aliases": [ 00:28:45.422 "0ccfa9b8-67e3-596e-9e7f-8ce6b9b7b7a7" 00:28:45.422 ], 00:28:45.422 "product_name": "compress", 00:28:45.422 "block_size": 512, 00:28:45.422 "num_blocks": 200704, 00:28:45.422 "uuid": "0ccfa9b8-67e3-596e-9e7f-8ce6b9b7b7a7", 00:28:45.422 "assigned_rate_limits": { 00:28:45.422 "rw_ios_per_sec": 0, 00:28:45.422 "rw_mbytes_per_sec": 0, 00:28:45.422 "r_mbytes_per_sec": 0, 00:28:45.422 "w_mbytes_per_sec": 0 00:28:45.422 }, 00:28:45.422 "claimed": false, 00:28:45.422 "zoned": false, 00:28:45.422 "supported_io_types": { 00:28:45.422 "read": true, 00:28:45.422 "write": true, 00:28:45.422 "unmap": false, 00:28:45.422 "write_zeroes": true, 00:28:45.422 "flush": false, 00:28:45.422 "reset": false, 00:28:45.422 "compare": false, 00:28:45.422 "compare_and_write": false, 00:28:45.422 "abort": false, 00:28:45.422 "nvme_admin": false, 00:28:45.422 "nvme_io": false 00:28:45.422 }, 00:28:45.422 "driver_specific": { 00:28:45.422 "compress": { 00:28:45.423 "name": "COMP_lvs0/lv0", 00:28:45.423 "base_bdev_name": "6dd6ba39-9eab-4c95-b3e2-f544a3a57a8f" 00:28:45.423 } 00:28:45.423 } 00:28:45.423 } 
00:28:45.423 ] 00:28:45.423 16:07:50 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:28:45.423 16:07:50 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:45.423 [2024-06-10 16:07:50.892861] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f25641b15c0 PMD being used: compress_qat 00:28:45.423 [2024-06-10 16:07:50.894893] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0xf26d40 PMD being used: compress_qat 00:28:45.423 Running I/O for 3 seconds... 00:28:48.712 00:28:48.712 Latency(us) 00:28:48.712 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:48.712 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:28:48.712 Verification LBA range: start 0x0 length 0x3100 00:28:48.712 COMP_lvs0/lv0 : 3.01 4008.21 15.66 0.00 0.00 7919.95 128.73 13107.20 00:28:48.712 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:28:48.712 Verification LBA range: start 0x3100 length 0x3100 00:28:48.712 COMP_lvs0/lv0 : 3.01 4099.13 16.01 0.00 0.00 7763.37 120.44 12857.54 00:28:48.712 =================================================================================================================== 00:28:48.712 Total : 8107.34 31.67 0.00 0.00 7840.78 120.44 13107.20 00:28:48.712 0 00:28:48.712 16:07:53 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:28:48.712 16:07:53 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:28:48.712 16:07:54 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:28:48.972 16:07:54 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:28:48.972 16:07:54 compress_compdev -- compress/compress.sh@78 -- # 
killprocess 2839369 00:28:48.972 16:07:54 compress_compdev -- common/autotest_common.sh@949 -- # '[' -z 2839369 ']' 00:28:48.972 16:07:54 compress_compdev -- common/autotest_common.sh@953 -- # kill -0 2839369 00:28:48.972 16:07:54 compress_compdev -- common/autotest_common.sh@954 -- # uname 00:28:48.972 16:07:54 compress_compdev -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:28:48.972 16:07:54 compress_compdev -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2839369 00:28:49.231 16:07:54 compress_compdev -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:28:49.231 16:07:54 compress_compdev -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:28:49.231 16:07:54 compress_compdev -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2839369' 00:28:49.231 killing process with pid 2839369 00:28:49.231 16:07:54 compress_compdev -- common/autotest_common.sh@968 -- # kill 2839369 00:28:49.231 Received shutdown signal, test time was about 3.000000 seconds 00:28:49.231 00:28:49.231 Latency(us) 00:28:49.231 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:28:49.231 =================================================================================================================== 00:28:49.231 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:28:49.231 16:07:54 compress_compdev -- common/autotest_common.sh@973 -- # wait 2839369 00:28:50.608 16:07:55 compress_compdev -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:28:50.608 16:07:55 compress_compdev -- compress/compress.sh@66 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:28:50.608 16:07:55 compress_compdev -- compress/compress.sh@71 -- # bdevperf_pid=2841421 00:28:50.608 16:07:55 compress_compdev -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:28:50.609 16:07:55 compress_compdev -- compress/compress.sh@67 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 -c /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json 00:28:50.609 16:07:55 compress_compdev -- compress/compress.sh@73 -- # waitforlisten 2841421 00:28:50.609 16:07:55 compress_compdev -- common/autotest_common.sh@830 -- # '[' -z 2841421 ']' 00:28:50.609 16:07:55 compress_compdev -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:28:50.609 16:07:55 compress_compdev -- common/autotest_common.sh@835 -- # local max_retries=100 00:28:50.609 16:07:55 compress_compdev -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:28:50.609 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:28:50.609 16:07:55 compress_compdev -- common/autotest_common.sh@839 -- # xtrace_disable 00:28:50.609 16:07:55 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:28:50.609 [2024-06-10 16:07:56.018723] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:28:50.609 [2024-06-10 16:07:56.018782] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2841421 ] 00:28:50.609 [2024-06-10 16:07:56.111939] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:28:50.866 [2024-06-10 16:07:56.207627] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:28:50.866 [2024-06-10 16:07:56.207633] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:28:51.432 [2024-06-10 16:07:56.755970] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:28:51.691 16:07:56 compress_compdev -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:28:51.691 16:07:56 compress_compdev -- common/autotest_common.sh@863 -- # return 0 00:28:51.691 16:07:56 compress_compdev -- compress/compress.sh@74 -- # create_vols 4096 00:28:51.691 16:07:56 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:28:51.691 16:07:56 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:28:54.976 [2024-06-10 16:08:00.078890] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x270ee80 PMD being used: compress_qat 00:28:54.976 16:08:00 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:28:54.976 16:08:00 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:28:54.976 16:08:00 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:54.976 16:08:00 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:28:54.976 16:08:00 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:54.976 16:08:00 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:54.976 16:08:00 
compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:54.976 16:08:00 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:28:55.234 [ 00:28:55.234 { 00:28:55.234 "name": "Nvme0n1", 00:28:55.234 "aliases": [ 00:28:55.234 "b9064dda-915d-46d4-82a7-48ab6a029987" 00:28:55.234 ], 00:28:55.234 "product_name": "NVMe disk", 00:28:55.234 "block_size": 512, 00:28:55.234 "num_blocks": 1953525168, 00:28:55.234 "uuid": "b9064dda-915d-46d4-82a7-48ab6a029987", 00:28:55.234 "assigned_rate_limits": { 00:28:55.234 "rw_ios_per_sec": 0, 00:28:55.234 "rw_mbytes_per_sec": 0, 00:28:55.234 "r_mbytes_per_sec": 0, 00:28:55.234 "w_mbytes_per_sec": 0 00:28:55.234 }, 00:28:55.234 "claimed": false, 00:28:55.234 "zoned": false, 00:28:55.234 "supported_io_types": { 00:28:55.234 "read": true, 00:28:55.234 "write": true, 00:28:55.234 "unmap": true, 00:28:55.234 "write_zeroes": true, 00:28:55.234 "flush": true, 00:28:55.234 "reset": true, 00:28:55.234 "compare": false, 00:28:55.234 "compare_and_write": false, 00:28:55.234 "abort": true, 00:28:55.234 "nvme_admin": true, 00:28:55.234 "nvme_io": true 00:28:55.234 }, 00:28:55.234 "driver_specific": { 00:28:55.234 "nvme": [ 00:28:55.234 { 00:28:55.234 "pci_address": "0000:5e:00.0", 00:28:55.234 "trid": { 00:28:55.234 "trtype": "PCIe", 00:28:55.234 "traddr": "0000:5e:00.0" 00:28:55.234 }, 00:28:55.234 "ctrlr_data": { 00:28:55.234 "cntlid": 0, 00:28:55.234 "vendor_id": "0x8086", 00:28:55.234 "model_number": "INTEL SSDPE2KX010T8", 00:28:55.234 "serial_number": "BTLJ807001JM1P0FGN", 00:28:55.234 "firmware_revision": "VDV10170", 00:28:55.234 "oacs": { 00:28:55.234 "security": 1, 00:28:55.235 "format": 1, 00:28:55.235 "firmware": 1, 00:28:55.235 "ns_manage": 1 00:28:55.235 }, 00:28:55.235 "multi_ctrlr": false, 00:28:55.235 "ana_reporting": false 00:28:55.235 }, 
00:28:55.235 "vs": { 00:28:55.235 "nvme_version": "1.2" 00:28:55.235 }, 00:28:55.235 "ns_data": { 00:28:55.235 "id": 1, 00:28:55.235 "can_share": false 00:28:55.235 }, 00:28:55.235 "security": { 00:28:55.235 "opal": true 00:28:55.235 } 00:28:55.235 } 00:28:55.235 ], 00:28:55.235 "mp_policy": "active_passive" 00:28:55.235 } 00:28:55.235 } 00:28:55.235 ] 00:28:55.235 16:08:00 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:28:55.235 16:08:00 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:28:55.493 [2024-06-10 16:08:00.876505] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x270fdb0 PMD being used: compress_qat 00:28:56.427 9214e72e-58ce-46fb-a1ae-ade09e2e2b38 00:28:56.427 16:08:01 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:28:56.685 d4ea0b7d-b0c4-4f2c-ae39-fc41998e7056 00:28:56.685 16:08:02 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:28:56.685 16:08:02 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:28:56.685 16:08:02 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:56.685 16:08:02 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:28:56.685 16:08:02 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:56.685 16:08:02 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:56.685 16:08:02 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:56.943 16:08:02 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:28:57.201 [ 00:28:57.201 { 00:28:57.201 "name": 
"d4ea0b7d-b0c4-4f2c-ae39-fc41998e7056", 00:28:57.201 "aliases": [ 00:28:57.201 "lvs0/lv0" 00:28:57.201 ], 00:28:57.201 "product_name": "Logical Volume", 00:28:57.201 "block_size": 512, 00:28:57.201 "num_blocks": 204800, 00:28:57.201 "uuid": "d4ea0b7d-b0c4-4f2c-ae39-fc41998e7056", 00:28:57.201 "assigned_rate_limits": { 00:28:57.201 "rw_ios_per_sec": 0, 00:28:57.201 "rw_mbytes_per_sec": 0, 00:28:57.201 "r_mbytes_per_sec": 0, 00:28:57.201 "w_mbytes_per_sec": 0 00:28:57.201 }, 00:28:57.201 "claimed": false, 00:28:57.201 "zoned": false, 00:28:57.201 "supported_io_types": { 00:28:57.201 "read": true, 00:28:57.201 "write": true, 00:28:57.201 "unmap": true, 00:28:57.202 "write_zeroes": true, 00:28:57.202 "flush": false, 00:28:57.202 "reset": true, 00:28:57.202 "compare": false, 00:28:57.202 "compare_and_write": false, 00:28:57.202 "abort": false, 00:28:57.202 "nvme_admin": false, 00:28:57.202 "nvme_io": false 00:28:57.202 }, 00:28:57.202 "driver_specific": { 00:28:57.202 "lvol": { 00:28:57.202 "lvol_store_uuid": "9214e72e-58ce-46fb-a1ae-ade09e2e2b38", 00:28:57.202 "base_bdev": "Nvme0n1", 00:28:57.202 "thin_provision": true, 00:28:57.202 "num_allocated_clusters": 0, 00:28:57.202 "snapshot": false, 00:28:57.202 "clone": false, 00:28:57.202 "esnap_clone": false 00:28:57.202 } 00:28:57.202 } 00:28:57.202 } 00:28:57.202 ] 00:28:57.202 16:08:02 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:28:57.202 16:08:02 compress_compdev -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:28:57.202 16:08:02 compress_compdev -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:28:57.460 [2024-06-10 16:08:02.779715] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:28:57.460 COMP_lvs0/lv0 00:28:57.460 16:08:02 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:28:57.460 16:08:02 
compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:28:57.460 16:08:02 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:28:57.460 16:08:02 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:28:57.460 16:08:02 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:28:57.460 16:08:02 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:28:57.460 16:08:02 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:28:57.717 16:08:03 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:28:57.975 [ 00:28:57.975 { 00:28:57.975 "name": "COMP_lvs0/lv0", 00:28:57.975 "aliases": [ 00:28:57.975 "e3278581-6b92-5f53-8265-993ca6f3e8f0" 00:28:57.975 ], 00:28:57.975 "product_name": "compress", 00:28:57.975 "block_size": 4096, 00:28:57.975 "num_blocks": 25088, 00:28:57.975 "uuid": "e3278581-6b92-5f53-8265-993ca6f3e8f0", 00:28:57.975 "assigned_rate_limits": { 00:28:57.975 "rw_ios_per_sec": 0, 00:28:57.975 "rw_mbytes_per_sec": 0, 00:28:57.975 "r_mbytes_per_sec": 0, 00:28:57.975 "w_mbytes_per_sec": 0 00:28:57.975 }, 00:28:57.975 "claimed": false, 00:28:57.975 "zoned": false, 00:28:57.975 "supported_io_types": { 00:28:57.975 "read": true, 00:28:57.975 "write": true, 00:28:57.975 "unmap": false, 00:28:57.975 "write_zeroes": true, 00:28:57.975 "flush": false, 00:28:57.975 "reset": false, 00:28:57.975 "compare": false, 00:28:57.975 "compare_and_write": false, 00:28:57.975 "abort": false, 00:28:57.975 "nvme_admin": false, 00:28:57.975 "nvme_io": false 00:28:57.975 }, 00:28:57.975 "driver_specific": { 00:28:57.975 "compress": { 00:28:57.975 "name": "COMP_lvs0/lv0", 00:28:57.975 "base_bdev_name": "d4ea0b7d-b0c4-4f2c-ae39-fc41998e7056" 00:28:57.975 } 00:28:57.975 } 00:28:57.975 } 
00:28:57.975 ] 00:28:57.975 16:08:03 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:28:57.975 16:08:03 compress_compdev -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:28:57.975 [2024-06-10 16:08:03.450073] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7eff2c1b15c0 PMD being used: compress_qat 00:28:57.975 [2024-06-10 16:08:03.452073] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x270ad40 PMD being used: compress_qat 00:28:57.975 Running I/O for 3 seconds... 00:29:01.260 00:29:01.260 Latency(us) 00:29:01.260 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:01.260 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:01.260 Verification LBA range: start 0x0 length 0x3100 00:29:01.260 COMP_lvs0/lv0 : 3.01 3883.24 15.17 0.00 0.00 8180.61 176.52 13169.62 00:29:01.260 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:01.260 Verification LBA range: start 0x3100 length 0x3100 00:29:01.260 COMP_lvs0/lv0 : 3.01 3959.56 15.47 0.00 0.00 8042.29 164.82 14105.84 00:29:01.260 =================================================================================================================== 00:29:01.260 Total : 7842.80 30.64 0.00 0.00 8110.79 164.82 14105.84 00:29:01.260 0 00:29:01.260 16:08:06 compress_compdev -- compress/compress.sh@76 -- # destroy_vols 00:29:01.260 16:08:06 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:01.260 16:08:06 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:01.518 16:08:07 compress_compdev -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:01.518 16:08:07 compress_compdev -- compress/compress.sh@78 -- # 
killprocess 2841421 00:29:01.518 16:08:07 compress_compdev -- common/autotest_common.sh@949 -- # '[' -z 2841421 ']' 00:29:01.518 16:08:07 compress_compdev -- common/autotest_common.sh@953 -- # kill -0 2841421 00:29:01.518 16:08:07 compress_compdev -- common/autotest_common.sh@954 -- # uname 00:29:01.776 16:08:07 compress_compdev -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:29:01.776 16:08:07 compress_compdev -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2841421 00:29:01.776 16:08:07 compress_compdev -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:29:01.776 16:08:07 compress_compdev -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:29:01.776 16:08:07 compress_compdev -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2841421' 00:29:01.776 killing process with pid 2841421 00:29:01.776 16:08:07 compress_compdev -- common/autotest_common.sh@968 -- # kill 2841421 00:29:01.776 Received shutdown signal, test time was about 3.000000 seconds 00:29:01.776 00:29:01.776 Latency(us) 00:29:01.776 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:01.776 =================================================================================================================== 00:29:01.776 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:01.776 16:08:07 compress_compdev -- common/autotest_common.sh@973 -- # wait 2841421 00:29:03.151 16:08:08 compress_compdev -- compress/compress.sh@89 -- # run_bdevio 00:29:03.151 16:08:08 compress_compdev -- compress/compress.sh@50 -- # [[ compdev == \c\o\m\p\d\e\v ]] 00:29:03.151 16:08:08 compress_compdev -- compress/compress.sh@55 -- # bdevio_pid=2843419 00:29:03.151 16:08:08 compress_compdev -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:03.151 16:08:08 compress_compdev -- compress/compress.sh@51 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -c 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/dpdk.json -w 00:29:03.151 16:08:08 compress_compdev -- compress/compress.sh@57 -- # waitforlisten 2843419 00:29:03.151 16:08:08 compress_compdev -- common/autotest_common.sh@830 -- # '[' -z 2843419 ']' 00:29:03.151 16:08:08 compress_compdev -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:03.151 16:08:08 compress_compdev -- common/autotest_common.sh@835 -- # local max_retries=100 00:29:03.151 16:08:08 compress_compdev -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:03.151 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:03.151 16:08:08 compress_compdev -- common/autotest_common.sh@839 -- # xtrace_disable 00:29:03.151 16:08:08 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:03.151 [2024-06-10 16:08:08.599168] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:29:03.151 [2024-06-10 16:08:08.599228] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2843419 ] 00:29:03.409 [2024-06-10 16:08:08.699778] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:03.410 [2024-06-10 16:08:08.797754] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:29:03.410 [2024-06-10 16:08:08.797848] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:29:03.410 [2024-06-10 16:08:08.797852] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:29:03.977 [2024-06-10 16:08:09.342341] accel_dpdk_compressdev.c: 296:accel_init_compress_drivers: *NOTICE*: initialized QAT PMD 00:29:04.235 16:08:09 compress_compdev -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:29:04.235 16:08:09 compress_compdev -- common/autotest_common.sh@863 -- # return 0 00:29:04.235 16:08:09 compress_compdev -- compress/compress.sh@58 -- # create_vols 00:29:04.235 16:08:09 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:04.235 16:08:09 compress_compdev -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:07.521 [2024-06-10 16:08:12.647638] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x1364900 PMD being used: compress_qat 00:29:07.521 16:08:12 compress_compdev -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:07.521 16:08:12 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:29:07.521 16:08:12 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:07.521 16:08:12 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:29:07.521 16:08:12 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:07.521 
16:08:12 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:07.521 16:08:12 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:07.521 16:08:12 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:07.779 [ 00:29:07.779 { 00:29:07.779 "name": "Nvme0n1", 00:29:07.779 "aliases": [ 00:29:07.779 "a245f46a-722d-448b-9146-e5a90be3b6c4" 00:29:07.779 ], 00:29:07.779 "product_name": "NVMe disk", 00:29:07.779 "block_size": 512, 00:29:07.779 "num_blocks": 1953525168, 00:29:07.779 "uuid": "a245f46a-722d-448b-9146-e5a90be3b6c4", 00:29:07.779 "assigned_rate_limits": { 00:29:07.779 "rw_ios_per_sec": 0, 00:29:07.779 "rw_mbytes_per_sec": 0, 00:29:07.779 "r_mbytes_per_sec": 0, 00:29:07.779 "w_mbytes_per_sec": 0 00:29:07.779 }, 00:29:07.779 "claimed": false, 00:29:07.779 "zoned": false, 00:29:07.779 "supported_io_types": { 00:29:07.779 "read": true, 00:29:07.779 "write": true, 00:29:07.779 "unmap": true, 00:29:07.779 "write_zeroes": true, 00:29:07.779 "flush": true, 00:29:07.779 "reset": true, 00:29:07.779 "compare": false, 00:29:07.779 "compare_and_write": false, 00:29:07.779 "abort": true, 00:29:07.779 "nvme_admin": true, 00:29:07.779 "nvme_io": true 00:29:07.779 }, 00:29:07.779 "driver_specific": { 00:29:07.779 "nvme": [ 00:29:07.780 { 00:29:07.780 "pci_address": "0000:5e:00.0", 00:29:07.780 "trid": { 00:29:07.780 "trtype": "PCIe", 00:29:07.780 "traddr": "0000:5e:00.0" 00:29:07.780 }, 00:29:07.780 "ctrlr_data": { 00:29:07.780 "cntlid": 0, 00:29:07.780 "vendor_id": "0x8086", 00:29:07.780 "model_number": "INTEL SSDPE2KX010T8", 00:29:07.780 "serial_number": "BTLJ807001JM1P0FGN", 00:29:07.780 "firmware_revision": "VDV10170", 00:29:07.780 "oacs": { 00:29:07.780 "security": 1, 00:29:07.780 "format": 1, 00:29:07.780 "firmware": 1, 00:29:07.780 "ns_manage": 1 
00:29:07.780 }, 00:29:07.780 "multi_ctrlr": false, 00:29:07.780 "ana_reporting": false 00:29:07.780 }, 00:29:07.780 "vs": { 00:29:07.780 "nvme_version": "1.2" 00:29:07.780 }, 00:29:07.780 "ns_data": { 00:29:07.780 "id": 1, 00:29:07.780 "can_share": false 00:29:07.780 }, 00:29:07.780 "security": { 00:29:07.780 "opal": true 00:29:07.780 } 00:29:07.780 } 00:29:07.780 ], 00:29:07.780 "mp_policy": "active_passive" 00:29:07.780 } 00:29:07.780 } 00:29:07.780 ] 00:29:07.780 16:08:13 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:29:07.780 16:08:13 compress_compdev -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:08.038 [2024-06-10 16:08:13.449186] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x11c9ba0 PMD being used: compress_qat 00:29:08.973 d0ae5695-74e8-47b2-993d-ef3a626a8808 00:29:08.973 16:08:14 compress_compdev -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:09.233 a86c91c4-b733-405b-a7e2-17bc0e79f798 00:29:09.233 16:08:14 compress_compdev -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:09.233 16:08:14 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:29:09.233 16:08:14 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:09.233 16:08:14 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:29:09.233 16:08:14 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:09.233 16:08:14 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:09.233 16:08:14 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:09.540 16:08:14 compress_compdev -- common/autotest_common.sh@905 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:09.799 [ 00:29:09.799 { 00:29:09.799 "name": "a86c91c4-b733-405b-a7e2-17bc0e79f798", 00:29:09.799 "aliases": [ 00:29:09.799 "lvs0/lv0" 00:29:09.799 ], 00:29:09.799 "product_name": "Logical Volume", 00:29:09.799 "block_size": 512, 00:29:09.799 "num_blocks": 204800, 00:29:09.799 "uuid": "a86c91c4-b733-405b-a7e2-17bc0e79f798", 00:29:09.799 "assigned_rate_limits": { 00:29:09.799 "rw_ios_per_sec": 0, 00:29:09.799 "rw_mbytes_per_sec": 0, 00:29:09.799 "r_mbytes_per_sec": 0, 00:29:09.799 "w_mbytes_per_sec": 0 00:29:09.799 }, 00:29:09.799 "claimed": false, 00:29:09.799 "zoned": false, 00:29:09.799 "supported_io_types": { 00:29:09.799 "read": true, 00:29:09.799 "write": true, 00:29:09.799 "unmap": true, 00:29:09.799 "write_zeroes": true, 00:29:09.799 "flush": false, 00:29:09.799 "reset": true, 00:29:09.799 "compare": false, 00:29:09.799 "compare_and_write": false, 00:29:09.799 "abort": false, 00:29:09.799 "nvme_admin": false, 00:29:09.799 "nvme_io": false 00:29:09.799 }, 00:29:09.799 "driver_specific": { 00:29:09.799 "lvol": { 00:29:09.799 "lvol_store_uuid": "d0ae5695-74e8-47b2-993d-ef3a626a8808", 00:29:09.799 "base_bdev": "Nvme0n1", 00:29:09.799 "thin_provision": true, 00:29:09.799 "num_allocated_clusters": 0, 00:29:09.799 "snapshot": false, 00:29:09.799 "clone": false, 00:29:09.799 "esnap_clone": false 00:29:09.799 } 00:29:09.799 } 00:29:09.799 } 00:29:09.799 ] 00:29:09.799 16:08:15 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:29:09.799 16:08:15 compress_compdev -- compress/compress.sh@41 -- # '[' -z '' ']' 00:29:09.799 16:08:15 compress_compdev -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:29:10.058 [2024-06-10 16:08:15.372093] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:10.058 
COMP_lvs0/lv0 00:29:10.058 16:08:15 compress_compdev -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:10.058 16:08:15 compress_compdev -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:29:10.058 16:08:15 compress_compdev -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:10.058 16:08:15 compress_compdev -- common/autotest_common.sh@900 -- # local i 00:29:10.058 16:08:15 compress_compdev -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:10.058 16:08:15 compress_compdev -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:10.058 16:08:15 compress_compdev -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:10.316 16:08:15 compress_compdev -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:10.575 [ 00:29:10.575 { 00:29:10.575 "name": "COMP_lvs0/lv0", 00:29:10.575 "aliases": [ 00:29:10.575 "309c029d-6a30-567b-b9c1-6e3689034342" 00:29:10.575 ], 00:29:10.575 "product_name": "compress", 00:29:10.575 "block_size": 512, 00:29:10.575 "num_blocks": 200704, 00:29:10.575 "uuid": "309c029d-6a30-567b-b9c1-6e3689034342", 00:29:10.575 "assigned_rate_limits": { 00:29:10.575 "rw_ios_per_sec": 0, 00:29:10.575 "rw_mbytes_per_sec": 0, 00:29:10.575 "r_mbytes_per_sec": 0, 00:29:10.575 "w_mbytes_per_sec": 0 00:29:10.575 }, 00:29:10.575 "claimed": false, 00:29:10.575 "zoned": false, 00:29:10.575 "supported_io_types": { 00:29:10.575 "read": true, 00:29:10.575 "write": true, 00:29:10.575 "unmap": false, 00:29:10.575 "write_zeroes": true, 00:29:10.575 "flush": false, 00:29:10.575 "reset": false, 00:29:10.575 "compare": false, 00:29:10.575 "compare_and_write": false, 00:29:10.575 "abort": false, 00:29:10.575 "nvme_admin": false, 00:29:10.575 "nvme_io": false 00:29:10.575 }, 00:29:10.575 "driver_specific": { 00:29:10.575 "compress": { 00:29:10.575 "name": 
"COMP_lvs0/lv0", 00:29:10.575 "base_bdev_name": "a86c91c4-b733-405b-a7e2-17bc0e79f798" 00:29:10.575 } 00:29:10.575 } 00:29:10.575 } 00:29:10.575 ] 00:29:10.575 16:08:15 compress_compdev -- common/autotest_common.sh@906 -- # return 0 00:29:10.575 16:08:15 compress_compdev -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:29:10.575 [2024-06-10 16:08:16.041394] accel_dpdk_compressdev.c: 690:_set_pmd: *NOTICE*: Channel 0x7f67281b1350 PMD being used: compress_qat 00:29:10.575 I/O targets: 00:29:10.575 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:29:10.575 00:29:10.575 00:29:10.575 CUnit - A unit testing framework for C - Version 2.1-3 00:29:10.575 http://cunit.sourceforge.net/ 00:29:10.575 00:29:10.575 00:29:10.575 Suite: bdevio tests on: COMP_lvs0/lv0 00:29:10.575 Test: blockdev write read block ...passed 00:29:10.575 Test: blockdev write zeroes read block ...passed 00:29:10.575 Test: blockdev write zeroes read no split ...passed 00:29:10.575 Test: blockdev write zeroes read split ...passed 00:29:10.834 Test: blockdev write zeroes read split partial ...passed 00:29:10.834 Test: blockdev reset ...[2024-06-10 16:08:16.094723] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:29:10.834 passed 00:29:10.834 Test: blockdev write read 8 blocks ...passed 00:29:10.834 Test: blockdev write read size > 128k ...passed 00:29:10.834 Test: blockdev write read invalid size ...passed 00:29:10.834 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:29:10.834 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:29:10.834 Test: blockdev write read max offset ...passed 00:29:10.834 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:29:10.834 Test: blockdev writev readv 8 blocks ...passed 00:29:10.834 Test: blockdev writev readv 30 x 1block ...passed 00:29:10.834 Test: blockdev writev readv block ...passed 
00:29:10.834 Test: blockdev writev readv size > 128k ...passed 00:29:10.834 Test: blockdev writev readv size > 128k in two iovs ...passed 00:29:10.834 Test: blockdev comparev and writev ...passed 00:29:10.834 Test: blockdev nvme passthru rw ...passed 00:29:10.834 Test: blockdev nvme passthru vendor specific ...passed 00:29:10.834 Test: blockdev nvme admin passthru ...passed 00:29:10.834 Test: blockdev copy ...passed 00:29:10.834 00:29:10.834 Run Summary: Type Total Ran Passed Failed Inactive 00:29:10.834 suites 1 1 n/a 0 0 00:29:10.834 tests 23 23 23 0 0 00:29:10.834 asserts 130 130 130 0 n/a 00:29:10.834 00:29:10.834 Elapsed time = 0.159 seconds 00:29:10.834 0 00:29:10.834 16:08:16 compress_compdev -- compress/compress.sh@60 -- # destroy_vols 00:29:10.834 16:08:16 compress_compdev -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:11.092 16:08:16 compress_compdev -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:11.351 16:08:16 compress_compdev -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:29:11.351 16:08:16 compress_compdev -- compress/compress.sh@62 -- # killprocess 2843419 00:29:11.351 16:08:16 compress_compdev -- common/autotest_common.sh@949 -- # '[' -z 2843419 ']' 00:29:11.351 16:08:16 compress_compdev -- common/autotest_common.sh@953 -- # kill -0 2843419 00:29:11.351 16:08:16 compress_compdev -- common/autotest_common.sh@954 -- # uname 00:29:11.351 16:08:16 compress_compdev -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:29:11.351 16:08:16 compress_compdev -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2843419 00:29:11.351 16:08:16 compress_compdev -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:29:11.351 16:08:16 compress_compdev -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:29:11.351 16:08:16 
compress_compdev -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2843419' 00:29:11.351 killing process with pid 2843419 00:29:11.351 16:08:16 compress_compdev -- common/autotest_common.sh@968 -- # kill 2843419 00:29:11.351 16:08:16 compress_compdev -- common/autotest_common.sh@973 -- # wait 2843419 00:29:12.727 16:08:18 compress_compdev -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:29:12.727 16:08:18 compress_compdev -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:29:12.727 00:29:12.727 real 0m46.811s 00:29:12.727 user 1m48.380s 00:29:12.727 sys 0m4.289s 00:29:12.727 16:08:18 compress_compdev -- common/autotest_common.sh@1125 -- # xtrace_disable 00:29:12.727 16:08:18 compress_compdev -- common/autotest_common.sh@10 -- # set +x 00:29:12.727 ************************************ 00:29:12.727 END TEST compress_compdev 00:29:12.727 ************************************ 00:29:12.986 16:08:18 -- spdk/autotest.sh@349 -- # run_test compress_isal /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:29:12.986 16:08:18 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:29:12.986 16:08:18 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:29:12.986 16:08:18 -- common/autotest_common.sh@10 -- # set +x 00:29:12.986 ************************************ 00:29:12.986 START TEST compress_isal 00:29:12.986 ************************************ 00:29:12.986 16:08:18 compress_isal -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress/compress.sh isal 00:29:12.986 * Looking for test storage... 
00:29:12.986 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/compress 00:29:12.986 16:08:18 compress_isal -- compress/compress.sh@13 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@7 -- # uname -s 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:29:12.986 16:08:18 compress_isal -- 
scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:29:12.986 16:08:18 compress_isal -- scripts/common.sh@516 -- # [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:29:12.986 16:08:18 compress_isal -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:29:12.986 16:08:18 compress_isal -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:12.986 16:08:18 compress_isal -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:12.986 16:08:18 compress_isal -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:12.986 16:08:18 compress_isal -- paths/export.sh@5 -- # export PATH 00:29:12.986 16:08:18 compress_isal -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@47 -- # : 0 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:29:12.986 16:08:18 compress_isal -- nvmf/common.sh@51 -- # have_pci_nics=0 00:29:12.986 16:08:18 compress_isal -- compress/compress.sh@17 -- # rpc_py=/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 00:29:12.986 16:08:18 compress_isal -- compress/compress.sh@81 -- # mkdir -p /tmp/pmem 00:29:12.986 16:08:18 compress_isal -- compress/compress.sh@82 -- # test_type=isal 00:29:12.986 16:08:18 compress_isal -- compress/compress.sh@86 -- # run_bdevperf 32 4096 3 00:29:12.986 16:08:18 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:29:12.986 16:08:18 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2845031 00:29:12.986 16:08:18 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:12.986 16:08:18 compress_isal -- 
compress/compress.sh@73 -- # waitforlisten 2845031 00:29:12.986 16:08:18 compress_isal -- common/autotest_common.sh@830 -- # '[' -z 2845031 ']' 00:29:12.986 16:08:18 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:29:12.986 16:08:18 compress_isal -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:12.986 16:08:18 compress_isal -- common/autotest_common.sh@835 -- # local max_retries=100 00:29:12.986 16:08:18 compress_isal -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:12.986 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:12.987 16:08:18 compress_isal -- common/autotest_common.sh@839 -- # xtrace_disable 00:29:12.987 16:08:18 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:29:12.987 [2024-06-10 16:08:18.450388] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:29:12.987 [2024-06-10 16:08:18.450451] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2845031 ] 00:29:13.246 [2024-06-10 16:08:18.542772] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:13.246 [2024-06-10 16:08:18.638497] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:29:13.246 [2024-06-10 16:08:18.638503] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:29:14.181 16:08:19 compress_isal -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:29:14.181 16:08:19 compress_isal -- common/autotest_common.sh@863 -- # return 0 00:29:14.181 16:08:19 compress_isal -- compress/compress.sh@74 -- # create_vols 00:29:14.181 16:08:19 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:14.181 16:08:19 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:17.466 16:08:22 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:17.466 16:08:22 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:29:17.466 16:08:22 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:17.466 16:08:22 compress_isal -- common/autotest_common.sh@900 -- # local i 00:29:17.466 16:08:22 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:17.466 16:08:22 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:17.466 16:08:22 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:17.466 16:08:22 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 
00:29:17.466 [ 00:29:17.466 { 00:29:17.466 "name": "Nvme0n1", 00:29:17.466 "aliases": [ 00:29:17.466 "1fb5319e-fdf7-4fd8-ab0b-6d2c8f255626" 00:29:17.466 ], 00:29:17.466 "product_name": "NVMe disk", 00:29:17.466 "block_size": 512, 00:29:17.466 "num_blocks": 1953525168, 00:29:17.466 "uuid": "1fb5319e-fdf7-4fd8-ab0b-6d2c8f255626", 00:29:17.466 "assigned_rate_limits": { 00:29:17.466 "rw_ios_per_sec": 0, 00:29:17.466 "rw_mbytes_per_sec": 0, 00:29:17.466 "r_mbytes_per_sec": 0, 00:29:17.466 "w_mbytes_per_sec": 0 00:29:17.466 }, 00:29:17.466 "claimed": false, 00:29:17.466 "zoned": false, 00:29:17.466 "supported_io_types": { 00:29:17.466 "read": true, 00:29:17.466 "write": true, 00:29:17.466 "unmap": true, 00:29:17.466 "write_zeroes": true, 00:29:17.466 "flush": true, 00:29:17.466 "reset": true, 00:29:17.466 "compare": false, 00:29:17.466 "compare_and_write": false, 00:29:17.466 "abort": true, 00:29:17.466 "nvme_admin": true, 00:29:17.466 "nvme_io": true 00:29:17.466 }, 00:29:17.466 "driver_specific": { 00:29:17.466 "nvme": [ 00:29:17.466 { 00:29:17.466 "pci_address": "0000:5e:00.0", 00:29:17.466 "trid": { 00:29:17.466 "trtype": "PCIe", 00:29:17.466 "traddr": "0000:5e:00.0" 00:29:17.466 }, 00:29:17.466 "ctrlr_data": { 00:29:17.466 "cntlid": 0, 00:29:17.466 "vendor_id": "0x8086", 00:29:17.466 "model_number": "INTEL SSDPE2KX010T8", 00:29:17.466 "serial_number": "BTLJ807001JM1P0FGN", 00:29:17.467 "firmware_revision": "VDV10170", 00:29:17.467 "oacs": { 00:29:17.467 "security": 1, 00:29:17.467 "format": 1, 00:29:17.467 "firmware": 1, 00:29:17.467 "ns_manage": 1 00:29:17.467 }, 00:29:17.467 "multi_ctrlr": false, 00:29:17.467 "ana_reporting": false 00:29:17.467 }, 00:29:17.467 "vs": { 00:29:17.467 "nvme_version": "1.2" 00:29:17.467 }, 00:29:17.467 "ns_data": { 00:29:17.467 "id": 1, 00:29:17.467 "can_share": false 00:29:17.467 }, 00:29:17.467 "security": { 00:29:17.467 "opal": true 00:29:17.467 } 00:29:17.467 } 00:29:17.467 ], 00:29:17.467 "mp_policy": "active_passive" 00:29:17.467 
} 00:29:17.467 } 00:29:17.467 ] 00:29:17.467 16:08:22 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:29:17.467 16:08:22 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:18.843 36016b02-dbdb-4ce0-85c3-45bac2935508 00:29:18.843 16:08:24 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:18.843 3a6816e0-0194-4420-92b0-0d726d327253 00:29:18.843 16:08:24 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:18.843 16:08:24 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:29:18.843 16:08:24 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:18.843 16:08:24 compress_isal -- common/autotest_common.sh@900 -- # local i 00:29:18.843 16:08:24 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:18.843 16:08:24 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:18.843 16:08:24 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:19.101 16:08:24 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:19.359 [ 00:29:19.359 { 00:29:19.359 "name": "3a6816e0-0194-4420-92b0-0d726d327253", 00:29:19.359 "aliases": [ 00:29:19.359 "lvs0/lv0" 00:29:19.359 ], 00:29:19.359 "product_name": "Logical Volume", 00:29:19.359 "block_size": 512, 00:29:19.359 "num_blocks": 204800, 00:29:19.359 "uuid": "3a6816e0-0194-4420-92b0-0d726d327253", 00:29:19.359 "assigned_rate_limits": { 00:29:19.359 "rw_ios_per_sec": 0, 00:29:19.359 "rw_mbytes_per_sec": 0, 00:29:19.359 "r_mbytes_per_sec": 0, 00:29:19.359 "w_mbytes_per_sec": 0 00:29:19.359 }, 00:29:19.359 "claimed": 
false, 00:29:19.359 "zoned": false, 00:29:19.359 "supported_io_types": { 00:29:19.359 "read": true, 00:29:19.359 "write": true, 00:29:19.359 "unmap": true, 00:29:19.359 "write_zeroes": true, 00:29:19.359 "flush": false, 00:29:19.359 "reset": true, 00:29:19.359 "compare": false, 00:29:19.359 "compare_and_write": false, 00:29:19.359 "abort": false, 00:29:19.359 "nvme_admin": false, 00:29:19.359 "nvme_io": false 00:29:19.359 }, 00:29:19.359 "driver_specific": { 00:29:19.359 "lvol": { 00:29:19.359 "lvol_store_uuid": "36016b02-dbdb-4ce0-85c3-45bac2935508", 00:29:19.359 "base_bdev": "Nvme0n1", 00:29:19.359 "thin_provision": true, 00:29:19.359 "num_allocated_clusters": 0, 00:29:19.359 "snapshot": false, 00:29:19.359 "clone": false, 00:29:19.359 "esnap_clone": false 00:29:19.359 } 00:29:19.359 } 00:29:19.359 } 00:29:19.359 ] 00:29:19.359 16:08:24 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:29:19.359 16:08:24 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:29:19.359 16:08:24 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:29:19.618 [2024-06-10 16:08:25.083776] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:19.618 COMP_lvs0/lv0 00:29:19.618 16:08:25 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:19.618 16:08:25 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:29:19.618 16:08:25 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:19.618 16:08:25 compress_isal -- common/autotest_common.sh@900 -- # local i 00:29:19.618 16:08:25 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:19.618 16:08:25 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:19.618 16:08:25 compress_isal -- common/autotest_common.sh@903 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:19.876 16:08:25 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:20.134 [ 00:29:20.134 { 00:29:20.134 "name": "COMP_lvs0/lv0", 00:29:20.134 "aliases": [ 00:29:20.134 "f397b742-58b2-5197-9a9f-7e8e50f9e79d" 00:29:20.134 ], 00:29:20.134 "product_name": "compress", 00:29:20.135 "block_size": 512, 00:29:20.135 "num_blocks": 200704, 00:29:20.135 "uuid": "f397b742-58b2-5197-9a9f-7e8e50f9e79d", 00:29:20.135 "assigned_rate_limits": { 00:29:20.135 "rw_ios_per_sec": 0, 00:29:20.135 "rw_mbytes_per_sec": 0, 00:29:20.135 "r_mbytes_per_sec": 0, 00:29:20.135 "w_mbytes_per_sec": 0 00:29:20.135 }, 00:29:20.135 "claimed": false, 00:29:20.135 "zoned": false, 00:29:20.135 "supported_io_types": { 00:29:20.135 "read": true, 00:29:20.135 "write": true, 00:29:20.135 "unmap": false, 00:29:20.135 "write_zeroes": true, 00:29:20.135 "flush": false, 00:29:20.135 "reset": false, 00:29:20.135 "compare": false, 00:29:20.135 "compare_and_write": false, 00:29:20.135 "abort": false, 00:29:20.135 "nvme_admin": false, 00:29:20.135 "nvme_io": false 00:29:20.135 }, 00:29:20.135 "driver_specific": { 00:29:20.135 "compress": { 00:29:20.135 "name": "COMP_lvs0/lv0", 00:29:20.135 "base_bdev_name": "3a6816e0-0194-4420-92b0-0d726d327253" 00:29:20.135 } 00:29:20.135 } 00:29:20.135 } 00:29:20.135 ] 00:29:20.135 16:08:25 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:29:20.135 16:08:25 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:20.393 Running I/O for 3 seconds... 
00:29:23.678 00:29:23.678 Latency(us) 00:29:23.678 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:23.678 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:23.678 Verification LBA range: start 0x0 length 0x3100 00:29:23.678 COMP_lvs0/lv0 : 3.01 3295.57 12.87 0.00 0.00 9647.02 56.56 15541.39 00:29:23.678 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:23.678 Verification LBA range: start 0x3100 length 0x3100 00:29:23.678 COMP_lvs0/lv0 : 3.01 3282.22 12.82 0.00 0.00 9704.92 54.86 14730.00 00:29:23.678 =================================================================================================================== 00:29:23.678 Total : 6577.79 25.69 0.00 0.00 9675.91 54.86 15541.39 00:29:23.678 0 00:29:23.678 16:08:28 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:29:23.678 16:08:28 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:23.678 16:08:28 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:23.937 16:08:29 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:23.937 16:08:29 compress_isal -- compress/compress.sh@78 -- # killprocess 2845031 00:29:23.937 16:08:29 compress_isal -- common/autotest_common.sh@949 -- # '[' -z 2845031 ']' 00:29:23.937 16:08:29 compress_isal -- common/autotest_common.sh@953 -- # kill -0 2845031 00:29:23.937 16:08:29 compress_isal -- common/autotest_common.sh@954 -- # uname 00:29:23.937 16:08:29 compress_isal -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:29:23.937 16:08:29 compress_isal -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2845031 00:29:23.937 16:08:29 compress_isal -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:29:23.937 16:08:29 compress_isal 
-- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:29:23.937 16:08:29 compress_isal -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2845031' 00:29:23.937 killing process with pid 2845031 00:29:23.937 16:08:29 compress_isal -- common/autotest_common.sh@968 -- # kill 2845031 00:29:23.937 Received shutdown signal, test time was about 3.000000 seconds 00:29:23.937 00:29:23.937 Latency(us) 00:29:23.937 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:23.937 =================================================================================================================== 00:29:23.937 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:23.937 16:08:29 compress_isal -- common/autotest_common.sh@973 -- # wait 2845031 00:29:25.408 16:08:30 compress_isal -- compress/compress.sh@87 -- # run_bdevperf 32 4096 3 512 00:29:25.408 16:08:30 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:29:25.408 16:08:30 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2846987 00:29:25.408 16:08:30 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:25.408 16:08:30 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2846987 00:29:25.408 16:08:30 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:29:25.408 16:08:30 compress_isal -- common/autotest_common.sh@830 -- # '[' -z 2846987 ']' 00:29:25.408 16:08:30 compress_isal -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:25.408 16:08:30 compress_isal -- common/autotest_common.sh@835 -- # local max_retries=100 00:29:25.408 16:08:30 compress_isal -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 
00:29:25.408 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:25.408 16:08:30 compress_isal -- common/autotest_common.sh@839 -- # xtrace_disable 00:29:25.408 16:08:30 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:29:25.408 [2024-06-10 16:08:30.788542] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:29:25.408 [2024-06-10 16:08:30.788603] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2846987 ] 00:29:25.408 [2024-06-10 16:08:30.881408] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:25.666 [2024-06-10 16:08:30.977533] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:29:25.666 [2024-06-10 16:08:30.977540] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:29:26.602 16:08:31 compress_isal -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:29:26.602 16:08:31 compress_isal -- common/autotest_common.sh@863 -- # return 0 00:29:26.602 16:08:31 compress_isal -- compress/compress.sh@74 -- # create_vols 512 00:29:26.602 16:08:31 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:26.602 16:08:31 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:29.887 16:08:34 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:29.887 16:08:34 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:29:29.887 16:08:34 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:29.887 16:08:34 compress_isal -- common/autotest_common.sh@900 -- # local i 00:29:29.887 16:08:34 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:29.887 16:08:34 
compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:29.887 16:08:34 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:29.887 16:08:35 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:29.887 [ 00:29:29.887 { 00:29:29.887 "name": "Nvme0n1", 00:29:29.887 "aliases": [ 00:29:29.887 "6d891e23-cc66-484f-a4a0-ff176577ccf0" 00:29:29.887 ], 00:29:29.887 "product_name": "NVMe disk", 00:29:29.887 "block_size": 512, 00:29:29.887 "num_blocks": 1953525168, 00:29:29.887 "uuid": "6d891e23-cc66-484f-a4a0-ff176577ccf0", 00:29:29.887 "assigned_rate_limits": { 00:29:29.887 "rw_ios_per_sec": 0, 00:29:29.887 "rw_mbytes_per_sec": 0, 00:29:29.887 "r_mbytes_per_sec": 0, 00:29:29.887 "w_mbytes_per_sec": 0 00:29:29.887 }, 00:29:29.887 "claimed": false, 00:29:29.887 "zoned": false, 00:29:29.887 "supported_io_types": { 00:29:29.887 "read": true, 00:29:29.887 "write": true, 00:29:29.887 "unmap": true, 00:29:29.887 "write_zeroes": true, 00:29:29.887 "flush": true, 00:29:29.887 "reset": true, 00:29:29.887 "compare": false, 00:29:29.887 "compare_and_write": false, 00:29:29.887 "abort": true, 00:29:29.887 "nvme_admin": true, 00:29:29.887 "nvme_io": true 00:29:29.887 }, 00:29:29.887 "driver_specific": { 00:29:29.887 "nvme": [ 00:29:29.887 { 00:29:29.887 "pci_address": "0000:5e:00.0", 00:29:29.887 "trid": { 00:29:29.887 "trtype": "PCIe", 00:29:29.887 "traddr": "0000:5e:00.0" 00:29:29.887 }, 00:29:29.887 "ctrlr_data": { 00:29:29.887 "cntlid": 0, 00:29:29.887 "vendor_id": "0x8086", 00:29:29.887 "model_number": "INTEL SSDPE2KX010T8", 00:29:29.887 "serial_number": "BTLJ807001JM1P0FGN", 00:29:29.887 "firmware_revision": "VDV10170", 00:29:29.887 "oacs": { 00:29:29.887 "security": 1, 00:29:29.887 "format": 1, 00:29:29.887 "firmware": 1, 00:29:29.887 "ns_manage": 1 00:29:29.887 }, 
00:29:29.887 "multi_ctrlr": false, 00:29:29.887 "ana_reporting": false 00:29:29.887 }, 00:29:29.887 "vs": { 00:29:29.887 "nvme_version": "1.2" 00:29:29.887 }, 00:29:29.887 "ns_data": { 00:29:29.887 "id": 1, 00:29:29.887 "can_share": false 00:29:29.887 }, 00:29:29.887 "security": { 00:29:29.887 "opal": true 00:29:29.887 } 00:29:29.887 } 00:29:29.887 ], 00:29:29.887 "mp_policy": "active_passive" 00:29:29.887 } 00:29:29.887 } 00:29:29.887 ] 00:29:30.146 16:08:35 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:29:30.146 16:08:35 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:31.081 3950d623-ff06-40ac-bfac-4007679a2261 00:29:31.081 16:08:36 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:31.339 6a836b94-dd02-437c-ac4b-0b97d097dfab 00:29:31.339 16:08:36 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:31.339 16:08:36 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:29:31.339 16:08:36 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:31.339 16:08:36 compress_isal -- common/autotest_common.sh@900 -- # local i 00:29:31.339 16:08:36 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:31.339 16:08:36 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:31.339 16:08:36 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:31.597 16:08:37 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:31.855 [ 00:29:31.855 { 00:29:31.855 "name": "6a836b94-dd02-437c-ac4b-0b97d097dfab", 00:29:31.855 "aliases": [ 00:29:31.855 "lvs0/lv0" 
00:29:31.855 ], 00:29:31.855 "product_name": "Logical Volume", 00:29:31.855 "block_size": 512, 00:29:31.855 "num_blocks": 204800, 00:29:31.855 "uuid": "6a836b94-dd02-437c-ac4b-0b97d097dfab", 00:29:31.855 "assigned_rate_limits": { 00:29:31.855 "rw_ios_per_sec": 0, 00:29:31.855 "rw_mbytes_per_sec": 0, 00:29:31.855 "r_mbytes_per_sec": 0, 00:29:31.855 "w_mbytes_per_sec": 0 00:29:31.855 }, 00:29:31.855 "claimed": false, 00:29:31.855 "zoned": false, 00:29:31.855 "supported_io_types": { 00:29:31.855 "read": true, 00:29:31.855 "write": true, 00:29:31.855 "unmap": true, 00:29:31.855 "write_zeroes": true, 00:29:31.855 "flush": false, 00:29:31.855 "reset": true, 00:29:31.855 "compare": false, 00:29:31.855 "compare_and_write": false, 00:29:31.855 "abort": false, 00:29:31.855 "nvme_admin": false, 00:29:31.855 "nvme_io": false 00:29:31.855 }, 00:29:31.855 "driver_specific": { 00:29:31.855 "lvol": { 00:29:31.855 "lvol_store_uuid": "3950d623-ff06-40ac-bfac-4007679a2261", 00:29:31.855 "base_bdev": "Nvme0n1", 00:29:31.855 "thin_provision": true, 00:29:31.855 "num_allocated_clusters": 0, 00:29:31.855 "snapshot": false, 00:29:31.855 "clone": false, 00:29:31.855 "esnap_clone": false 00:29:31.855 } 00:29:31.855 } 00:29:31.855 } 00:29:31.855 ] 00:29:32.113 16:08:37 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:29:32.113 16:08:37 compress_isal -- compress/compress.sh@41 -- # '[' -z 512 ']' 00:29:32.113 16:08:37 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 512 00:29:32.113 [2024-06-10 16:08:37.608266] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:32.113 COMP_lvs0/lv0 00:29:32.371 16:08:37 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:32.371 16:08:37 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:29:32.371 16:08:37 
compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:32.371 16:08:37 compress_isal -- common/autotest_common.sh@900 -- # local i 00:29:32.371 16:08:37 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:32.371 16:08:37 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:32.371 16:08:37 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:32.629 16:08:37 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:32.888 [ 00:29:32.888 { 00:29:32.888 "name": "COMP_lvs0/lv0", 00:29:32.888 "aliases": [ 00:29:32.888 "e998e555-ac28-5551-ae4a-86ecd4e51dd9" 00:29:32.888 ], 00:29:32.888 "product_name": "compress", 00:29:32.888 "block_size": 512, 00:29:32.888 "num_blocks": 200704, 00:29:32.888 "uuid": "e998e555-ac28-5551-ae4a-86ecd4e51dd9", 00:29:32.888 "assigned_rate_limits": { 00:29:32.888 "rw_ios_per_sec": 0, 00:29:32.888 "rw_mbytes_per_sec": 0, 00:29:32.888 "r_mbytes_per_sec": 0, 00:29:32.888 "w_mbytes_per_sec": 0 00:29:32.888 }, 00:29:32.888 "claimed": false, 00:29:32.888 "zoned": false, 00:29:32.888 "supported_io_types": { 00:29:32.888 "read": true, 00:29:32.888 "write": true, 00:29:32.888 "unmap": false, 00:29:32.888 "write_zeroes": true, 00:29:32.888 "flush": false, 00:29:32.888 "reset": false, 00:29:32.888 "compare": false, 00:29:32.888 "compare_and_write": false, 00:29:32.888 "abort": false, 00:29:32.888 "nvme_admin": false, 00:29:32.888 "nvme_io": false 00:29:32.888 }, 00:29:32.888 "driver_specific": { 00:29:32.888 "compress": { 00:29:32.888 "name": "COMP_lvs0/lv0", 00:29:32.888 "base_bdev_name": "6a836b94-dd02-437c-ac4b-0b97d097dfab" 00:29:32.888 } 00:29:32.888 } 00:29:32.888 } 00:29:32.888 ] 00:29:32.888 16:08:38 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:29:32.888 16:08:38 compress_isal 
-- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:32.888 Running I/O for 3 seconds... 00:29:36.172 00:29:36.172 Latency(us) 00:29:36.172 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:36.172 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:36.172 Verification LBA range: start 0x0 length 0x3100 00:29:36.172 COMP_lvs0/lv0 : 3.01 3303.12 12.90 0.00 0.00 9620.72 57.54 14917.24 00:29:36.172 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:36.172 Verification LBA range: start 0x3100 length 0x3100 00:29:36.172 COMP_lvs0/lv0 : 3.01 3303.13 12.90 0.00 0.00 9638.36 54.86 14917.24 00:29:36.172 =================================================================================================================== 00:29:36.172 Total : 6606.25 25.81 0.00 0.00 9629.54 54.86 14917.24 00:29:36.172 0 00:29:36.172 16:08:41 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:29:36.172 16:08:41 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:36.172 16:08:41 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:36.431 16:08:41 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:36.431 16:08:41 compress_isal -- compress/compress.sh@78 -- # killprocess 2846987 00:29:36.431 16:08:41 compress_isal -- common/autotest_common.sh@949 -- # '[' -z 2846987 ']' 00:29:36.431 16:08:41 compress_isal -- common/autotest_common.sh@953 -- # kill -0 2846987 00:29:36.431 16:08:41 compress_isal -- common/autotest_common.sh@954 -- # uname 00:29:36.431 16:08:41 compress_isal -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:29:36.432 16:08:41 compress_isal -- common/autotest_common.sh@955 
-- # ps --no-headers -o comm= 2846987 00:29:36.432 16:08:41 compress_isal -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:29:36.432 16:08:41 compress_isal -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:29:36.432 16:08:41 compress_isal -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2846987' 00:29:36.432 killing process with pid 2846987 00:29:36.432 16:08:41 compress_isal -- common/autotest_common.sh@968 -- # kill 2846987 00:29:36.432 Received shutdown signal, test time was about 3.000000 seconds 00:29:36.432 00:29:36.432 Latency(us) 00:29:36.432 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:36.432 =================================================================================================================== 00:29:36.432 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:36.432 16:08:41 compress_isal -- common/autotest_common.sh@973 -- # wait 2846987 00:29:38.334 16:08:43 compress_isal -- compress/compress.sh@88 -- # run_bdevperf 32 4096 3 4096 00:29:38.334 16:08:43 compress_isal -- compress/compress.sh@66 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:29:38.334 16:08:43 compress_isal -- compress/compress.sh@71 -- # bdevperf_pid=2849048 00:29:38.334 16:08:43 compress_isal -- compress/compress.sh@72 -- # trap 'killprocess $bdevperf_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:38.334 16:08:43 compress_isal -- compress/compress.sh@69 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -q 32 -o 4096 -w verify -t 3 -C -m 0x6 00:29:38.334 16:08:43 compress_isal -- compress/compress.sh@73 -- # waitforlisten 2849048 00:29:38.334 16:08:43 compress_isal -- common/autotest_common.sh@830 -- # '[' -z 2849048 ']' 00:29:38.334 16:08:43 compress_isal -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:38.334 16:08:43 compress_isal -- common/autotest_common.sh@835 -- # local max_retries=100 00:29:38.334 16:08:43 compress_isal -- 
common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:38.334 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:38.334 16:08:43 compress_isal -- common/autotest_common.sh@839 -- # xtrace_disable 00:29:38.334 16:08:43 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:29:38.334 [2024-06-10 16:08:43.415485] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:29:38.334 [2024-06-10 16:08:43.415543] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x6 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2849048 ] 00:29:38.334 [2024-06-10 16:08:43.508127] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:29:38.334 [2024-06-10 16:08:43.601552] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:29:38.334 [2024-06-10 16:08:43.601559] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:29:38.902 16:08:44 compress_isal -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:29:38.902 16:08:44 compress_isal -- common/autotest_common.sh@863 -- # return 0 00:29:38.902 16:08:44 compress_isal -- compress/compress.sh@74 -- # create_vols 4096 00:29:38.902 16:08:44 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:38.902 16:08:44 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:42.187 16:08:47 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:42.187 16:08:47 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:29:42.187 16:08:47 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:42.187 16:08:47 compress_isal -- 
common/autotest_common.sh@900 -- # local i 00:29:42.187 16:08:47 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:42.188 16:08:47 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:42.188 16:08:47 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:42.188 16:08:47 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:42.446 [ 00:29:42.446 { 00:29:42.446 "name": "Nvme0n1", 00:29:42.446 "aliases": [ 00:29:42.446 "b87d9213-78cd-4ec3-bc7b-f32167fd882e" 00:29:42.446 ], 00:29:42.446 "product_name": "NVMe disk", 00:29:42.446 "block_size": 512, 00:29:42.446 "num_blocks": 1953525168, 00:29:42.446 "uuid": "b87d9213-78cd-4ec3-bc7b-f32167fd882e", 00:29:42.446 "assigned_rate_limits": { 00:29:42.446 "rw_ios_per_sec": 0, 00:29:42.446 "rw_mbytes_per_sec": 0, 00:29:42.446 "r_mbytes_per_sec": 0, 00:29:42.446 "w_mbytes_per_sec": 0 00:29:42.446 }, 00:29:42.446 "claimed": false, 00:29:42.446 "zoned": false, 00:29:42.446 "supported_io_types": { 00:29:42.446 "read": true, 00:29:42.446 "write": true, 00:29:42.446 "unmap": true, 00:29:42.446 "write_zeroes": true, 00:29:42.446 "flush": true, 00:29:42.446 "reset": true, 00:29:42.446 "compare": false, 00:29:42.446 "compare_and_write": false, 00:29:42.446 "abort": true, 00:29:42.446 "nvme_admin": true, 00:29:42.446 "nvme_io": true 00:29:42.446 }, 00:29:42.446 "driver_specific": { 00:29:42.446 "nvme": [ 00:29:42.446 { 00:29:42.446 "pci_address": "0000:5e:00.0", 00:29:42.446 "trid": { 00:29:42.446 "trtype": "PCIe", 00:29:42.446 "traddr": "0000:5e:00.0" 00:29:42.446 }, 00:29:42.446 "ctrlr_data": { 00:29:42.446 "cntlid": 0, 00:29:42.446 "vendor_id": "0x8086", 00:29:42.446 "model_number": "INTEL SSDPE2KX010T8", 00:29:42.446 "serial_number": "BTLJ807001JM1P0FGN", 00:29:42.446 "firmware_revision": "VDV10170", 
00:29:42.446 "oacs": { 00:29:42.446 "security": 1, 00:29:42.446 "format": 1, 00:29:42.446 "firmware": 1, 00:29:42.446 "ns_manage": 1 00:29:42.446 }, 00:29:42.446 "multi_ctrlr": false, 00:29:42.446 "ana_reporting": false 00:29:42.446 }, 00:29:42.446 "vs": { 00:29:42.446 "nvme_version": "1.2" 00:29:42.446 }, 00:29:42.446 "ns_data": { 00:29:42.447 "id": 1, 00:29:42.447 "can_share": false 00:29:42.447 }, 00:29:42.447 "security": { 00:29:42.447 "opal": true 00:29:42.447 } 00:29:42.447 } 00:29:42.447 ], 00:29:42.447 "mp_policy": "active_passive" 00:29:42.447 } 00:29:42.447 } 00:29:42.447 ] 00:29:42.447 16:08:47 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:29:42.447 16:08:47 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:43.825 04449039-af4f-4494-946e-46e78b043146 00:29:43.825 16:08:48 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:43.825 46b0b2d0-fdf5-42e0-875a-5207036d84d2 00:29:43.825 16:08:49 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:43.825 16:08:49 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:29:43.825 16:08:49 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:43.825 16:08:49 compress_isal -- common/autotest_common.sh@900 -- # local i 00:29:43.825 16:08:49 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:43.825 16:08:49 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:43.825 16:08:49 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:44.084 16:08:49 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 
2000 00:29:44.343 [ 00:29:44.343 { 00:29:44.343 "name": "46b0b2d0-fdf5-42e0-875a-5207036d84d2", 00:29:44.343 "aliases": [ 00:29:44.343 "lvs0/lv0" 00:29:44.343 ], 00:29:44.343 "product_name": "Logical Volume", 00:29:44.343 "block_size": 512, 00:29:44.343 "num_blocks": 204800, 00:29:44.343 "uuid": "46b0b2d0-fdf5-42e0-875a-5207036d84d2", 00:29:44.343 "assigned_rate_limits": { 00:29:44.343 "rw_ios_per_sec": 0, 00:29:44.343 "rw_mbytes_per_sec": 0, 00:29:44.343 "r_mbytes_per_sec": 0, 00:29:44.343 "w_mbytes_per_sec": 0 00:29:44.343 }, 00:29:44.343 "claimed": false, 00:29:44.343 "zoned": false, 00:29:44.343 "supported_io_types": { 00:29:44.343 "read": true, 00:29:44.343 "write": true, 00:29:44.343 "unmap": true, 00:29:44.343 "write_zeroes": true, 00:29:44.343 "flush": false, 00:29:44.343 "reset": true, 00:29:44.343 "compare": false, 00:29:44.343 "compare_and_write": false, 00:29:44.343 "abort": false, 00:29:44.343 "nvme_admin": false, 00:29:44.343 "nvme_io": false 00:29:44.343 }, 00:29:44.343 "driver_specific": { 00:29:44.343 "lvol": { 00:29:44.343 "lvol_store_uuid": "04449039-af4f-4494-946e-46e78b043146", 00:29:44.343 "base_bdev": "Nvme0n1", 00:29:44.343 "thin_provision": true, 00:29:44.343 "num_allocated_clusters": 0, 00:29:44.343 "snapshot": false, 00:29:44.343 "clone": false, 00:29:44.343 "esnap_clone": false 00:29:44.343 } 00:29:44.343 } 00:29:44.343 } 00:29:44.343 ] 00:29:44.343 16:08:49 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:29:44.343 16:08:49 compress_isal -- compress/compress.sh@41 -- # '[' -z 4096 ']' 00:29:44.343 16:08:49 compress_isal -- compress/compress.sh@44 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem -l 4096 00:29:44.603 [2024-06-10 16:08:50.021037] vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:44.603 COMP_lvs0/lv0 00:29:44.603 16:08:50 compress_isal -- compress/compress.sh@46 -- # waitforbdev 
COMP_lvs0/lv0 00:29:44.603 16:08:50 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:29:44.603 16:08:50 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:44.603 16:08:50 compress_isal -- common/autotest_common.sh@900 -- # local i 00:29:44.603 16:08:50 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:44.603 16:08:50 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:44.603 16:08:50 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:44.861 16:08:50 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:45.119 [ 00:29:45.119 { 00:29:45.119 "name": "COMP_lvs0/lv0", 00:29:45.119 "aliases": [ 00:29:45.119 "e6716185-dad0-5bae-806c-2cb267b79b9f" 00:29:45.119 ], 00:29:45.119 "product_name": "compress", 00:29:45.119 "block_size": 4096, 00:29:45.119 "num_blocks": 25088, 00:29:45.119 "uuid": "e6716185-dad0-5bae-806c-2cb267b79b9f", 00:29:45.119 "assigned_rate_limits": { 00:29:45.119 "rw_ios_per_sec": 0, 00:29:45.119 "rw_mbytes_per_sec": 0, 00:29:45.119 "r_mbytes_per_sec": 0, 00:29:45.119 "w_mbytes_per_sec": 0 00:29:45.119 }, 00:29:45.119 "claimed": false, 00:29:45.119 "zoned": false, 00:29:45.119 "supported_io_types": { 00:29:45.119 "read": true, 00:29:45.119 "write": true, 00:29:45.119 "unmap": false, 00:29:45.119 "write_zeroes": true, 00:29:45.119 "flush": false, 00:29:45.119 "reset": false, 00:29:45.119 "compare": false, 00:29:45.119 "compare_and_write": false, 00:29:45.119 "abort": false, 00:29:45.119 "nvme_admin": false, 00:29:45.119 "nvme_io": false 00:29:45.119 }, 00:29:45.119 "driver_specific": { 00:29:45.119 "compress": { 00:29:45.119 "name": "COMP_lvs0/lv0", 00:29:45.119 "base_bdev_name": "46b0b2d0-fdf5-42e0-875a-5207036d84d2" 00:29:45.119 } 00:29:45.119 } 
00:29:45.119 } 00:29:45.119 ] 00:29:45.119 16:08:50 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:29:45.119 16:08:50 compress_isal -- compress/compress.sh@75 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:29:45.377 Running I/O for 3 seconds... 00:29:48.675 00:29:48.675 Latency(us) 00:29:48.675 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:48.675 Job: COMP_lvs0/lv0 (Core Mask 0x2, workload: verify, depth: 32, IO size: 4096) 00:29:48.675 Verification LBA range: start 0x0 length 0x3100 00:29:48.675 COMP_lvs0/lv0 : 3.01 3273.03 12.79 0.00 0.00 9708.86 59.49 15104.49 00:29:48.675 Job: COMP_lvs0/lv0 (Core Mask 0x4, workload: verify, depth: 32, IO size: 4096) 00:29:48.675 Verification LBA range: start 0x3100 length 0x3100 00:29:48.675 COMP_lvs0/lv0 : 3.01 3271.49 12.78 0.00 0.00 9727.74 56.56 14605.17 00:29:48.675 =================================================================================================================== 00:29:48.675 Total : 6544.52 25.56 0.00 0.00 9718.30 56.56 15104.49 00:29:48.675 0 00:29:48.675 16:08:53 compress_isal -- compress/compress.sh@76 -- # destroy_vols 00:29:48.675 16:08:53 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:48.675 16:08:53 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:48.934 16:08:54 compress_isal -- compress/compress.sh@77 -- # trap - SIGINT SIGTERM EXIT 00:29:48.934 16:08:54 compress_isal -- compress/compress.sh@78 -- # killprocess 2849048 00:29:48.934 16:08:54 compress_isal -- common/autotest_common.sh@949 -- # '[' -z 2849048 ']' 00:29:48.934 16:08:54 compress_isal -- common/autotest_common.sh@953 -- # kill -0 2849048 00:29:48.934 16:08:54 compress_isal -- common/autotest_common.sh@954 -- # uname 00:29:48.934 
16:08:54 compress_isal -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:29:48.934 16:08:54 compress_isal -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2849048 00:29:48.934 16:08:54 compress_isal -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:29:48.934 16:08:54 compress_isal -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:29:48.934 16:08:54 compress_isal -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2849048' 00:29:48.934 killing process with pid 2849048 00:29:48.934 16:08:54 compress_isal -- common/autotest_common.sh@968 -- # kill 2849048 00:29:48.934 Received shutdown signal, test time was about 3.000000 seconds 00:29:48.934 00:29:48.934 Latency(us) 00:29:48.934 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:29:48.934 =================================================================================================================== 00:29:48.934 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:29:48.934 16:08:54 compress_isal -- common/autotest_common.sh@973 -- # wait 2849048 00:29:50.312 16:08:55 compress_isal -- compress/compress.sh@89 -- # run_bdevio 00:29:50.312 16:08:55 compress_isal -- compress/compress.sh@50 -- # [[ isal == \c\o\m\p\d\e\v ]] 00:29:50.312 16:08:55 compress_isal -- compress/compress.sh@55 -- # bdevio_pid=2851107 00:29:50.312 16:08:55 compress_isal -- compress/compress.sh@56 -- # trap 'killprocess $bdevio_pid; error_cleanup; exit 1' SIGINT SIGTERM EXIT 00:29:50.312 16:08:55 compress_isal -- compress/compress.sh@53 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w 00:29:50.312 16:08:55 compress_isal -- compress/compress.sh@57 -- # waitforlisten 2851107 00:29:50.312 16:08:55 compress_isal -- common/autotest_common.sh@830 -- # '[' -z 2851107 ']' 00:29:50.312 16:08:55 compress_isal -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:29:50.312 16:08:55 compress_isal -- 
common/autotest_common.sh@835 -- # local max_retries=100 00:29:50.312 16:08:55 compress_isal -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:29:50.312 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:29:50.312 16:08:55 compress_isal -- common/autotest_common.sh@839 -- # xtrace_disable 00:29:50.312 16:08:55 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:29:50.571 [2024-06-10 16:08:55.868381] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:29:50.571 [2024-06-10 16:08:55.868441] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2851107 ] 00:29:50.571 [2024-06-10 16:08:55.966816] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:29:50.571 [2024-06-10 16:08:56.064681] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:29:50.571 [2024-06-10 16:08:56.064778] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:29:50.571 [2024-06-10 16:08:56.064779] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:29:51.509 16:08:56 compress_isal -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:29:51.509 16:08:56 compress_isal -- common/autotest_common.sh@863 -- # return 0 00:29:51.509 16:08:56 compress_isal -- compress/compress.sh@58 -- # create_vols 00:29:51.509 16:08:56 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh 00:29:51.509 16:08:56 compress_isal -- compress/compress.sh@34 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py load_subsystem_config 00:29:54.798 16:08:59 compress_isal -- compress/compress.sh@35 -- # waitforbdev Nvme0n1 00:29:54.798 16:08:59 compress_isal -- 
common/autotest_common.sh@898 -- # local bdev_name=Nvme0n1 00:29:54.798 16:08:59 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:54.798 16:08:59 compress_isal -- common/autotest_common.sh@900 -- # local i 00:29:54.798 16:08:59 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:54.798 16:08:59 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:54.798 16:08:59 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:54.798 16:09:00 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b Nvme0n1 -t 2000 00:29:55.057 [ 00:29:55.057 { 00:29:55.057 "name": "Nvme0n1", 00:29:55.057 "aliases": [ 00:29:55.057 "b4e9648d-3553-46c9-a26e-72bab54dc296" 00:29:55.057 ], 00:29:55.057 "product_name": "NVMe disk", 00:29:55.057 "block_size": 512, 00:29:55.057 "num_blocks": 1953525168, 00:29:55.057 "uuid": "b4e9648d-3553-46c9-a26e-72bab54dc296", 00:29:55.057 "assigned_rate_limits": { 00:29:55.057 "rw_ios_per_sec": 0, 00:29:55.057 "rw_mbytes_per_sec": 0, 00:29:55.057 "r_mbytes_per_sec": 0, 00:29:55.057 "w_mbytes_per_sec": 0 00:29:55.057 }, 00:29:55.057 "claimed": false, 00:29:55.057 "zoned": false, 00:29:55.057 "supported_io_types": { 00:29:55.057 "read": true, 00:29:55.057 "write": true, 00:29:55.057 "unmap": true, 00:29:55.057 "write_zeroes": true, 00:29:55.057 "flush": true, 00:29:55.057 "reset": true, 00:29:55.057 "compare": false, 00:29:55.057 "compare_and_write": false, 00:29:55.057 "abort": true, 00:29:55.057 "nvme_admin": true, 00:29:55.057 "nvme_io": true 00:29:55.057 }, 00:29:55.057 "driver_specific": { 00:29:55.057 "nvme": [ 00:29:55.057 { 00:29:55.057 "pci_address": "0000:5e:00.0", 00:29:55.057 "trid": { 00:29:55.057 "trtype": "PCIe", 00:29:55.057 "traddr": "0000:5e:00.0" 00:29:55.057 }, 00:29:55.057 "ctrlr_data": { 00:29:55.057 "cntlid": 0, 
00:29:55.057 "vendor_id": "0x8086", 00:29:55.057 "model_number": "INTEL SSDPE2KX010T8", 00:29:55.057 "serial_number": "BTLJ807001JM1P0FGN", 00:29:55.057 "firmware_revision": "VDV10170", 00:29:55.057 "oacs": { 00:29:55.057 "security": 1, 00:29:55.057 "format": 1, 00:29:55.057 "firmware": 1, 00:29:55.057 "ns_manage": 1 00:29:55.057 }, 00:29:55.057 "multi_ctrlr": false, 00:29:55.057 "ana_reporting": false 00:29:55.057 }, 00:29:55.057 "vs": { 00:29:55.057 "nvme_version": "1.2" 00:29:55.057 }, 00:29:55.057 "ns_data": { 00:29:55.057 "id": 1, 00:29:55.057 "can_share": false 00:29:55.057 }, 00:29:55.057 "security": { 00:29:55.057 "opal": true 00:29:55.057 } 00:29:55.057 } 00:29:55.057 ], 00:29:55.057 "mp_policy": "active_passive" 00:29:55.057 } 00:29:55.057 } 00:29:55.057 ] 00:29:55.057 16:09:00 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:29:55.057 16:09:00 compress_isal -- compress/compress.sh@37 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create_lvstore --clear-method none Nvme0n1 lvs0 00:29:56.469 927306ff-c5ca-4d96-8124-d0de092f5c71 00:29:56.469 16:09:01 compress_isal -- compress/compress.sh@38 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_create -t -l lvs0 lv0 100 00:29:56.469 685bd8bd-0cb8-4252-8c7b-16643d1bf736 00:29:56.469 16:09:01 compress_isal -- compress/compress.sh@39 -- # waitforbdev lvs0/lv0 00:29:56.469 16:09:01 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=lvs0/lv0 00:29:56.469 16:09:01 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:56.469 16:09:01 compress_isal -- common/autotest_common.sh@900 -- # local i 00:29:56.469 16:09:01 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:56.469 16:09:01 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:56.469 16:09:01 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py 
bdev_wait_for_examine 00:29:56.748 16:09:02 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b lvs0/lv0 -t 2000 00:29:57.007 [ 00:29:57.007 { 00:29:57.007 "name": "685bd8bd-0cb8-4252-8c7b-16643d1bf736", 00:29:57.007 "aliases": [ 00:29:57.007 "lvs0/lv0" 00:29:57.007 ], 00:29:57.007 "product_name": "Logical Volume", 00:29:57.007 "block_size": 512, 00:29:57.007 "num_blocks": 204800, 00:29:57.007 "uuid": "685bd8bd-0cb8-4252-8c7b-16643d1bf736", 00:29:57.007 "assigned_rate_limits": { 00:29:57.007 "rw_ios_per_sec": 0, 00:29:57.007 "rw_mbytes_per_sec": 0, 00:29:57.007 "r_mbytes_per_sec": 0, 00:29:57.007 "w_mbytes_per_sec": 0 00:29:57.007 }, 00:29:57.007 "claimed": false, 00:29:57.007 "zoned": false, 00:29:57.007 "supported_io_types": { 00:29:57.007 "read": true, 00:29:57.007 "write": true, 00:29:57.007 "unmap": true, 00:29:57.007 "write_zeroes": true, 00:29:57.007 "flush": false, 00:29:57.007 "reset": true, 00:29:57.007 "compare": false, 00:29:57.007 "compare_and_write": false, 00:29:57.007 "abort": false, 00:29:57.007 "nvme_admin": false, 00:29:57.007 "nvme_io": false 00:29:57.007 }, 00:29:57.007 "driver_specific": { 00:29:57.007 "lvol": { 00:29:57.007 "lvol_store_uuid": "927306ff-c5ca-4d96-8124-d0de092f5c71", 00:29:57.007 "base_bdev": "Nvme0n1", 00:29:57.007 "thin_provision": true, 00:29:57.007 "num_allocated_clusters": 0, 00:29:57.007 "snapshot": false, 00:29:57.007 "clone": false, 00:29:57.007 "esnap_clone": false 00:29:57.007 } 00:29:57.007 } 00:29:57.007 } 00:29:57.007 ] 00:29:57.007 16:09:02 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:29:57.007 16:09:02 compress_isal -- compress/compress.sh@41 -- # '[' -z '' ']' 00:29:57.007 16:09:02 compress_isal -- compress/compress.sh@42 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_create -b lvs0/lv0 -p /tmp/pmem 00:29:57.007 [2024-06-10 16:09:02.499925] 
vbdev_compress.c:1016:vbdev_compress_claim: *NOTICE*: registered io_device and virtual bdev for: COMP_lvs0/lv0 00:29:57.007 COMP_lvs0/lv0 00:29:57.266 16:09:02 compress_isal -- compress/compress.sh@46 -- # waitforbdev COMP_lvs0/lv0 00:29:57.266 16:09:02 compress_isal -- common/autotest_common.sh@898 -- # local bdev_name=COMP_lvs0/lv0 00:29:57.266 16:09:02 compress_isal -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:29:57.266 16:09:02 compress_isal -- common/autotest_common.sh@900 -- # local i 00:29:57.266 16:09:02 compress_isal -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:29:57.266 16:09:02 compress_isal -- common/autotest_common.sh@901 -- # bdev_timeout=2000 00:29:57.266 16:09:02 compress_isal -- common/autotest_common.sh@903 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_wait_for_examine 00:29:57.266 16:09:02 compress_isal -- common/autotest_common.sh@905 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_get_bdevs -b COMP_lvs0/lv0 -t 2000 00:29:57.525 [ 00:29:57.525 { 00:29:57.525 "name": "COMP_lvs0/lv0", 00:29:57.525 "aliases": [ 00:29:57.525 "20282718-400e-55e4-8de1-c6f7d559dcb9" 00:29:57.525 ], 00:29:57.525 "product_name": "compress", 00:29:57.525 "block_size": 512, 00:29:57.525 "num_blocks": 200704, 00:29:57.525 "uuid": "20282718-400e-55e4-8de1-c6f7d559dcb9", 00:29:57.525 "assigned_rate_limits": { 00:29:57.525 "rw_ios_per_sec": 0, 00:29:57.525 "rw_mbytes_per_sec": 0, 00:29:57.525 "r_mbytes_per_sec": 0, 00:29:57.525 "w_mbytes_per_sec": 0 00:29:57.525 }, 00:29:57.525 "claimed": false, 00:29:57.525 "zoned": false, 00:29:57.525 "supported_io_types": { 00:29:57.525 "read": true, 00:29:57.525 "write": true, 00:29:57.525 "unmap": false, 00:29:57.525 "write_zeroes": true, 00:29:57.525 "flush": false, 00:29:57.525 "reset": false, 00:29:57.525 "compare": false, 00:29:57.525 "compare_and_write": false, 00:29:57.525 "abort": false, 00:29:57.525 "nvme_admin": false, 00:29:57.525 "nvme_io": false 
00:29:57.525 }, 00:29:57.525 "driver_specific": { 00:29:57.525 "compress": { 00:29:57.525 "name": "COMP_lvs0/lv0", 00:29:57.525 "base_bdev_name": "685bd8bd-0cb8-4252-8c7b-16643d1bf736" 00:29:57.525 } 00:29:57.525 } 00:29:57.525 } 00:29:57.525 ] 00:29:57.525 16:09:02 compress_isal -- common/autotest_common.sh@906 -- # return 0 00:29:57.525 16:09:02 compress_isal -- compress/compress.sh@59 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:29:57.785 I/O targets: 00:29:57.785 COMP_lvs0/lv0: 200704 blocks of 512 bytes (98 MiB) 00:29:57.785 00:29:57.785 00:29:57.785 CUnit - A unit testing framework for C - Version 2.1-3 00:29:57.785 http://cunit.sourceforge.net/ 00:29:57.785 00:29:57.785 00:29:57.785 Suite: bdevio tests on: COMP_lvs0/lv0 00:29:57.785 Test: blockdev write read block ...passed 00:29:57.785 Test: blockdev write zeroes read block ...passed 00:29:57.785 Test: blockdev write zeroes read no split ...passed 00:29:57.785 Test: blockdev write zeroes read split ...passed 00:29:57.785 Test: blockdev write zeroes read split partial ...passed 00:29:57.785 Test: blockdev reset ...[2024-06-10 16:09:03.157933] vbdev_compress.c: 252:vbdev_compress_submit_request: *ERROR*: Unknown I/O type 5 00:29:57.785 passed 00:29:57.785 Test: blockdev write read 8 blocks ...passed 00:29:57.785 Test: blockdev write read size > 128k ...passed 00:29:57.785 Test: blockdev write read invalid size ...passed 00:29:57.785 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:29:57.785 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:29:57.785 Test: blockdev write read max offset ...passed 00:29:57.785 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:29:57.785 Test: blockdev writev readv 8 blocks ...passed 00:29:57.785 Test: blockdev writev readv 30 x 1block ...passed 00:29:57.785 Test: blockdev writev readv block ...passed 00:29:57.785 Test: blockdev writev readv size > 128k 
...passed 00:29:57.785 Test: blockdev writev readv size > 128k in two iovs ...passed 00:29:57.785 Test: blockdev comparev and writev ...passed 00:29:57.785 Test: blockdev nvme passthru rw ...passed 00:29:57.785 Test: blockdev nvme passthru vendor specific ...passed 00:29:57.785 Test: blockdev nvme admin passthru ...passed 00:29:57.785 Test: blockdev copy ...passed 00:29:57.785 00:29:57.785 Run Summary: Type Total Ran Passed Failed Inactive 00:29:57.785 suites 1 1 n/a 0 0 00:29:57.785 tests 23 23 23 0 0 00:29:57.785 asserts 130 130 130 0 n/a 00:29:57.785 00:29:57.785 Elapsed time = 0.177 seconds 00:29:57.785 0 00:29:57.785 16:09:03 compress_isal -- compress/compress.sh@60 -- # destroy_vols 00:29:57.785 16:09:03 compress_isal -- compress/compress.sh@29 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_compress_delete COMP_lvs0/lv0 00:29:58.044 16:09:03 compress_isal -- compress/compress.sh@30 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py bdev_lvol_delete_lvstore -l lvs0 00:29:58.303 16:09:03 compress_isal -- compress/compress.sh@61 -- # trap - SIGINT SIGTERM EXIT 00:29:58.303 16:09:03 compress_isal -- compress/compress.sh@62 -- # killprocess 2851107 00:29:58.303 16:09:03 compress_isal -- common/autotest_common.sh@949 -- # '[' -z 2851107 ']' 00:29:58.303 16:09:03 compress_isal -- common/autotest_common.sh@953 -- # kill -0 2851107 00:29:58.303 16:09:03 compress_isal -- common/autotest_common.sh@954 -- # uname 00:29:58.303 16:09:03 compress_isal -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:29:58.303 16:09:03 compress_isal -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2851107 00:29:58.303 16:09:03 compress_isal -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:29:58.303 16:09:03 compress_isal -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:29:58.303 16:09:03 compress_isal -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2851107' 00:29:58.303 
killing process with pid 2851107 00:29:58.303 16:09:03 compress_isal -- common/autotest_common.sh@968 -- # kill 2851107 00:29:58.303 16:09:03 compress_isal -- common/autotest_common.sh@973 -- # wait 2851107 00:30:00.209 16:09:05 compress_isal -- compress/compress.sh@91 -- # '[' 0 -eq 1 ']' 00:30:00.209 16:09:05 compress_isal -- compress/compress.sh@120 -- # rm -rf /tmp/pmem 00:30:00.209 00:30:00.209 real 0m46.983s 00:30:00.209 user 1m49.497s 00:30:00.209 sys 0m3.379s 00:30:00.209 16:09:05 compress_isal -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:00.209 16:09:05 compress_isal -- common/autotest_common.sh@10 -- # set +x 00:30:00.209 ************************************ 00:30:00.209 END TEST compress_isal 00:30:00.209 ************************************ 00:30:00.209 16:09:05 -- spdk/autotest.sh@352 -- # '[' 0 -eq 1 ']' 00:30:00.209 16:09:05 -- spdk/autotest.sh@356 -- # '[' 1 -eq 1 ']' 00:30:00.209 16:09:05 -- spdk/autotest.sh@357 -- # run_test blockdev_crypto_aesni /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:30:00.209 16:09:05 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:30:00.209 16:09:05 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:00.209 16:09:05 -- common/autotest_common.sh@10 -- # set +x 00:30:00.209 ************************************ 00:30:00.209 START TEST blockdev_crypto_aesni 00:30:00.209 ************************************ 00:30:00.209 16:09:05 blockdev_crypto_aesni -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_aesni 00:30:00.209 * Looking for test storage... 
00:30:00.209 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:30:00.209 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:30:00.209 16:09:05 blockdev_crypto_aesni -- bdev/nbd_common.sh@6 -- # set -e 00:30:00.209 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:30:00.209 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:00.209 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:30:00.209 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:30:00.209 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:30:00.209 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:30:00.209 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@20 -- # : 00:30:00.209 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:30:00.209 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:30:00.209 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:30:00.210 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # uname -s 00:30:00.210 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:30:00.210 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:30:00.210 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@682 -- # test_type=crypto_aesni 00:30:00.210 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@683 -- # crypto_device= 00:30:00.210 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@684 -- # dek= 00:30:00.210 16:09:05 
blockdev_crypto_aesni -- bdev/blockdev.sh@685 -- # env_ctx= 00:30:00.210 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:30:00.210 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:30:00.210 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == bdev ]] 00:30:00.210 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@690 -- # [[ crypto_aesni == crypto_* ]] 00:30:00.210 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:30:00.210 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:30:00.210 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2852698 00:30:00.210 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:30:00.210 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@49 -- # waitforlisten 2852698 00:30:00.210 16:09:05 blockdev_crypto_aesni -- common/autotest_common.sh@830 -- # '[' -z 2852698 ']' 00:30:00.210 16:09:05 blockdev_crypto_aesni -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:30:00.210 16:09:05 blockdev_crypto_aesni -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:00.210 16:09:05 blockdev_crypto_aesni -- common/autotest_common.sh@835 -- # local max_retries=100 00:30:00.210 16:09:05 blockdev_crypto_aesni -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:00.210 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 
00:30:00.210 16:09:05 blockdev_crypto_aesni -- common/autotest_common.sh@839 -- # xtrace_disable 00:30:00.210 16:09:05 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:00.210 [2024-06-10 16:09:05.498262] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:30:00.210 [2024-06-10 16:09:05.498326] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2852698 ] 00:30:00.210 [2024-06-10 16:09:05.596059] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:00.210 [2024-06-10 16:09:05.693199] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:01.147 16:09:06 blockdev_crypto_aesni -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:30:01.147 16:09:06 blockdev_crypto_aesni -- common/autotest_common.sh@863 -- # return 0 00:30:01.147 16:09:06 blockdev_crypto_aesni -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:30:01.147 16:09:06 blockdev_crypto_aesni -- bdev/blockdev.sh@705 -- # setup_crypto_aesni_conf 00:30:01.147 16:09:06 blockdev_crypto_aesni -- bdev/blockdev.sh@146 -- # rpc_cmd 00:30:01.147 16:09:06 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:01.147 16:09:06 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:01.147 [2024-06-10 16:09:06.451501] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:30:01.147 [2024-06-10 16:09:06.459541] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:01.147 [2024-06-10 16:09:06.467554] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:01.147 [2024-06-10 16:09:06.538823] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto 
devices: 97 00:30:03.681 true 00:30:03.681 true 00:30:03.681 true 00:30:03.681 true 00:30:03.681 Malloc0 00:30:03.681 Malloc1 00:30:03.681 Malloc2 00:30:03.681 Malloc3 00:30:03.681 [2024-06-10 16:09:08.887328] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:30:03.681 crypto_ram 00:30:03.681 [2024-06-10 16:09:08.895348] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:30:03.681 crypto_ram2 00:30:03.681 [2024-06-10 16:09:08.903373] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:30:03.681 crypto_ram3 00:30:03.681 [2024-06-10 16:09:08.911392] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:30:03.681 crypto_ram4 00:30:03.681 16:09:08 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:03.681 16:09:08 blockdev_crypto_aesni -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:30:03.681 16:09:08 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:03.681 16:09:08 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:03.681 16:09:08 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:03.681 16:09:08 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # cat 00:30:03.681 16:09:08 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:30:03.681 16:09:08 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:03.681 16:09:08 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:03.681 16:09:08 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:03.681 16:09:08 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:30:03.681 16:09:08 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:03.681 16:09:08 blockdev_crypto_aesni 
-- common/autotest_common.sh@10 -- # set +x 00:30:03.681 16:09:08 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:03.681 16:09:08 blockdev_crypto_aesni -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:30:03.681 16:09:08 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:03.681 16:09:08 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:03.681 16:09:08 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:03.681 16:09:08 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:30:03.681 16:09:08 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:30:03.681 16:09:08 blockdev_crypto_aesni -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:30:03.681 16:09:08 blockdev_crypto_aesni -- common/autotest_common.sh@560 -- # xtrace_disable 00:30:03.681 16:09:08 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:03.681 16:09:09 blockdev_crypto_aesni -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:30:03.681 16:09:09 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:30:03.681 16:09:09 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # jq -r .name 00:30:03.681 16:09:09 blockdev_crypto_aesni -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "ed4a4d74-3fe9-5e5b-b5f1-53dfdc3e8b42"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ed4a4d74-3fe9-5e5b-b5f1-53dfdc3e8b42",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' 
"nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "d3828e79-16a3-5a32-b9d4-0e1e4010aab6"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d3828e79-16a3-5a32-b9d4-0e1e4010aab6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "93d62114-d3cd-51a7-9cac-f32c4c639afd"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "93d62114-d3cd-51a7-9cac-f32c4c639afd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' 
"write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "8e67e202-0f9b-506b-a5f6-73fa456c7b49"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "8e67e202-0f9b-506b-a5f6-73fa456c7b49",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:30:03.681 16:09:09 blockdev_crypto_aesni -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:30:03.681 16:09:09 blockdev_crypto_aesni -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:30:03.682 16:09:09 blockdev_crypto_aesni -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:30:03.682 16:09:09 blockdev_crypto_aesni -- 
bdev/blockdev.sh@754 -- # killprocess 2852698 00:30:03.682 16:09:09 blockdev_crypto_aesni -- common/autotest_common.sh@949 -- # '[' -z 2852698 ']' 00:30:03.682 16:09:09 blockdev_crypto_aesni -- common/autotest_common.sh@953 -- # kill -0 2852698 00:30:03.682 16:09:09 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # uname 00:30:03.682 16:09:09 blockdev_crypto_aesni -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:30:03.682 16:09:09 blockdev_crypto_aesni -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2852698 00:30:03.682 16:09:09 blockdev_crypto_aesni -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:30:03.682 16:09:09 blockdev_crypto_aesni -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:30:03.682 16:09:09 blockdev_crypto_aesni -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2852698' 00:30:03.682 killing process with pid 2852698 00:30:03.682 16:09:09 blockdev_crypto_aesni -- common/autotest_common.sh@968 -- # kill 2852698 00:30:03.682 16:09:09 blockdev_crypto_aesni -- common/autotest_common.sh@973 -- # wait 2852698 00:30:04.249 16:09:09 blockdev_crypto_aesni -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:30:04.249 16:09:09 blockdev_crypto_aesni -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:30:04.249 16:09:09 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:30:04.249 16:09:09 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:04.249 16:09:09 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:04.249 ************************************ 00:30:04.249 START TEST bdev_hello_world 00:30:04.249 ************************************ 00:30:04.249 16:09:09 blockdev_crypto_aesni.bdev_hello_world -- 
common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:30:04.249 [2024-06-10 16:09:09.671975] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:30:04.249 [2024-06-10 16:09:09.672026] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2853680 ] 00:30:04.507 [2024-06-10 16:09:09.769109] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:04.507 [2024-06-10 16:09:09.860585] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:04.507 [2024-06-10 16:09:09.881873] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:30:04.507 [2024-06-10 16:09:09.889908] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:04.507 [2024-06-10 16:09:09.897920] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:04.507 [2024-06-10 16:09:10.007994] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:30:07.038 [2024-06-10 16:09:12.195843] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:30:07.038 [2024-06-10 16:09:12.195906] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:07.038 [2024-06-10 16:09:12.195918] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:07.038 [2024-06-10 16:09:12.203861] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:30:07.038 [2024-06-10 16:09:12.203879] bdev.c:8114:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: Malloc1 00:30:07.038 [2024-06-10 16:09:12.203888] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:07.038 [2024-06-10 16:09:12.211881] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:30:07.038 [2024-06-10 16:09:12.211898] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:07.038 [2024-06-10 16:09:12.211906] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:07.038 [2024-06-10 16:09:12.219902] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:30:07.038 [2024-06-10 16:09:12.219918] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:07.038 [2024-06-10 16:09:12.219927] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:07.038 [2024-06-10 16:09:12.292425] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:30:07.038 [2024-06-10 16:09:12.292464] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:30:07.038 [2024-06-10 16:09:12.292480] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:30:07.038 [2024-06-10 16:09:12.293827] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:30:07.038 [2024-06-10 16:09:12.293903] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:30:07.038 [2024-06-10 16:09:12.293918] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:30:07.038 [2024-06-10 16:09:12.293981] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:30:07.038 00:30:07.038 [2024-06-10 16:09:12.294000] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:30:07.297 00:30:07.297 real 0m3.005s 00:30:07.297 user 0m2.695s 00:30:07.297 sys 0m0.271s 00:30:07.297 16:09:12 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:07.297 16:09:12 blockdev_crypto_aesni.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:30:07.297 ************************************ 00:30:07.297 END TEST bdev_hello_world 00:30:07.297 ************************************ 00:30:07.297 16:09:12 blockdev_crypto_aesni -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:30:07.297 16:09:12 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:30:07.297 16:09:12 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:07.297 16:09:12 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:07.297 ************************************ 00:30:07.297 START TEST bdev_bounds 00:30:07.297 ************************************ 00:30:07.297 16:09:12 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1124 -- # bdev_bounds '' 00:30:07.297 16:09:12 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2854376 00:30:07.297 16:09:12 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:30:07.297 16:09:12 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:30:07.297 16:09:12 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2854376' 00:30:07.297 Process bdevio pid: 2854376 00:30:07.297 16:09:12 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2854376 00:30:07.297 16:09:12 
blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@830 -- # '[' -z 2854376 ']' 00:30:07.297 16:09:12 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:30:07.297 16:09:12 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@835 -- # local max_retries=100 00:30:07.297 16:09:12 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:30:07.297 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:30:07.297 16:09:12 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@839 -- # xtrace_disable 00:30:07.297 16:09:12 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:30:07.297 [2024-06-10 16:09:12.751240] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:30:07.297 [2024-06-10 16:09:12.751294] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2854376 ] 00:30:07.556 [2024-06-10 16:09:12.849583] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:30:07.556 [2024-06-10 16:09:12.945645] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:30:07.556 [2024-06-10 16:09:12.945742] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:30:07.556 [2024-06-10 16:09:12.945743] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:07.556 [2024-06-10 16:09:12.967087] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:30:07.556 [2024-06-10 16:09:12.975112] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:07.556 [2024-06-10 16:09:12.983132] accel_rpc.c: 
167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:07.815 [2024-06-10 16:09:13.090115] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:30:10.349 [2024-06-10 16:09:15.273220] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:30:10.349 [2024-06-10 16:09:15.273289] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:10.349 [2024-06-10 16:09:15.273302] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:10.349 [2024-06-10 16:09:15.281240] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:30:10.349 [2024-06-10 16:09:15.281258] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:10.349 [2024-06-10 16:09:15.281268] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:10.349 [2024-06-10 16:09:15.289263] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:30:10.349 [2024-06-10 16:09:15.289279] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:10.349 [2024-06-10 16:09:15.289288] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:10.349 [2024-06-10 16:09:15.297285] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:30:10.349 [2024-06-10 16:09:15.297307] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:10.349 [2024-06-10 16:09:15.297315] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:10.349 16:09:15 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:30:10.349 16:09:15 
blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@863 -- # return 0 00:30:10.349 16:09:15 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:30:10.349 I/O targets: 00:30:10.349 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:30:10.349 crypto_ram2: 65536 blocks of 512 bytes (32 MiB) 00:30:10.349 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:30:10.349 crypto_ram4: 8192 blocks of 4096 bytes (32 MiB) 00:30:10.349 00:30:10.349 00:30:10.349 CUnit - A unit testing framework for C - Version 2.1-3 00:30:10.349 http://cunit.sourceforge.net/ 00:30:10.349 00:30:10.349 00:30:10.349 Suite: bdevio tests on: crypto_ram4 00:30:10.349 Test: blockdev write read block ...passed 00:30:10.349 Test: blockdev write zeroes read block ...passed 00:30:10.349 Test: blockdev write zeroes read no split ...passed 00:30:10.349 Test: blockdev write zeroes read split ...passed 00:30:10.349 Test: blockdev write zeroes read split partial ...passed 00:30:10.349 Test: blockdev reset ...passed 00:30:10.349 Test: blockdev write read 8 blocks ...passed 00:30:10.349 Test: blockdev write read size > 128k ...passed 00:30:10.349 Test: blockdev write read invalid size ...passed 00:30:10.349 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:10.349 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:10.349 Test: blockdev write read max offset ...passed 00:30:10.349 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:10.349 Test: blockdev writev readv 8 blocks ...passed 00:30:10.349 Test: blockdev writev readv 30 x 1block ...passed 00:30:10.349 Test: blockdev writev readv block ...passed 00:30:10.349 Test: blockdev writev readv size > 128k ...passed 00:30:10.349 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:10.349 Test: blockdev comparev and writev ...passed 00:30:10.349 Test: blockdev nvme 
passthru rw ...passed 00:30:10.349 Test: blockdev nvme passthru vendor specific ...passed 00:30:10.349 Test: blockdev nvme admin passthru ...passed 00:30:10.349 Test: blockdev copy ...passed 00:30:10.349 Suite: bdevio tests on: crypto_ram3 00:30:10.349 Test: blockdev write read block ...passed 00:30:10.349 Test: blockdev write zeroes read block ...passed 00:30:10.349 Test: blockdev write zeroes read no split ...passed 00:30:10.349 Test: blockdev write zeroes read split ...passed 00:30:10.349 Test: blockdev write zeroes read split partial ...passed 00:30:10.349 Test: blockdev reset ...passed 00:30:10.349 Test: blockdev write read 8 blocks ...passed 00:30:10.349 Test: blockdev write read size > 128k ...passed 00:30:10.349 Test: blockdev write read invalid size ...passed 00:30:10.349 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:10.349 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:10.349 Test: blockdev write read max offset ...passed 00:30:10.349 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:10.349 Test: blockdev writev readv 8 blocks ...passed 00:30:10.349 Test: blockdev writev readv 30 x 1block ...passed 00:30:10.349 Test: blockdev writev readv block ...passed 00:30:10.349 Test: blockdev writev readv size > 128k ...passed 00:30:10.349 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:10.349 Test: blockdev comparev and writev ...passed 00:30:10.349 Test: blockdev nvme passthru rw ...passed 00:30:10.349 Test: blockdev nvme passthru vendor specific ...passed 00:30:10.349 Test: blockdev nvme admin passthru ...passed 00:30:10.349 Test: blockdev copy ...passed 00:30:10.349 Suite: bdevio tests on: crypto_ram2 00:30:10.349 Test: blockdev write read block ...passed 00:30:10.349 Test: blockdev write zeroes read block ...passed 00:30:10.349 Test: blockdev write zeroes read no split ...passed 00:30:10.349 Test: blockdev write zeroes read split ...passed 
00:30:10.349 Test: blockdev write zeroes read split partial ...passed 00:30:10.349 Test: blockdev reset ...passed 00:30:10.349 Test: blockdev write read 8 blocks ...passed 00:30:10.349 Test: blockdev write read size > 128k ...passed 00:30:10.349 Test: blockdev write read invalid size ...passed 00:30:10.349 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:10.349 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:10.349 Test: blockdev write read max offset ...passed 00:30:10.349 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:10.349 Test: blockdev writev readv 8 blocks ...passed 00:30:10.349 Test: blockdev writev readv 30 x 1block ...passed 00:30:10.349 Test: blockdev writev readv block ...passed 00:30:10.349 Test: blockdev writev readv size > 128k ...passed 00:30:10.349 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:10.349 Test: blockdev comparev and writev ...passed 00:30:10.349 Test: blockdev nvme passthru rw ...passed 00:30:10.349 Test: blockdev nvme passthru vendor specific ...passed 00:30:10.349 Test: blockdev nvme admin passthru ...passed 00:30:10.349 Test: blockdev copy ...passed 00:30:10.349 Suite: bdevio tests on: crypto_ram 00:30:10.349 Test: blockdev write read block ...passed 00:30:10.349 Test: blockdev write zeroes read block ...passed 00:30:10.349 Test: blockdev write zeroes read no split ...passed 00:30:10.349 Test: blockdev write zeroes read split ...passed 00:30:10.349 Test: blockdev write zeroes read split partial ...passed 00:30:10.349 Test: blockdev reset ...passed 00:30:10.349 Test: blockdev write read 8 blocks ...passed 00:30:10.349 Test: blockdev write read size > 128k ...passed 00:30:10.349 Test: blockdev write read invalid size ...passed 00:30:10.349 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:30:10.349 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:30:10.349 Test: blockdev 
write read max offset ...passed 00:30:10.349 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:30:10.349 Test: blockdev writev readv 8 blocks ...passed 00:30:10.349 Test: blockdev writev readv 30 x 1block ...passed 00:30:10.349 Test: blockdev writev readv block ...passed 00:30:10.349 Test: blockdev writev readv size > 128k ...passed 00:30:10.349 Test: blockdev writev readv size > 128k in two iovs ...passed 00:30:10.349 Test: blockdev comparev and writev ...passed 00:30:10.349 Test: blockdev nvme passthru rw ...passed 00:30:10.349 Test: blockdev nvme passthru vendor specific ...passed 00:30:10.349 Test: blockdev nvme admin passthru ...passed 00:30:10.349 Test: blockdev copy ...passed 00:30:10.349 00:30:10.349 Run Summary: Type Total Ran Passed Failed Inactive 00:30:10.349 suites 4 4 n/a 0 0 00:30:10.349 tests 92 92 92 0 0 00:30:10.349 asserts 520 520 520 0 n/a 00:30:10.349 00:30:10.349 Elapsed time = 0.535 seconds 00:30:10.349 0 00:30:10.610 16:09:15 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2854376 00:30:10.610 16:09:15 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@949 -- # '[' -z 2854376 ']' 00:30:10.610 16:09:15 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@953 -- # kill -0 2854376 00:30:10.610 16:09:15 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # uname 00:30:10.610 16:09:15 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:30:10.610 16:09:15 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2854376 00:30:10.610 16:09:15 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:30:10.610 16:09:15 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:30:10.610 16:09:15 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@967 -- # echo 'killing process with 
pid 2854376' 00:30:10.610 killing process with pid 2854376 00:30:10.610 16:09:15 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@968 -- # kill 2854376 00:30:10.610 16:09:15 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@973 -- # wait 2854376 00:30:10.869 16:09:16 blockdev_crypto_aesni.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:30:10.869 00:30:10.869 real 0m3.570s 00:30:10.869 user 0m10.184s 00:30:10.869 sys 0m0.478s 00:30:10.869 16:09:16 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:10.869 16:09:16 blockdev_crypto_aesni.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:30:10.869 ************************************ 00:30:10.869 END TEST bdev_bounds 00:30:10.869 ************************************ 00:30:10.869 16:09:16 blockdev_crypto_aesni -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:30:10.869 16:09:16 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:30:10.869 16:09:16 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:10.869 16:09:16 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:10.869 ************************************ 00:30:10.869 START TEST bdev_nbd 00:30:10.869 ************************************ 00:30:10.869 16:09:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1124 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '' 00:30:10.869 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:30:10.869 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:30:10.869 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@302 -- # local 
rpc_server=/var/tmp/spdk-nbd.sock 00:30:10.869 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:30:10.869 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:10.869 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:30:10.869 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:30:10.869 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:30:10.870 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:30:10.870 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:30:10.870 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:30:10.870 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:10.870 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:30:10.870 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:10.870 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:30:10.870 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2854944 00:30:10.870 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:30:10.870 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@317 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:30:10.870 16:09:16 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2854944 /var/tmp/spdk-nbd.sock 00:30:10.870 16:09:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@830 -- # '[' -z 2854944 ']' 00:30:10.870 16:09:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:30:10.870 16:09:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@835 -- # local max_retries=100 00:30:10.870 16:09:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:30:10.870 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:30:10.870 16:09:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@839 -- # xtrace_disable 00:30:10.870 16:09:16 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:30:11.128 [2024-06-10 16:09:16.404832] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:30:11.128 [2024-06-10 16:09:16.404895] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:30:11.128 [2024-06-10 16:09:16.504759] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:30:11.128 [2024-06-10 16:09:16.595256] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:11.128 [2024-06-10 16:09:16.616681] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:30:11.128 [2024-06-10 16:09:16.624707] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:11.128 [2024-06-10 16:09:16.632721] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:11.387 [2024-06-10 16:09:16.735056] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:30:13.919 [2024-06-10 16:09:18.925248] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:30:13.919 [2024-06-10 16:09:18.925312] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:13.919 [2024-06-10 16:09:18.925325] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:13.919 [2024-06-10 16:09:18.933267] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:30:13.919 [2024-06-10 16:09:18.933285] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:13.919 [2024-06-10 16:09:18.933295] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:13.919 [2024-06-10 16:09:18.941288] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:30:13.919 
[2024-06-10 16:09:18.941304] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:13.919 [2024-06-10 16:09:18.941312] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:13.919 [2024-06-10 16:09:18.949310] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:30:13.919 [2024-06-10 16:09:18.949325] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:13.919 [2024-06-10 16:09:18.949334] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:13.919 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:30:13.919 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@863 -- # return 0 00:30:13.919 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:30:13.919 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:13.919 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:13.919 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:30:13.919 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' 00:30:13.919 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:13.919 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:13.919 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:30:13.919 
16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:30:13.919 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:30:13.919 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:30:13.919 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:13.919 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:30:13.919 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:30:13.919 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:30:13.920 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:30:13.920 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:30:13.920 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:30:13.920 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:30:13.920 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:30:13.920 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:30:13.920 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:30:13.920 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:30:13.920 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:30:13.920 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:13.920 1+0 records in 00:30:13.920 1+0 records out 00:30:13.920 4096 
bytes (4.1 kB, 4.0 KiB) copied, 0.000265094 s, 15.5 MB/s 00:30:13.920 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:13.920 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:30:13.920 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:13.920 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:30:13.920 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:30:13.920 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:13.920 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:13.920 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:30:14.177 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:30:14.177 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:30:14.177 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:30:14.177 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:30:14.436 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:30:14.436 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:30:14.436 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:30:14.436 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:30:14.436 16:09:19 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@872 -- # break 00:30:14.436 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:30:14.436 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:30:14.436 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:14.436 1+0 records in 00:30:14.436 1+0 records out 00:30:14.436 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000238431 s, 17.2 MB/s 00:30:14.436 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:14.436 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:30:14.436 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:14.436 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:30:14.436 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:30:14.436 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:14.436 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:14.436 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:30:14.695 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:30:14.695 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:30:14.695 16:09:19 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:30:14.695 16:09:19 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@867 -- # local nbd_name=nbd2 00:30:14.695 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:30:14.695 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:30:14.695 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:30:14.695 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd2 /proc/partitions 00:30:14.695 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:30:14.695 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:30:14.695 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:30:14.695 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:14.695 1+0 records in 00:30:14.695 1+0 records out 00:30:14.695 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000202987 s, 20.2 MB/s 00:30:14.695 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:14.695 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:30:14.695 16:09:19 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:14.695 16:09:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:30:14.695 16:09:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:30:14.695 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:14.695 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:14.695 16:09:20 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd3 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd3 /proc/partitions 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:14.956 1+0 records in 00:30:14.956 1+0 records out 00:30:14.956 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000265529 s, 15.4 MB/s 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:30:14.956 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:15.281 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:30:15.281 { 00:30:15.281 "nbd_device": "/dev/nbd0", 00:30:15.281 "bdev_name": "crypto_ram" 00:30:15.281 }, 00:30:15.281 { 00:30:15.281 "nbd_device": "/dev/nbd1", 00:30:15.281 "bdev_name": "crypto_ram2" 00:30:15.281 }, 00:30:15.281 { 00:30:15.281 "nbd_device": "/dev/nbd2", 00:30:15.281 "bdev_name": "crypto_ram3" 00:30:15.281 }, 00:30:15.281 { 00:30:15.281 "nbd_device": "/dev/nbd3", 00:30:15.281 "bdev_name": "crypto_ram4" 00:30:15.281 } 00:30:15.281 ]' 00:30:15.281 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:30:15.281 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:30:15.281 { 00:30:15.281 "nbd_device": "/dev/nbd0", 00:30:15.281 "bdev_name": "crypto_ram" 00:30:15.281 }, 00:30:15.281 { 00:30:15.281 "nbd_device": "/dev/nbd1", 00:30:15.281 "bdev_name": "crypto_ram2" 00:30:15.281 }, 00:30:15.281 { 00:30:15.281 "nbd_device": "/dev/nbd2", 00:30:15.281 "bdev_name": "crypto_ram3" 00:30:15.281 }, 00:30:15.281 { 00:30:15.281 "nbd_device": "/dev/nbd3", 00:30:15.281 "bdev_name": "crypto_ram4" 00:30:15.281 } 00:30:15.281 ]' 00:30:15.281 16:09:20 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:30:15.281 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:30:15.281 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:15.281 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:30:15.281 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:15.281 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:15.281 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:15.281 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:15.541 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:15.541 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:15.541 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:15.541 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:15.541 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:15.541 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:15.541 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:15.541 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:15.541 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:15.541 16:09:20 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- 
# /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:30:15.799 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:30:15.799 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:30:15.799 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:15.799 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:15.799 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:15.799 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:15.799 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:15.799 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:15.799 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:15.799 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:30:16.058 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:30:16.058 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:30:16.058 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:30:16.059 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:16.059 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:16.059 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:30:16.059 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:16.059 16:09:21 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@45 -- # return 0 00:30:16.059 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:16.059 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:30:16.318 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:30:16.318 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:30:16.318 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:30:16.318 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:16.318 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:16.318 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:30:16.318 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:16.318 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:16.318 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:30:16.318 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:16.318 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:30:16.577 16:09:21 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram2 crypto_ram3 crypto_ram4' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 
'crypto_ram2' 'crypto_ram3' 'crypto_ram4') 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:16.577 16:09:21 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:30:16.835 /dev/nbd0 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:16.835 1+0 records in 00:30:16.835 1+0 records out 00:30:16.835 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000249058 s, 16.4 MB/s 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:16.835 16:09:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd1 00:30:17.094 /dev/nbd1 00:30:17.094 16:09:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:30:17.094 16:09:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:30:17.095 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:30:17.095 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:30:17.095 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:30:17.095 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:30:17.095 16:09:22 
blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:30:17.095 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:30:17.095 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:30:17.095 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:30:17.095 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:17.095 1+0 records in 00:30:17.095 1+0 records out 00:30:17.095 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000281651 s, 14.5 MB/s 00:30:17.095 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:17.095 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:30:17.095 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:17.095 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:30:17.095 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:30:17.095 16:09:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:17.095 16:09:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:17.095 16:09:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd10 00:30:17.354 /dev/nbd10 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # 
waitfornbd nbd10 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd10 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd10 /proc/partitions 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:17.354 1+0 records in 00:30:17.354 1+0 records out 00:30:17.354 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000294002 s, 13.9 MB/s 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:17.354 16:09:22 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram4 /dev/nbd11 00:30:17.613 /dev/nbd11 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd11 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd11 /proc/partitions 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:30:17.613 1+0 records in 00:30:17.613 1+0 records out 00:30:17.613 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00025334 s, 16.2 MB/s 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:17.613 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:30:17.873 { 00:30:17.873 "nbd_device": "/dev/nbd0", 00:30:17.873 "bdev_name": "crypto_ram" 00:30:17.873 }, 00:30:17.873 { 00:30:17.873 "nbd_device": "/dev/nbd1", 00:30:17.873 "bdev_name": "crypto_ram2" 00:30:17.873 }, 00:30:17.873 { 00:30:17.873 "nbd_device": "/dev/nbd10", 00:30:17.873 "bdev_name": "crypto_ram3" 00:30:17.873 }, 00:30:17.873 { 00:30:17.873 "nbd_device": "/dev/nbd11", 00:30:17.873 "bdev_name": "crypto_ram4" 00:30:17.873 } 00:30:17.873 ]' 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:30:17.873 { 00:30:17.873 "nbd_device": "/dev/nbd0", 00:30:17.873 "bdev_name": "crypto_ram" 00:30:17.873 }, 00:30:17.873 { 00:30:17.873 "nbd_device": "/dev/nbd1", 00:30:17.873 "bdev_name": "crypto_ram2" 00:30:17.873 }, 00:30:17.873 { 00:30:17.873 "nbd_device": "/dev/nbd10", 00:30:17.873 "bdev_name": "crypto_ram3" 00:30:17.873 }, 00:30:17.873 { 00:30:17.873 "nbd_device": "/dev/nbd11", 00:30:17.873 
"bdev_name": "crypto_ram4" 00:30:17.873 } 00:30:17.873 ]' 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:30:17.873 /dev/nbd1 00:30:17.873 /dev/nbd10 00:30:17.873 /dev/nbd11' 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:30:17.873 /dev/nbd1 00:30:17.873 /dev/nbd10 00:30:17.873 /dev/nbd11' 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:30:17.873 256+0 records in 00:30:17.873 
256+0 records out 00:30:17.873 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0102165 s, 103 MB/s 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:17.873 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:30:18.132 256+0 records in 00:30:18.132 256+0 records out 00:30:18.132 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.043119 s, 24.3 MB/s 00:30:18.132 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:18.132 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:30:18.132 256+0 records in 00:30:18.132 256+0 records out 00:30:18.132 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0475478 s, 22.1 MB/s 00:30:18.132 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:18.132 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:30:18.132 256+0 records in 00:30:18.132 256+0 records out 00:30:18.132 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0422165 s, 24.8 MB/s 00:30:18.132 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:30:18.132 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:30:18.133 256+0 records in 00:30:18.133 256+0 records out 00:30:18.133 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0392586 s, 26.7 MB/s 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify 
'/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@85 
-- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:18.133 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:18.392 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:18.392 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:18.392 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:18.392 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:18.392 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:18.392 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:18.392 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:18.392 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:18.392 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:18.392 16:09:23 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:30:18.651 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:30:18.651 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:30:18.651 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:30:18.651 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:18.651 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:18.651 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:30:18.651 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:18.651 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:18.651 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:18.651 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:30:18.911 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:30:18.911 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:30:18.911 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:30:18.911 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:18.911 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:18.911 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:30:18.911 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:18.911 16:09:24 blockdev_crypto_aesni.bdev_nbd -- 
bdev/nbd_common.sh@45 -- # return 0 00:30:18.911 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:18.911 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:30:19.171 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:30:19.171 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:30:19.171 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:30:19.171 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:19.171 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:19.171 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:30:19.171 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:19.171 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:19.171 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:30:19.171 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:19.171 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:30:19.430 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:30:19.430 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:30:19.430 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:30:19.689 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:30:19.689 16:09:24 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:30:19.689 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:30:19.689 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:30:19.689 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:30:19.689 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:30:19.689 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:30:19.689 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:30:19.689 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:30:19.689 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:30:19.689 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:19.689 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:30:19.689 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:30:19.689 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:30:19.689 16:09:24 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:30:19.948 malloc_lvol_verify 00:30:19.948 16:09:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:30:20.206 d542fdc0-715b-4510-ba81-268ee3cc004d 00:30:20.206 16:09:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@137 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:30:20.464 1de17d25-8136-4b93-92cf-91e5a9520fb2 00:30:20.465 16:09:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:30:20.724 /dev/nbd0 00:30:20.724 16:09:25 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:30:20.724 mke2fs 1.46.5 (30-Dec-2021) 00:30:20.724 Discarding device blocks: 0/4096 done 00:30:20.724 Creating filesystem with 4096 1k blocks and 1024 inodes 00:30:20.724 00:30:20.724 Allocating group tables: 0/1 done 00:30:20.724 Writing inode tables: 0/1 done 00:30:20.724 Creating journal (1024 blocks): done 00:30:20.724 Writing superblocks and filesystem accounting information: 0/1 done 00:30:20.724 00:30:20.724 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:30:20.724 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:30:20.724 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:30:20.724 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:30:20.724 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:30:20.724 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:30:20.724 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:30:20.724 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:30:20.983 16:09:26 
blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2854944 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@949 -- # '[' -z 2854944 ']' 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@953 -- # kill -0 2854944 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # uname 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2854944 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2854944' 00:30:20.983 killing process with pid 2854944 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- 
common/autotest_common.sh@968 -- # kill 2854944 00:30:20.983 16:09:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@973 -- # wait 2854944 00:30:21.243 16:09:26 blockdev_crypto_aesni.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:30:21.243 00:30:21.243 real 0m10.346s 00:30:21.243 user 0m14.588s 00:30:21.243 sys 0m3.163s 00:30:21.243 16:09:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:21.243 16:09:26 blockdev_crypto_aesni.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:30:21.243 ************************************ 00:30:21.243 END TEST bdev_nbd 00:30:21.243 ************************************ 00:30:21.243 16:09:26 blockdev_crypto_aesni -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:30:21.243 16:09:26 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = nvme ']' 00:30:21.243 16:09:26 blockdev_crypto_aesni -- bdev/blockdev.sh@764 -- # '[' crypto_aesni = gpt ']' 00:30:21.243 16:09:26 blockdev_crypto_aesni -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:30:21.243 16:09:26 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:30:21.243 16:09:26 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:21.243 16:09:26 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:21.243 ************************************ 00:30:21.243 START TEST bdev_fio 00:30:21.243 ************************************ 00:30:21.243 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1124 -- # fio_test_suite '' 00:30:21.243 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:30:21.243 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:30:21.243 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:21.243 
16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:30:21.243 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:30:21.243 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:30:21.502 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:30:21.502 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:30:21.502 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:21.502 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=verify 00:30:21.502 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type=AIO 00:30:21.502 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:30:21.502 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:30:21.502 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:30:21.502 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z verify ']' 00:30:21.502 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:30:21.502 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1312 -- # '[' verify == verify ']' 
00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1313 -- # cat 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1322 -- # '[' AIO == AIO ']' 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # /usr/src/fio/fio --version 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1323 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1324 -- # echo serialize_overlap=1 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram4]' 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram4 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 
--bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:30:21.503 ************************************ 00:30:21.503 START TEST bdev_fio_rw_verify 00:30:21.503 ************************************ 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 
--aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # local sanitizers 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # shift 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # local asan_lib= 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libasan 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:30:21.503 16:09:26 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:21.762 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:21.762 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:21.762 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:21.762 job_crypto_ram4: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:21.762 fio-3.35 00:30:21.762 Starting 4 threads 00:30:36.674 00:30:36.674 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2857372: Mon Jun 10 16:09:39 2024 00:30:36.674 read: IOPS=20.3k, BW=79.5MiB/s 
(83.4MB/s)(795MiB/10001msec) 00:30:36.674 slat (usec): min=18, max=1108, avg=64.67, stdev=30.98 00:30:36.674 clat (usec): min=13, max=1863, avg=353.67, stdev=202.69 00:30:36.674 lat (usec): min=44, max=2039, avg=418.34, stdev=219.03 00:30:36.674 clat percentiles (usec): 00:30:36.674 | 50.000th=[ 306], 99.000th=[ 889], 99.900th=[ 1037], 99.990th=[ 1450], 00:30:36.674 | 99.999th=[ 1795] 00:30:36.674 write: IOPS=22.4k, BW=87.6MiB/s (91.9MB/s)(856MiB/9771msec); 0 zone resets 00:30:36.674 slat (usec): min=26, max=373, avg=77.26, stdev=28.63 00:30:36.674 clat (usec): min=38, max=1509, avg=424.88, stdev=235.20 00:30:36.674 lat (usec): min=84, max=1660, avg=502.14, stdev=248.83 00:30:36.674 clat percentiles (usec): 00:30:36.674 | 50.000th=[ 383], 99.000th=[ 1123], 99.900th=[ 1287], 99.990th=[ 1369], 00:30:36.674 | 99.999th=[ 1483] 00:30:36.674 bw ( KiB/s): min=63304, max=120024, per=98.01%, avg=87954.53, stdev=4719.61, samples=76 00:30:36.674 iops : min=15826, max=30006, avg=21988.63, stdev=1179.90, samples=76 00:30:36.674 lat (usec) : 20=0.01%, 50=0.01%, 100=4.15%, 250=27.66%, 500=40.15% 00:30:36.674 lat (usec) : 750=20.54%, 1000=6.13% 00:30:36.674 lat (msec) : 2=1.36% 00:30:36.674 cpu : usr=99.61%, sys=0.00%, ctx=71, majf=0, minf=212 00:30:36.674 IO depths : 1=9.9%, 2=25.6%, 4=51.3%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:30:36.674 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:36.674 complete : 0=0.0%, 4=88.7%, 8=11.3%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:36.674 issued rwts: total=203524,219215,0,0 short=0,0,0,0 dropped=0,0,0,0 00:30:36.674 latency : target=0, window=0, percentile=100.00%, depth=8 00:30:36.674 00:30:36.674 Run status group 0 (all jobs): 00:30:36.674 READ: bw=79.5MiB/s (83.4MB/s), 79.5MiB/s-79.5MiB/s (83.4MB/s-83.4MB/s), io=795MiB (834MB), run=10001-10001msec 00:30:36.674 WRITE: bw=87.6MiB/s (91.9MB/s), 87.6MiB/s-87.6MiB/s (91.9MB/s-91.9MB/s), io=856MiB (898MB), run=9771-9771msec 00:30:36.674 00:30:36.674 real 
0m13.403s 00:30:36.674 user 0m50.259s 00:30:36.674 sys 0m0.391s 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:30:36.674 ************************************ 00:30:36.674 END TEST bdev_fio_rw_verify 00:30:36.674 ************************************ 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=trim 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type= 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z trim ']' 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1298 -- # touch 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1312 -- # '[' trim == verify ']' 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1327 -- # '[' trim == trim ']' 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1328 -- # echo rw=trimwrite 00:30:36.674 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "ed4a4d74-3fe9-5e5b-b5f1-53dfdc3e8b42"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ed4a4d74-3fe9-5e5b-b5f1-53dfdc3e8b42",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "d3828e79-16a3-5a32-b9d4-0e1e4010aab6"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": 
"d3828e79-16a3-5a32-b9d4-0e1e4010aab6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "93d62114-d3cd-51a7-9cac-f32c4c639afd"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "93d62114-d3cd-51a7-9cac-f32c4c639afd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' 
"8e67e202-0f9b-506b-a5f6-73fa456c7b49"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "8e67e202-0f9b-506b-a5f6-73fa456c7b49",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:30:36.675 crypto_ram2 00:30:36.675 crypto_ram3 00:30:36.675 crypto_ram4 ]] 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "ed4a4d74-3fe9-5e5b-b5f1-53dfdc3e8b42"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "ed4a4d74-3fe9-5e5b-b5f1-53dfdc3e8b42",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' 
"compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_aesni_cbc_1"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "d3828e79-16a3-5a32-b9d4-0e1e4010aab6"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "d3828e79-16a3-5a32-b9d4-0e1e4010aab6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_aesni_cbc_2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "93d62114-d3cd-51a7-9cac-f32c4c639afd"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "93d62114-d3cd-51a7-9cac-f32c4c639afd",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": 
true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_aesni_cbc_3"' ' }' ' }' '}' '{' ' "name": "crypto_ram4",' ' "aliases": [' ' "8e67e202-0f9b-506b-a5f6-73fa456c7b49"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "8e67e202-0f9b-506b-a5f6-73fa456c7b49",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram4",' ' "key_name": "test_dek_aesni_cbc_4"' ' }' ' }' '}' 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:30:36.675 16:09:40 
blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram4]' 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram4 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio -- 
common/autotest_common.sh@10 -- # set +x 00:30:36.675 ************************************ 00:30:36.675 START TEST bdev_fio_trim 00:30:36.675 ************************************ 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local sanitizers 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # shift 00:30:36.675 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # local asan_lib= 00:30:36.675 16:09:40 
blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:30:36.676 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:36.676 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libasan 00:30:36.676 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:30:36.676 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:30:36.676 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:30:36.676 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:30:36.676 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:30:36.676 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:30:36.676 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:30:36.676 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:30:36.676 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:30:36.676 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:30:36.676 16:09:40 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:30:36.676 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:36.676 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:36.676 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:36.676 job_crypto_ram4: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:30:36.676 fio-3.35 00:30:36.676 Starting 4 threads 00:30:48.883 00:30:48.883 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2859693: Mon Jun 10 16:09:53 2024 00:30:48.883 write: IOPS=44.9k, BW=175MiB/s (184MB/s)(1754MiB/10001msec); 0 zone resets 00:30:48.883 slat (usec): min=11, max=441, avg=51.18, stdev=39.63 00:30:48.883 clat (usec): min=30, max=1858, avg=224.85, stdev=183.46 00:30:48.883 lat (usec): min=45, max=2118, avg=276.03, stdev=212.50 00:30:48.883 clat percentiles (usec): 00:30:48.883 | 50.000th=[ 174], 99.000th=[ 1020], 99.900th=[ 1205], 99.990th=[ 1303], 00:30:48.883 | 99.999th=[ 1647] 00:30:48.883 bw ( KiB/s): min=174120, max=240044, per=100.00%, avg=179884.42, stdev=3832.58, samples=76 00:30:48.883 iops : min=43530, max=60011, avg=44971.11, stdev=958.15, samples=76 00:30:48.883 trim: IOPS=44.9k, BW=175MiB/s (184MB/s)(1754MiB/10001msec); 0 zone resets 00:30:48.883 slat (usec): min=4, max=242, avg=13.89, stdev= 7.32 00:30:48.883 clat (usec): min=43, max=1395, avg=211.99, stdev=121.28 00:30:48.883 lat (usec): min=50, max=1405, avg=225.88, stdev=125.00 00:30:48.883 clat percentiles (usec): 00:30:48.883 | 50.000th=[ 188], 99.000th=[ 709], 99.900th=[ 816], 
99.990th=[ 881], 00:30:48.883 | 99.999th=[ 1237] 00:30:48.883 bw ( KiB/s): min=174128, max=240060, per=100.00%, avg=179885.68, stdev=3833.42, samples=76 00:30:48.883 iops : min=43532, max=60015, avg=44971.42, stdev=958.36, samples=76 00:30:48.883 lat (usec) : 50=1.28%, 100=12.56%, 250=61.02%, 500=19.33%, 750=4.05% 00:30:48.883 lat (usec) : 1000=1.17% 00:30:48.883 lat (msec) : 2=0.59% 00:30:48.883 cpu : usr=99.67%, sys=0.00%, ctx=98, majf=0, minf=105 00:30:48.883 IO depths : 1=8.4%, 2=26.2%, 4=52.3%, 8=13.1%, 16=0.0%, 32=0.0%, >=64=0.0% 00:30:48.883 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:48.883 complete : 0=0.0%, 4=88.4%, 8=11.6%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:30:48.883 issued rwts: total=0,448996,448996,0 short=0,0,0,0 dropped=0,0,0,0 00:30:48.883 latency : target=0, window=0, percentile=100.00%, depth=8 00:30:48.883 00:30:48.883 Run status group 0 (all jobs): 00:30:48.883 WRITE: bw=175MiB/s (184MB/s), 175MiB/s-175MiB/s (184MB/s-184MB/s), io=1754MiB (1839MB), run=10001-10001msec 00:30:48.883 TRIM: bw=175MiB/s (184MB/s), 175MiB/s-175MiB/s (184MB/s-184MB/s), io=1754MiB (1839MB), run=10001-10001msec 00:30:48.883 00:30:48.883 real 0m13.424s 00:30:48.883 user 0m50.067s 00:30:48.883 sys 0m0.418s 00:30:48.883 16:09:53 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:48.883 16:09:53 blockdev_crypto_aesni.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:30:48.883 ************************************ 00:30:48.883 END TEST bdev_fio_trim 00:30:48.883 ************************************ 00:30:48.883 16:09:53 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:30:48.883 16:09:53 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:30:48.883 16:09:53 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:30:48.883 
/var/jenkins/workspace/crypto-phy-autotest/spdk 00:30:48.884 16:09:53 blockdev_crypto_aesni.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:30:48.884 00:30:48.884 real 0m27.139s 00:30:48.884 user 1m40.504s 00:30:48.884 sys 0m0.959s 00:30:48.884 16:09:53 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:48.884 16:09:53 blockdev_crypto_aesni.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:30:48.884 ************************************ 00:30:48.884 END TEST bdev_fio 00:30:48.884 ************************************ 00:30:48.884 16:09:53 blockdev_crypto_aesni -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:30:48.884 16:09:53 blockdev_crypto_aesni -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:30:48.884 16:09:53 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:30:48.884 16:09:53 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:48.884 16:09:53 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:48.884 ************************************ 00:30:48.884 START TEST bdev_verify 00:30:48.884 ************************************ 00:30:48.884 16:09:53 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:30:48.884 [2024-06-10 16:09:54.007153] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:30:48.884 [2024-06-10 16:09:54.007188] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2861359 ] 00:30:48.884 [2024-06-10 16:09:54.093160] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:48.884 [2024-06-10 16:09:54.190249] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:30:48.884 [2024-06-10 16:09:54.190254] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:48.884 [2024-06-10 16:09:54.211625] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:30:48.884 [2024-06-10 16:09:54.219659] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:48.884 [2024-06-10 16:09:54.227670] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:48.884 [2024-06-10 16:09:54.331002] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:30:51.420 [2024-06-10 16:09:56.515449] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:30:51.420 [2024-06-10 16:09:56.515528] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:30:51.420 [2024-06-10 16:09:56.515540] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:51.420 [2024-06-10 16:09:56.523478] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:30:51.420 [2024-06-10 16:09:56.523499] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:51.420 [2024-06-10 16:09:56.523508] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:51.421 [2024-06-10 
16:09:56.531488] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:30:51.421 [2024-06-10 16:09:56.531504] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:51.421 [2024-06-10 16:09:56.531513] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:51.421 [2024-06-10 16:09:56.539512] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:30:51.421 [2024-06-10 16:09:56.539527] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:51.421 [2024-06-10 16:09:56.539536] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:51.421 Running I/O for 5 seconds... 00:30:56.725 00:30:56.725 Latency(us) 00:30:56.725 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:30:56.725 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:30:56.725 Verification LBA range: start 0x0 length 0x1000 00:30:56.725 crypto_ram : 5.10 426.52 1.67 0.00 0.00 299180.27 5835.82 197731.47 00:30:56.725 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:30:56.725 Verification LBA range: start 0x1000 length 0x1000 00:30:56.725 crypto_ram : 5.10 426.43 1.67 0.00 0.00 299239.81 5679.79 197731.47 00:30:56.725 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:30:56.725 Verification LBA range: start 0x0 length 0x1000 00:30:56.725 crypto_ram2 : 5.10 426.43 1.67 0.00 0.00 297824.11 5960.66 176759.95 00:30:56.725 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:30:56.725 Verification LBA range: start 0x1000 length 0x1000 00:30:56.725 crypto_ram2 : 5.11 426.09 1.66 0.00 0.00 297924.64 5773.41 176759.95 00:30:56.725 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:30:56.725 Verification 
LBA range: start 0x0 length 0x1000 00:30:56.725 crypto_ram3 : 5.08 3276.28 12.80 0.00 0.00 38492.67 11172.33 34702.87 00:30:56.725 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:30:56.725 Verification LBA range: start 0x1000 length 0x1000 00:30:56.725 crypto_ram3 : 5.08 3275.59 12.80 0.00 0.00 38500.31 10985.08 34702.87 00:30:56.725 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:30:56.725 Verification LBA range: start 0x0 length 0x1000 00:30:56.725 crypto_ram4 : 5.09 3292.42 12.86 0.00 0.00 38226.57 3776.12 34453.21 00:30:56.725 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:30:56.725 Verification LBA range: start 0x1000 length 0x1000 00:30:56.725 crypto_ram4 : 5.09 3291.71 12.86 0.00 0.00 38236.85 3573.27 34702.87 00:30:56.725 =================================================================================================================== 00:30:56.725 Total : 14841.47 57.97 0.00 0.00 68355.87 3573.27 197731.47 00:30:56.725 00:30:56.725 real 0m8.157s 00:30:56.725 user 0m15.615s 00:30:56.725 sys 0m0.289s 00:30:56.725 16:10:02 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:30:56.725 16:10:02 blockdev_crypto_aesni.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:30:56.725 ************************************ 00:30:56.725 END TEST bdev_verify 00:30:56.725 ************************************ 00:30:56.725 16:10:02 blockdev_crypto_aesni -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:30:56.725 16:10:02 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:30:56.725 16:10:02 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable 00:30:56.725 16:10:02 
blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x 00:30:56.725 ************************************ 00:30:56.725 START TEST bdev_verify_big_io 00:30:56.725 ************************************ 00:30:56.725 16:10:02 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:30:56.984 [2024-06-10 16:10:02.243804] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:30:56.984 [2024-06-10 16:10:02.243856] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2862675 ] 00:30:56.984 [2024-06-10 16:10:02.343790] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:30:56.984 [2024-06-10 16:10:02.442026] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:30:56.984 [2024-06-10 16:10:02.442031] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:30:56.984 [2024-06-10 16:10:02.463401] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb 00:30:56.984 [2024-06-10 16:10:02.471435] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:30:56.984 [2024-06-10 16:10:02.479446] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:30:57.243 [2024-06-10 16:10:02.583220] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97 00:30:59.778 [2024-06-10 16:10:04.767866] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1" 00:30:59.778 [2024-06-10 16:10:04.767936] bdev.c:8114:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: Malloc0 00:30:59.778 [2024-06-10 16:10:04.767949] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:59.778 [2024-06-10 16:10:04.775884] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2" 00:30:59.778 [2024-06-10 16:10:04.775909] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:30:59.778 [2024-06-10 16:10:04.775919] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:59.778 [2024-06-10 16:10:04.783905] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3" 00:30:59.778 [2024-06-10 16:10:04.783921] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:30:59.778 [2024-06-10 16:10:04.783930] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:59.778 [2024-06-10 16:10:04.791932] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4" 00:30:59.778 [2024-06-10 16:10:04.791947] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:30:59.778 [2024-06-10 16:10:04.791960] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:30:59.778 Running I/O for 5 seconds... 00:31:02.314 [2024-06-10 16:10:07.772976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.314 [2024-06-10 16:10:07.774780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.314 [2024-06-10 16:10:07.775897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:02.577 [... further accel_dpdk_cryptodev_task_alloc_resources "Failed to get src_mbufs!" errors from the big-I/O run, identical except for timestamps, elided ...]
00:31:02.577 [2024-06-10 16:10:07.976563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.976632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.978361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.978961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.980557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.980607] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.982074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.983430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.985041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.985091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.986237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.986724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.988339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:02.577 [2024-06-10 16:10:07.988389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.989242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.990823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.992342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.992392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.992815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.993252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.995076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.995142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.995566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.996954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.998608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:07.998660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:02.577 [2024-06-10 16:10:07.999697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.577 [2024-06-10 16:10:08.000220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.001556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.001611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.002032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.004589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.004649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.006302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.006348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.007587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.007647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.008067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.008111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:02.578 [2024-06-10 16:10:08.009797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.009856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.011394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.011450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.012265] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.012321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.012737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.012783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.014924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.014991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.016318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.016367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.017175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:02.578 [2024-06-10 16:10:08.017234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.017665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.017708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.020523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.020583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.022072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.022121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.023063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.023120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.023784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.023830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.026860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.026926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:02.578 [2024-06-10 16:10:08.028672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.028723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.029638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.029699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.030994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.031041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.033649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.033707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.034775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.034825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.035844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.035901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.037742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:02.578 [2024-06-10 16:10:08.037785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.040475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.040534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.040953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.041009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.042137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.042196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.043518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.043566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.046795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.046862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.047287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.047332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:02.578 [2024-06-10 16:10:08.048428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.048486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.050058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.050106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.052691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.052749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.053175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.053231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.054251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.054309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.054729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.054778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.056847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:02.578 [2024-06-10 16:10:08.056913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.057338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.057383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.057404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.057914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.058472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.058529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.058952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.059009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.059030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.059520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.060785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.578 [2024-06-10 16:10:08.061230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:02.579 [2024-06-10 16:10:08.061282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.061698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.062184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.062363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.062794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.062843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.063274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.063753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.064965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.065024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.065066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.065122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.065522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:02.579 [2024-06-10 16:10:08.065704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.065752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.065794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.065836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.066189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.067348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.067408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.067475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.067532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.067895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.068067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.068115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.068158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:02.579 [2024-06-10 16:10:08.068199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.068661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.069759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.069812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.069856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.069896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.070357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.070518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.070567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.070609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.070651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.071033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.072154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:02.579 [2024-06-10 16:10:08.072209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.072252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.072294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.072671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.072826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.072873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.072918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.072968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.073312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.074544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.074597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.074641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:02.579 [2024-06-10 16:10:08.074683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:02.579 [2024-06-10 16:10:08.075120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
[... identical *ERROR* line repeated, timestamps 2024-06-10 16:10:08.075288 through 16:10:08.450690 ...]
00:31:03.104 [2024-06-10 16:10:08.451131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.451178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.451595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.452083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.452513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.452563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.452989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.454480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.454918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.454973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.455390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.455884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.456327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.104 [2024-06-10 16:10:08.456386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.456803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.458296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.458732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.458783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.459209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.459693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.460139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.460200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.460629] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.462244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.462680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.462744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.104 [2024-06-10 16:10:08.463170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.463764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.464206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.464270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.464691] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.466484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.466922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.466983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.467402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.468066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.468510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.468560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.468989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.104 [2024-06-10 16:10:08.470559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.471001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.471054] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.471472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.471498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.471970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.472134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.472561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.472611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.473044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.473068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.473599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.475162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.104 [2024-06-10 16:10:08.475222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.475638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.475683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.476083] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:03.104 [2024-06-10 16:10:08.476641] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:03.104 [2024-06-10 16:10:08.476714] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:03.104 [2024-06-10 16:10:08.477156] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:03.104 [2024-06-10 16:10:08.477219] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 00:31:03.104 [2024-06-10 16:10:08.479035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.479089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.479130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.479183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.479557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.104 [2024-06-10 16:10:08.479723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.479770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.479812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.479853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.481342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.481424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.481480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.481522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.481918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.482093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.482142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.482184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.482239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.104 [2024-06-10 16:10:08.483714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.483765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.483806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.483851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.484299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.484474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.104 [2024-06-10 16:10:08.484521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.484562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.484604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.485992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.486045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.486086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.486149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.105 [2024-06-10 16:10:08.486667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.486827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.486878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.486920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.486969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.488379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.488435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.488476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.488516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.488842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.489015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.489067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.489109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.105 [2024-06-10 16:10:08.489150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.491065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.491121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.491161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.491202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.491503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.491667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.491713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.491755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.491796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.493087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.493138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.493178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.105 [2024-06-10 16:10:08.493217] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.493630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.493798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.493850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.493892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.493933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.495243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.495296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.495337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.495377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.495657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.495814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.495860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.105 [2024-06-10 16:10:08.495901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.495942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.497247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.497676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.497722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.498625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.498923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.499090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.500662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.500710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.501973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.503368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.503796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.105 [2024-06-10 16:10:08.503841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.505079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.505362] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.505520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.506343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.506391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.507989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.509238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.509872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.509919] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.511219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.511567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.511738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.105 [2024-06-10 16:10:08.513346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.513398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.514791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.516084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.516514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.516561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.516986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.517304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.517460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.519068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.519119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.519789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.105 [2024-06-10 16:10:08.521056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.105 [2024-06-10 16:10:08.521486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [last message repeated continuously through 2024-06-10 16:10:08.836850] 
00:31:03.369 [2024-06-10 16:10:08.836896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.842052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.842111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.843513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.843561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.843939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.845781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.845838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.846733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.846782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.850694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.850753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.852262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.369 [2024-06-10 16:10:08.852307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.852656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.854538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.854616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.856442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.856489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.861432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.861493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.862330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.862376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.862724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.864187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.864244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.369 [2024-06-10 16:10:08.865423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.865469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.871105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.871168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.871592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.871649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.871936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.873379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.873436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.873993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.369 [2024-06-10 16:10:08.874046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.879122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.879188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.631 [2024-06-10 16:10:08.881012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.881069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.881420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.883099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.883161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.884802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.884852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.889388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.889447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.890754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.890801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.891135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.892613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.631 [2024-06-10 16:10:08.892670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.893566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.893614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.897062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.897147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.897587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.897637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.897925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.899825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.899893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.901149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.901196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.906225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.631 [2024-06-10 16:10:08.906284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.907263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.907312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.907634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.909093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.909148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.911049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.911096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.917055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.917122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.918915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.918976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.919263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.631 [2024-06-10 16:10:08.920252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.920310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.921424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.921470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.924248] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.924313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.925983] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.926029] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.926363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.927819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.927876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.929220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.929268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.631 [2024-06-10 16:10:08.932602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.932662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.933945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.933998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.934375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.936286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.936350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.936884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.936932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.941700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.941760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.943238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.943287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.631 [2024-06-10 16:10:08.943665] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.945141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.945201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.946434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.946482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.950252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.950312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.951500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.951546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.951836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.952843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.952901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.953329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.631 [2024-06-10 16:10:08.953382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.957063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.957130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.957553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.957604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.631 [2024-06-10 16:10:08.957988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.959336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.959392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.960718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.960765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.963841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.963900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.965064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.632 [2024-06-10 16:10:08.965110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.965435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.965975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.966038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.966925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.966980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.970170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.970234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.971634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.971679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.972003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.973229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.973286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.632 [2024-06-10 16:10:08.973705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.973760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.976828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.976894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.977323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.977389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.977755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.979448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.979503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.980478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.980526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.983138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.983198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.632 [2024-06-10 16:10:08.984672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.984716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.985077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.985611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.985672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.986202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.986253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.988916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.988989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.990885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.990932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.991372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.632 [2024-06-10 16:10:08.993086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.632 [2024-06-10 16:10:08.993140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:03.632 [... identical "Failed to get src_mbufs!" error repeated continuously through 2024-06-10 16:10:09.235126; duplicate log lines omitted ...]
00:31:03.896 [2024-06-10 16:10:09.239407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.240912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.240965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.242689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.243135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.243290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.244709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.244755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.245846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.249627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.251308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.251357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.252747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.896 [2024-06-10 16:10:09.253082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.253245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.254742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.254792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.256529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.261760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.263239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.263288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.264769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.265062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.265212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.266713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.266760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.896 [2024-06-10 16:10:09.268347] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.272469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.272535] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.272576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.273324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.273618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.273769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.273820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.273861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.275084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.279652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.281126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.282613] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.896 [2024-06-10 16:10:09.284352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.284700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.284852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.286543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.287161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.288658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.293961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.295475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.297177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.299055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.299343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.300585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.301503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.896 [2024-06-10 16:10:09.302912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.303615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.309282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.310753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.312249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.314009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.314354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.316343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.316816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.318456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.318937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.325384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.327255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.896 [2024-06-10 16:10:09.329050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.330915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.331309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.332517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.333769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.334628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.335869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.341819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.343315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.345070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.346240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.346538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.347072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.896 [2024-06-10 16:10:09.348914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.349342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.351165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.356900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.357833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.359422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.360153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.360442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.360975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.362854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.364509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.364953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.896 [2024-06-10 16:10:09.370912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.897 [2024-06-10 16:10:09.371350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.373097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.374867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.375267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.377148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.378960] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.380026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.381081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.385671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.387353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.389194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.390258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.390574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:03.897 [2024-06-10 16:10:09.391844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.392739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.393585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.394943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.399634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.400912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.401742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.401790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.402085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:03.897 [2024-06-10 16:10:09.403262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.404549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.406041] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.406090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.157 [2024-06-10 16:10:09.409943] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.410008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.411821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.411874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.412203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.413109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.413172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.414794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.414851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.417832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.417898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.419216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.419278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.157 [2024-06-10 16:10:09.419566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.420560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.420616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.421716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.421767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.426671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.426730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.427806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.427855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.428166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.429872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.429929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.431174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.157 [2024-06-10 16:10:09.431224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.439187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.439252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.440775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.440823] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.441251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.442968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.443030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.444917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.444980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.449997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.450059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.157 [2024-06-10 16:10:09.451727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.157 [2024-06-10 16:10:09.451776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.158 [2024-06-10 16:10:09.452076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.158 [2024-06-10 16:10:09.453286] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.158 [2024-06-10 16:10:09.453341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.158 [2024-06-10 16:10:09.454611] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.158 [2024-06-10 16:10:09.454662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.158 [2024-06-10 16:10:09.458069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.158 [2024-06-10 16:10:09.458129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.158 [2024-06-10 16:10:09.459053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.158 [2024-06-10 16:10:09.459102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.158 [2024-06-10 16:10:09.459392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.158 [2024-06-10 16:10:09.460819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.158 [2024-06-10 16:10:09.460876] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.158 [2024-06-10 16:10:09.461634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:04.159 [... previous message repeated through 2024-06-10 16:10:09.653014 ...]
00:31:04.159 [2024-06-10 16:10:09.654511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.159 [2024-06-10 16:10:09.654559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.159 [2024-06-10 16:10:09.655227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.159 [2024-06-10 16:10:09.655515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.159 [2024-06-10 16:10:09.655668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.159 [2024-06-10 16:10:09.656103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.159 [2024-06-10 16:10:09.656161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.160 [2024-06-10 16:10:09.656576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.160 [2024-06-10 16:10:09.657906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.160 [2024-06-10 16:10:09.658348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.160 [2024-06-10 16:10:09.658395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.160 [2024-06-10 16:10:09.658813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.160 [2024-06-10 16:10:09.659108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.160 [2024-06-10 16:10:09.659269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.160 [2024-06-10 16:10:09.661076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.160 [2024-06-10 16:10:09.661126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.160 [2024-06-10 16:10:09.661787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.160 [2024-06-10 16:10:09.663174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.160 [2024-06-10 16:10:09.663604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.160 [2024-06-10 16:10:09.663649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.160 [2024-06-10 16:10:09.665558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.665851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.666020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.667731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.667783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.669557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.421 [2024-06-10 16:10:09.670882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.672459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.672509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.673770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.674259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.674420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.674982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.675034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.676485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.677775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.678901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.678951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.680452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.421 [2024-06-10 16:10:09.680783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.680938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.682426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.682475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.683460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.687021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.688634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.688684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.689843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.690175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.690331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.691822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.691872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.421 [2024-06-10 16:10:09.693364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.698639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.700343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.700396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.702066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.702355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.702508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.704372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.704428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.705824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.707082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.708509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.708561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.421 [2024-06-10 16:10:09.710218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.710645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.710800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.712163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.712209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.713103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.714349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.716234] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.716290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.717701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.718045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.718199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.719685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.421 [2024-06-10 16:10:09.719738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.721236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.723254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.724804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.724853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.726412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.726701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.726855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.728646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.728701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.730344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.731672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.732901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.421 [2024-06-10 16:10:09.732950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.733371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.733753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.733907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.735363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.735412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.736879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.738145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.739598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.739646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.741117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.741474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.741628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.421 [2024-06-10 16:10:09.742785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.421 [2024-06-10 16:10:09.742834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.743258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.744534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.746464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.746524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.748428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.748742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.748893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.750337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.750386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.751858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.753258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.422 [2024-06-10 16:10:09.753690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.753739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.755314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.755602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.755757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.757617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.757664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.759493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.760733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.762233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.762283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.763628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.764120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.422 [2024-06-10 16:10:09.764274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.764699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.764746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.766255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.767550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.768553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.768603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.770081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.770392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.770553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.772064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.772114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.773368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.422 [2024-06-10 16:10:09.775133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.776799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.776848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.778510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.778841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.779013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.779995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.780044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.781492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.782724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.783166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.783213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.783645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.422 [2024-06-10 16:10:09.783934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.784091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.785813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.785870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.787622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.788987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.789045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.789086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.790579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.790868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.791028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.791080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.422 [2024-06-10 16:10:09.791121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.422 [2024-06-10 16:10:09.792310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.687 message repeated ~270 times between 16:10:09.792 and 16:10:10.017: [accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!] 
00:31:04.687 [2024-06-10 16:10:10.019072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.019117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.019480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.021413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.021484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.023355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.023403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.026166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.027826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.028689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.028739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.029072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.030592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.687 [2024-06-10 16:10:10.031030] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.031464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.031514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.034104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.034164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.034207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.034259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.034553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.035231] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.035288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.035334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.035380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.036784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.687 [2024-06-10 16:10:10.036838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.036882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.036925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.037298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.037483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.037543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.037596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.037648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.039011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.039064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.039108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.039152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.039654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.687 [2024-06-10 16:10:10.039825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.039871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.039926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.039976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.041352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.041421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.041462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.041502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.041788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.041968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.042016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.042057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.042099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.687 [2024-06-10 16:10:10.043678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.043740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.043793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.043841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.044139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.044304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.044364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.044410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.044456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.045732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.045785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.045826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.045871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.687 [2024-06-10 16:10:10.046225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.046391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.046439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.046481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.046524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.047849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.047906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.047947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.047995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.048285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.048446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.048492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.048534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.687 [2024-06-10 16:10:10.048575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.687 [2024-06-10 16:10:10.049993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.050044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.050085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.050132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.050419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.050578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.050625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.050666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.050708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.052380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.052432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.052472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.688 [2024-06-10 16:10:10.052897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.053275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.053434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.053480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.053534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.053965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.055402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.057158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.057214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.059011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.059388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.059550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.061021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.688 [2024-06-10 16:10:10.061071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.062807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.064233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.064669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.064719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.065154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.065577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.065736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.066491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.066542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.068043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.069379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.070456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.688 [2024-06-10 16:10:10.070524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.070940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.071356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.071518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.073006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.073055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.074658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.076038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.077413] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.077461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.079013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.079307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.079468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.688 [2024-06-10 16:10:10.080509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.080559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.080981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.082311] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.084066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.084122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.085931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.086276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.086437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.087892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.087941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.089416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.090787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.688 [2024-06-10 16:10:10.091226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.091277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.092916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.093216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.093381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.095267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.095317] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.096909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.098165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.099920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.099979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.101049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.101542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.688 [2024-06-10 16:10:10.101709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.102256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.102306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.103766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.104996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.106071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.106120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.107572] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.107910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.108079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.109839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.109887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.688 [2024-06-10 16:10:10.110690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.688 [2024-06-10 16:10:10.112316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:31:04.952 [... previous message repeated ~270 times between 16:10:10.112316 and 16:10:10.340769 ...]
00:31:04.952 [2024-06-10 16:10:10.340832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.341260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.341795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.341849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.342279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.342350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.344318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.344381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.344804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.344853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.345291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.345830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.345884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.952 [2024-06-10 16:10:10.346315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.346367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.348409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.348487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.348911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.348968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.349409] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.349946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.350009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.350430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.350481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.353005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.353082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.952 [2024-06-10 16:10:10.353498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.353543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.353869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.355657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.355722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.357461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.357507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.359151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.359208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.359627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.359671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.360039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.361505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.952 [2024-06-10 16:10:10.361562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.952 [2024-06-10 16:10:10.362491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.362549] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.364202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.364261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.364680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.364738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.365034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.366870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.366936] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.367582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.367636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.369370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.953 [2024-06-10 16:10:10.369428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.370468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.370519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.370889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.372257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.372312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.373767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.373816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.375639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.375697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.376962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.377013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.377433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.953 [2024-06-10 16:10:10.378805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.378862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.379467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.379517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.382467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.382526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.383330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.383379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.383731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.385653] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.385711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.386132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.386176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.953 [2024-06-10 16:10:10.388998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.389057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.390674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.390721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.391101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.392209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.392268] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.392684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.392727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.394657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.394716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.396211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.396262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:31:04.953 [2024-06-10 16:10:10.396552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.397105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.397163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.397582] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.397632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.398976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.399050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:31:04.953 [2024-06-10 16:10:10.401731] accel_dpdk_cryptodev.c: 476:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get dst_mbufs! 
00:31:05.521
00:31:05.521 Latency(us)
00:31:05.521 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:31:05.521 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:05.521 Verification LBA range: start 0x0 length 0x100
00:31:05.521 crypto_ram : 6.01 42.25 2.64 0.00 0.00 2928125.87 71902.35 2908050.77
00:31:05.521 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:05.521 Verification LBA range: start 0x100 length 0x100
00:31:05.521 crypto_ram : 5.99 42.74 2.67 0.00 0.00 2898772.36 80390.83 2892072.47
00:31:05.521 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:05.521 Verification LBA range: start 0x0 length 0x100
00:31:05.521 crypto_ram2 : 6.01 42.56 2.66 0.00 0.00 2794318.51 70903.71 2908050.77
00:31:05.521 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:05.521 Verification LBA range: start 0x100 length 0x100
00:31:05.522 crypto_ram2 : 5.99 42.73 2.67 0.00 0.00 2778188.07 79891.50 2828159.27
00:31:05.522 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:05.522 Verification LBA range: start 0x0 length 0x100
00:31:05.522 crypto_ram3 : 5.69 243.80 15.24 0.00 0.00 461789.24 71902.35 663099.49
00:31:05.522 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:05.522 Verification LBA range: start 0x100 length 0x100
00:31:05.522 crypto_ram3 : 5.67 249.42 15.59 0.00 0.00 450614.03 15603.81 659104.91
00:31:05.522 Job: crypto_ram4 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:31:05.522 Verification LBA range: start 0x0 length 0x100
00:31:05.522 crypto_ram4 : 5.78 259.84 16.24 0.00 0.00 419290.87 17975.59 559240.53
00:31:05.522 Job: crypto_ram4 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:31:05.522 Verification LBA range: start 0x100 length 0x100
00:31:05.522 crypto_ram4 : 5.77 266.06 16.63 0.00 0.00 409551.68 14917.24 563235.11
00:31:05.522 ===================================================================================================================
00:31:05.522 Total : 1189.40 74.34 0.00 0.00 794243.11 14917.24 2908050.77
00:31:06.090
00:31:06.090 real 0m9.107s
00:31:06.090 user 0m17.420s
00:31:06.090 sys 0m0.340s
16:10:11 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # xtrace_disable
00:31:06.090 16:10:11 blockdev_crypto_aesni.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:31:06.090 ************************************
00:31:06.090 END TEST bdev_verify_big_io
00:31:06.090 ************************************
00:31:06.090 16:10:11 blockdev_crypto_aesni -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
16:10:11 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']'
16:10:11 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable
16:10:11 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:31:06.090 ************************************
00:31:06.090 START TEST bdev_write_zeroes
00:31:06.090 ************************************
16:10:11 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:06.090 [2024-06-10 16:10:11.418882] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization...
00:31:06.090 [2024-06-10 16:10:11.418933] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2864190 ]
00:31:06.090 [2024-06-10 16:10:11.517951] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:31:06.349 [2024-06-10 16:10:11.609365] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:31:06.349 [2024-06-10 16:10:11.630649] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_aesni_mb
00:31:06.349 [2024-06-10 16:10:11.638683] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev
00:31:06.349 [2024-06-10 16:10:11.646696] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev
00:31:06.349 [2024-06-10 16:10:11.751585] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 97
00:31:08.883 [2024-06-10 16:10:13.945305] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_1"
00:31:08.883 [2024-06-10 16:10:13.945377] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0
00:31:08.883 [2024-06-10 16:10:13.945390] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:08.883 [2024-06-10 16:10:13.953324] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_2"
00:31:08.883 [2024-06-10 16:10:13.953342] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1
00:31:08.883 [2024-06-10 16:10:13.953351] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:08.883 [2024-06-10 16:10:13.961344] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_3"
00:31:08.883 [2024-06-10 16:10:13.961361] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:31:08.883 [2024-06-10 16:10:13.961375] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:08.883 [2024-06-10 16:10:13.969365] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_aesni_cbc_4"
00:31:08.883 [2024-06-10 16:10:13.969382] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:31:08.883 [2024-06-10 16:10:13.969391] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:31:08.883 Running I/O for 1 seconds...
00:31:09.820
00:31:09.820 Latency(us)
00:31:09.820 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:31:09.820 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:09.820 crypto_ram : 1.03 1788.06 6.98 0.00 0.00 70935.16 5929.45 85384.05
00:31:09.820 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:09.820 crypto_ram2 : 1.03 1801.35 7.04 0.00 0.00 70078.53 5929.45 79392.18
00:31:09.820 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:09.820 crypto_ram3 : 1.02 13696.98 53.50 0.00 0.00 9187.39 2715.06 11921.31
00:31:09.820 Job: crypto_ram4 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:31:09.820 crypto_ram4 : 1.02 13734.65 53.65 0.00 0.00 9131.81 2715.06 9549.53
00:31:09.820 ===================================================================================================================
00:31:09.820 Total : 31021.04 121.18 0.00 0.00 16291.02 2715.06 85384.05
00:31:10.079
00:31:10.079 real 0m4.074s
00:31:10.079 user 0m3.735s
00:31:10.079 sys 0m0.295s
16:10:15 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # xtrace_disable
00:31:10.079 16:10:15 blockdev_crypto_aesni.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x
00:31:10.080 ************************************
00:31:10.080 END TEST bdev_write_zeroes
00:31:10.080 ************************************
00:31:10.080 16:10:15 blockdev_crypto_aesni -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
16:10:15 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']'
16:10:15 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable
16:10:15 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:31:10.080 ************************************
00:31:10.080 START TEST bdev_json_nonenclosed
00:31:10.080 ************************************
16:10:15 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:10.080 [2024-06-10 16:10:15.556518] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization...
00:31:10.080 [2024-06-10 16:10:15.556571] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2864795 ]
00:31:10.338 [2024-06-10 16:10:15.654918] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:31:10.338 [2024-06-10 16:10:15.747142] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:31:10.338 [2024-06-10 16:10:15.747209] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}.
00:31:10.338 [2024-06-10 16:10:15.747225] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:31:10.338 [2024-06-10 16:10:15.747234] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:31:10.338
00:31:10.338 real 0m0.341s
00:31:10.338 user 0m0.239s
00:31:10.338 sys 0m0.100s
16:10:15 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # xtrace_disable
16:10:15 blockdev_crypto_aesni.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x
00:31:10.338 ************************************
00:31:10.338 END TEST bdev_json_nonenclosed
00:31:10.338 ************************************
00:31:10.597 16:10:15 blockdev_crypto_aesni -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
16:10:15 blockdev_crypto_aesni -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']'
16:10:15 blockdev_crypto_aesni -- common/autotest_common.sh@1106 -- # xtrace_disable
16:10:15 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:31:10.597 ************************************
00:31:10.597 START TEST bdev_json_nonarray
00:31:10.597 ************************************
00:31:10.597 16:10:15 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:31:10.597 [2024-06-10 16:10:15.955945] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization...
00:31:10.597 [2024-06-10 16:10:15.956000] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2864970 ]
00:31:10.597 [2024-06-10 16:10:16.053750] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1
00:31:10.856 [2024-06-10 16:10:16.144449] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0
00:31:10.856 [2024-06-10 16:10:16.144523] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array.
00:31:10.856 [2024-06-10 16:10:16.144540] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address:
00:31:10.856 [2024-06-10 16:10:16.144549] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero
00:31:10.856
00:31:10.856 real 0m0.336s
00:31:10.856 user 0m0.213s
00:31:10.856 sys 0m0.121s
16:10:16 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # xtrace_disable
16:10:16 blockdev_crypto_aesni.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x
00:31:10.856 ************************************
00:31:10.856 END TEST bdev_json_nonarray
00:31:10.856 ************************************
00:31:10.856 16:10:16 blockdev_crypto_aesni -- bdev/blockdev.sh@787 -- # [[ crypto_aesni == bdev ]]
00:31:10.856 16:10:16 blockdev_crypto_aesni -- bdev/blockdev.sh@794 -- # [[ crypto_aesni == gpt ]]
00:31:10.856 16:10:16 blockdev_crypto_aesni -- bdev/blockdev.sh@798 -- # [[ crypto_aesni == crypto_sw ]]
00:31:10.856 16:10:16 blockdev_crypto_aesni -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT
00:31:10.856 16:10:16 blockdev_crypto_aesni -- bdev/blockdev.sh@811 -- # cleanup
00:31:10.856 16:10:16 blockdev_crypto_aesni -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile
00:31:10.856 16:10:16 blockdev_crypto_aesni -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json
00:31:10.856 16:10:16 blockdev_crypto_aesni -- bdev/blockdev.sh@26 -- # [[ crypto_aesni == rbd ]]
00:31:10.856 16:10:16 blockdev_crypto_aesni -- bdev/blockdev.sh@30 -- # [[ crypto_aesni == daos ]]
00:31:10.856 16:10:16 blockdev_crypto_aesni -- bdev/blockdev.sh@34 -- # [[ crypto_aesni = \g\p\t ]]
00:31:10.856 16:10:16 blockdev_crypto_aesni -- bdev/blockdev.sh@40 -- # [[ crypto_aesni == xnvme ]]
00:31:10.856
00:31:10.856 real 1m10.949s
00:31:10.856 user 2m49.769s
00:31:10.856 sys 0m7.017s
16:10:16 blockdev_crypto_aesni -- common/autotest_common.sh@1125 -- # xtrace_disable
16:10:16 blockdev_crypto_aesni -- common/autotest_common.sh@10 -- # set +x
00:31:10.856 ************************************
00:31:10.856 END TEST blockdev_crypto_aesni
00:31:10.856 ************************************
00:31:10.856 16:10:16 -- spdk/autotest.sh@358 -- # run_test blockdev_crypto_sw /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw
00:31:10.856 16:10:16 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']'
00:31:10.856 16:10:16 -- common/autotest_common.sh@1106 -- # xtrace_disable
00:31:10.856 16:10:16 -- common/autotest_common.sh@10 -- # set +x
00:31:10.856 ************************************
00:31:10.856 START TEST blockdev_crypto_sw
00:31:10.856 ************************************
00:31:10.856 16:10:16 blockdev_crypto_sw -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_sw
00:31:11.116 * Looking for test storage...
00:31:11.116 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/nbd_common.sh@6 -- # set -e 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@20 -- # : 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # uname -s 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@682 -- # test_type=crypto_sw 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@683 -- # crypto_device= 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@684 -- # dek= 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@685 -- # env_ctx= 00:31:11.116 
16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == bdev ]] 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@690 -- # [[ crypto_sw == crypto_* ]] 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2865041 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:31:11.116 16:10:16 blockdev_crypto_sw -- bdev/blockdev.sh@49 -- # waitforlisten 2865041 00:31:11.116 16:10:16 blockdev_crypto_sw -- common/autotest_common.sh@830 -- # '[' -z 2865041 ']' 00:31:11.116 16:10:16 blockdev_crypto_sw -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:11.116 16:10:16 blockdev_crypto_sw -- common/autotest_common.sh@835 -- # local max_retries=100 00:31:11.116 16:10:16 blockdev_crypto_sw -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:11.116 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:11.116 16:10:16 blockdev_crypto_sw -- common/autotest_common.sh@839 -- # xtrace_disable 00:31:11.116 16:10:16 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:11.116 [2024-06-10 16:10:16.516745] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:31:11.116 [2024-06-10 16:10:16.516804] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2865041 ] 00:31:11.116 [2024-06-10 16:10:16.617534] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:11.375 [2024-06-10 16:10:16.714630] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:12.312 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:31:12.312 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@863 -- # return 0 00:31:12.312 16:10:17 blockdev_crypto_sw -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:31:12.312 16:10:17 blockdev_crypto_sw -- bdev/blockdev.sh@711 -- # setup_crypto_sw_conf 00:31:12.312 16:10:17 blockdev_crypto_sw -- bdev/blockdev.sh@193 -- # rpc_cmd 00:31:12.312 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:12.312 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:12.312 Malloc0 00:31:12.312 Malloc1 00:31:12.312 true 00:31:12.312 true 00:31:12.312 true 00:31:12.312 [2024-06-10 16:10:17.720679] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:31:12.312 crypto_ram 00:31:12.312 [2024-06-10 16:10:17.728706] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:31:12.312 crypto_ram2 00:31:12.312 [2024-06-10 16:10:17.736744] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:31:12.312 crypto_ram3 00:31:12.312 [ 00:31:12.312 { 00:31:12.312 "name": "Malloc1", 00:31:12.312 "aliases": [ 00:31:12.312 "24ade041-0b38-4c0a-9175-3e90a92a493d" 00:31:12.312 ], 00:31:12.312 "product_name": "Malloc disk", 00:31:12.312 "block_size": 4096, 00:31:12.312 "num_blocks": 4096, 00:31:12.312 "uuid": "24ade041-0b38-4c0a-9175-3e90a92a493d", 00:31:12.312 
"assigned_rate_limits": { 00:31:12.312 "rw_ios_per_sec": 0, 00:31:12.312 "rw_mbytes_per_sec": 0, 00:31:12.312 "r_mbytes_per_sec": 0, 00:31:12.312 "w_mbytes_per_sec": 0 00:31:12.312 }, 00:31:12.312 "claimed": true, 00:31:12.312 "claim_type": "exclusive_write", 00:31:12.312 "zoned": false, 00:31:12.312 "supported_io_types": { 00:31:12.312 "read": true, 00:31:12.312 "write": true, 00:31:12.312 "unmap": true, 00:31:12.312 "write_zeroes": true, 00:31:12.312 "flush": true, 00:31:12.312 "reset": true, 00:31:12.312 "compare": false, 00:31:12.312 "compare_and_write": false, 00:31:12.312 "abort": true, 00:31:12.312 "nvme_admin": false, 00:31:12.312 "nvme_io": false 00:31:12.312 }, 00:31:12.312 "memory_domains": [ 00:31:12.312 { 00:31:12.312 "dma_device_id": "system", 00:31:12.312 "dma_device_type": 1 00:31:12.312 }, 00:31:12.312 { 00:31:12.312 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:12.312 "dma_device_type": 2 00:31:12.312 } 00:31:12.312 ], 00:31:12.312 "driver_specific": {} 00:31:12.312 } 00:31:12.312 ] 00:31:12.312 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:12.312 16:10:17 blockdev_crypto_sw -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:31:12.312 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:12.312 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:12.312 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:12.312 16:10:17 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # cat 00:31:12.312 16:10:17 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:31:12.312 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:12.312 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:12.312 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:12.312 16:10:17 blockdev_crypto_sw -- bdev/blockdev.sh@740 
-- # rpc_cmd save_subsystem_config -n bdev 00:31:12.312 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:12.312 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:12.312 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:12.312 16:10:17 blockdev_crypto_sw -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:31:12.312 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:12.312 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:12.312 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:12.571 16:10:17 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:31:12.571 16:10:17 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:31:12.571 16:10:17 blockdev_crypto_sw -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:31:12.571 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:12.571 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:12.571 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:12.571 16:10:17 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:31:12.571 16:10:17 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # jq -r .name 00:31:12.571 16:10:17 blockdev_crypto_sw -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "55f85497-3aef-549e-8b5c-8a4955286de6"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "55f85497-3aef-549e-8b5c-8a4955286de6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' 
"write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "15b1d820-c7a0-5527-b746-97cdcd8ba3c2"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "15b1d820-c7a0-5527-b746-97cdcd8ba3c2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:31:12.571 16:10:17 blockdev_crypto_sw -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:31:12.571 16:10:17 blockdev_crypto_sw -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:31:12.571 16:10:17 blockdev_crypto_sw -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM 
EXIT 00:31:12.571 16:10:17 blockdev_crypto_sw -- bdev/blockdev.sh@754 -- # killprocess 2865041 00:31:12.571 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@949 -- # '[' -z 2865041 ']' 00:31:12.571 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@953 -- # kill -0 2865041 00:31:12.571 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # uname 00:31:12.571 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:31:12.571 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2865041 00:31:12.571 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:31:12.571 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:31:12.571 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2865041' 00:31:12.571 killing process with pid 2865041 00:31:12.571 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@968 -- # kill 2865041 00:31:12.571 16:10:17 blockdev_crypto_sw -- common/autotest_common.sh@973 -- # wait 2865041 00:31:12.830 16:10:18 blockdev_crypto_sw -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:31:12.830 16:10:18 blockdev_crypto_sw -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:12.830 16:10:18 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:31:12.830 16:10:18 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:31:12.830 16:10:18 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:12.830 ************************************ 00:31:12.830 START TEST bdev_hello_world 00:31:12.830 ************************************ 00:31:13.088 16:10:18 blockdev_crypto_sw.bdev_hello_world -- 
common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:31:13.088 [2024-06-10 16:10:18.370483] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:31:13.088 [2024-06-10 16:10:18.370518] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2865341 ] 00:31:13.088 [2024-06-10 16:10:18.455751] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:13.088 [2024-06-10 16:10:18.546706] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:13.346 [2024-06-10 16:10:18.713438] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:31:13.346 [2024-06-10 16:10:18.713505] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:13.346 [2024-06-10 16:10:18.713518] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:13.346 [2024-06-10 16:10:18.721458] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:31:13.346 [2024-06-10 16:10:18.721475] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:13.346 [2024-06-10 16:10:18.721484] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:13.346 [2024-06-10 16:10:18.729479] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:31:13.346 [2024-06-10 16:10:18.729495] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:31:13.346 [2024-06-10 16:10:18.729504] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 
00:31:13.346 [2024-06-10 16:10:18.769725] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:31:13.346 [2024-06-10 16:10:18.769754] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:31:13.346 [2024-06-10 16:10:18.769770] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:31:13.346 [2024-06-10 16:10:18.771692] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:31:13.346 [2024-06-10 16:10:18.771763] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:31:13.347 [2024-06-10 16:10:18.771777] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:31:13.347 [2024-06-10 16:10:18.771809] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 00:31:13.347 00:31:13.347 [2024-06-10 16:10:18.771825] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:31:13.607 00:31:13.607 real 0m0.634s 00:31:13.607 user 0m0.452s 00:31:13.607 sys 0m0.169s 00:31:13.607 16:10:18 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:13.607 16:10:18 blockdev_crypto_sw.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:31:13.607 ************************************ 00:31:13.607 END TEST bdev_hello_world 00:31:13.607 ************************************ 00:31:13.607 16:10:19 blockdev_crypto_sw -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:31:13.607 16:10:19 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:31:13.607 16:10:19 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:31:13.607 16:10:19 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:13.607 ************************************ 00:31:13.607 START TEST bdev_bounds 00:31:13.607 ************************************ 00:31:13.607 16:10:19 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1124 -- # bdev_bounds '' 00:31:13.607 16:10:19 
blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2865531 00:31:13.607 16:10:19 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:31:13.607 16:10:19 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:31:13.607 16:10:19 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2865531' 00:31:13.607 Process bdevio pid: 2865531 00:31:13.608 16:10:19 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2865531 00:31:13.608 16:10:19 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@830 -- # '[' -z 2865531 ']' 00:31:13.608 16:10:19 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:13.608 16:10:19 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@835 -- # local max_retries=100 00:31:13.608 16:10:19 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:13.608 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:13.608 16:10:19 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@839 -- # xtrace_disable 00:31:13.608 16:10:19 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:31:13.608 [2024-06-10 16:10:19.097807] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:31:13.608 [2024-06-10 16:10:19.097863] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2865531 ] 00:31:13.933 [2024-06-10 16:10:19.196094] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:31:13.933 [2024-06-10 16:10:19.292922] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:31:13.933 [2024-06-10 16:10:19.293019] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:31:13.933 [2024-06-10 16:10:19.293025] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:14.191 [2024-06-10 16:10:19.455077] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:31:14.191 [2024-06-10 16:10:19.455140] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:14.191 [2024-06-10 16:10:19.455152] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:14.191 [2024-06-10 16:10:19.463099] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:31:14.191 [2024-06-10 16:10:19.463116] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:14.191 [2024-06-10 16:10:19.463124] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:14.191 [2024-06-10 16:10:19.471121] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:31:14.191 [2024-06-10 16:10:19.471136] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:31:14.191 [2024-06-10 16:10:19.471145] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:14.758 16:10:20 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@859 -- # (( i == 0 )) 
00:31:14.758 16:10:20 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@863 -- # return 0 00:31:14.758 16:10:20 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:31:14.758 I/O targets: 00:31:14.758 crypto_ram: 32768 blocks of 512 bytes (16 MiB) 00:31:14.758 crypto_ram3: 4096 blocks of 4096 bytes (16 MiB) 00:31:14.758 00:31:14.758 00:31:14.758 CUnit - A unit testing framework for C - Version 2.1-3 00:31:14.758 http://cunit.sourceforge.net/ 00:31:14.758 00:31:14.758 00:31:14.758 Suite: bdevio tests on: crypto_ram3 00:31:14.758 Test: blockdev write read block ...passed 00:31:14.758 Test: blockdev write zeroes read block ...passed 00:31:14.758 Test: blockdev write zeroes read no split ...passed 00:31:14.758 Test: blockdev write zeroes read split ...passed 00:31:14.758 Test: blockdev write zeroes read split partial ...passed 00:31:14.758 Test: blockdev reset ...passed 00:31:14.758 Test: blockdev write read 8 blocks ...passed 00:31:14.758 Test: blockdev write read size > 128k ...passed 00:31:14.758 Test: blockdev write read invalid size ...passed 00:31:14.758 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:14.758 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:14.758 Test: blockdev write read max offset ...passed 00:31:14.758 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:14.758 Test: blockdev writev readv 8 blocks ...passed 00:31:14.758 Test: blockdev writev readv 30 x 1block ...passed 00:31:14.758 Test: blockdev writev readv block ...passed 00:31:14.758 Test: blockdev writev readv size > 128k ...passed 00:31:14.758 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:14.758 Test: blockdev comparev and writev ...passed 00:31:14.758 Test: blockdev nvme passthru rw ...passed 00:31:14.758 Test: blockdev nvme passthru vendor specific ...passed 00:31:14.758 
Test: blockdev nvme admin passthru ...passed 00:31:14.758 Test: blockdev copy ...passed 00:31:14.758 Suite: bdevio tests on: crypto_ram 00:31:14.758 Test: blockdev write read block ...passed 00:31:14.758 Test: blockdev write zeroes read block ...passed 00:31:14.758 Test: blockdev write zeroes read no split ...passed 00:31:14.758 Test: blockdev write zeroes read split ...passed 00:31:14.758 Test: blockdev write zeroes read split partial ...passed 00:31:14.758 Test: blockdev reset ...passed 00:31:14.758 Test: blockdev write read 8 blocks ...passed 00:31:14.758 Test: blockdev write read size > 128k ...passed 00:31:14.758 Test: blockdev write read invalid size ...passed 00:31:14.758 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:31:14.758 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:31:14.758 Test: blockdev write read max offset ...passed 00:31:14.758 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:31:14.759 Test: blockdev writev readv 8 blocks ...passed 00:31:14.759 Test: blockdev writev readv 30 x 1block ...passed 00:31:14.759 Test: blockdev writev readv block ...passed 00:31:14.759 Test: blockdev writev readv size > 128k ...passed 00:31:14.759 Test: blockdev writev readv size > 128k in two iovs ...passed 00:31:14.759 Test: blockdev comparev and writev ...passed 00:31:14.759 Test: blockdev nvme passthru rw ...passed 00:31:14.759 Test: blockdev nvme passthru vendor specific ...passed 00:31:14.759 Test: blockdev nvme admin passthru ...passed 00:31:14.759 Test: blockdev copy ...passed 00:31:14.759 00:31:14.759 Run Summary: Type Total Ran Passed Failed Inactive 00:31:14.759 suites 2 2 n/a 0 0 00:31:14.759 tests 46 46 46 0 0 00:31:14.759 asserts 260 260 260 0 n/a 00:31:14.759 00:31:14.759 Elapsed time = 0.084 seconds 00:31:14.759 0 00:31:14.759 16:10:20 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2865531 00:31:14.759 16:10:20 blockdev_crypto_sw.bdev_bounds -- 
common/autotest_common.sh@949 -- # '[' -z 2865531 ']' 00:31:14.759 16:10:20 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@953 -- # kill -0 2865531 00:31:14.759 16:10:20 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # uname 00:31:14.759 16:10:20 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:31:14.759 16:10:20 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2865531 00:31:14.759 16:10:20 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:31:14.759 16:10:20 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:31:14.759 16:10:20 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2865531' 00:31:14.759 killing process with pid 2865531 00:31:14.759 16:10:20 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@968 -- # kill 2865531 00:31:14.759 16:10:20 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@973 -- # wait 2865531 00:31:15.016 16:10:20 blockdev_crypto_sw.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:31:15.016 00:31:15.016 real 0m1.425s 00:31:15.016 user 0m3.860s 00:31:15.016 sys 0m0.329s 00:31:15.016 16:10:20 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:15.016 16:10:20 blockdev_crypto_sw.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:31:15.017 ************************************ 00:31:15.017 END TEST bdev_bounds 00:31:15.017 ************************************ 00:31:15.017 16:10:20 blockdev_crypto_sw -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:31:15.017 16:10:20 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:31:15.017 16:10:20 blockdev_crypto_sw -- 
common/autotest_common.sh@1106 -- # xtrace_disable 00:31:15.017 16:10:20 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:15.275 ************************************ 00:31:15.275 START TEST bdev_nbd 00:31:15.275 ************************************ 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1124 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram3' '' 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@303 -- # local conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram3') 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=2 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=2 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- 
bdev/blockdev.sh@314 -- # local nbd_list 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2865795 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2865795 /var/tmp/spdk-nbd.sock 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@830 -- # '[' -z 2865795 ']' 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@835 -- # local max_retries=100 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:31:15.275 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@839 -- # xtrace_disable 00:31:15.275 16:10:20 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:31:15.275 [2024-06-10 16:10:20.595152] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:31:15.275 [2024-06-10 16:10:20.595204] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:31:15.275 [2024-06-10 16:10:20.686843] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:15.275 [2024-06-10 16:10:20.780246] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:15.534 [2024-06-10 16:10:20.941066] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:31:15.534 [2024-06-10 16:10:20.941132] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:15.534 [2024-06-10 16:10:20.941144] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:15.534 [2024-06-10 16:10:20.949085] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:31:15.534 [2024-06-10 16:10:20.949102] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:15.534 [2024-06-10 16:10:20.949116] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:15.534 [2024-06-10 16:10:20.957107] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:31:15.534 [2024-06-10 16:10:20.957122] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:31:15.534 [2024-06-10 16:10:20.957131] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:16.100 16:10:21 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:31:16.100 16:10:21 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@863 -- # return 0 00:31:16.100 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram 
crypto_ram3' 00:31:16.100 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:16.100 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:31:16.100 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:31:16.100 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' 00:31:16.100 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:16.100 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:31:16.100 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:31:16.100 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@24 -- # local i 00:31:16.100 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:31:16.100 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:31:16.100 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:31:16.100 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # 
(( i = 1 )) 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:16.358 1+0 records in 00:31:16.358 1+0 records out 00:31:16.358 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000216785 s, 18.9 MB/s 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:31:16.358 16:10:21 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:31:16.617 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:31:16.617 16:10:22 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:31:16.617 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:31:16.617 16:10:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:31:16.617 16:10:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:31:16.617 16:10:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:31:16.617 16:10:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:31:16.617 16:10:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:31:16.617 16:10:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:31:16.617 16:10:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:31:16.617 16:10:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:31:16.617 16:10:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:16.617 1+0 records in 00:31:16.617 1+0 records out 00:31:16.617 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000271041 s, 15.1 MB/s 00:31:16.617 16:10:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:16.617 16:10:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:31:16.617 16:10:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:16.876 16:10:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:31:16.876 16:10:22 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:31:16.876 16:10:22 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:31:16.876 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 2 )) 00:31:16.876 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:16.876 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:31:16.876 { 00:31:16.876 "nbd_device": "/dev/nbd0", 00:31:16.876 "bdev_name": "crypto_ram" 00:31:16.876 }, 00:31:16.876 { 00:31:16.876 "nbd_device": "/dev/nbd1", 00:31:16.876 "bdev_name": "crypto_ram3" 00:31:16.876 } 00:31:16.876 ]' 00:31:16.876 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:31:16.876 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:31:16.876 { 00:31:16.876 "nbd_device": "/dev/nbd0", 00:31:16.876 "bdev_name": "crypto_ram" 00:31:16.876 }, 00:31:16.876 { 00:31:16.876 "nbd_device": "/dev/nbd1", 00:31:16.876 "bdev_name": "crypto_ram3" 00:31:16.876 } 00:31:16.876 ]' 00:31:16.877 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:31:17.139 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:31:17.139 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:17.139 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:31:17.139 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:17.139 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:17.139 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:17.139 16:10:22 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:17.397 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:17.397 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:17.397 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:17.397 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:17.397 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:17.397 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:17.397 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:17.397 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:17.397 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:17.397 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:31:17.655 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:31:17.655 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:31:17.655 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:31:17.655 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:17.655 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:17.655 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:31:17.655 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:17.655 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:17.655 
16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:17.655 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:17.655 16:10:22 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@91 -- # local bdev_list 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram3' '/dev/nbd0 /dev/nbd1' 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram3') 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:31:17.914 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:31:18.172 /dev/nbd0 00:31:18.172 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:31:18.172 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:31:18.172 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:31:18.172 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:31:18.172 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:31:18.172 16:10:23 
blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:31:18.172 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:31:18.172 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:31:18.172 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:31:18.172 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:31:18.172 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:18.172 1+0 records in 00:31:18.172 1+0 records out 00:31:18.172 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000267669 s, 15.3 MB/s 00:31:18.172 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:18.172 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:31:18.172 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:18.172 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:31:18.172 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:31:18.172 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:18.172 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:31:18.172 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd1 00:31:18.432 /dev/nbd1 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:31:18.432 16:10:23 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:31:18.432 1+0 records in 00:31:18.432 1+0 records out 00:31:18.432 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000272656 s, 15.0 MB/s 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd 
-- bdev/nbd_common.sh@14 -- # (( i < 2 )) 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:18.432 16:10:23 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:18.691 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:31:18.691 { 00:31:18.691 "nbd_device": "/dev/nbd0", 00:31:18.691 "bdev_name": "crypto_ram" 00:31:18.691 }, 00:31:18.691 { 00:31:18.691 "nbd_device": "/dev/nbd1", 00:31:18.691 "bdev_name": "crypto_ram3" 00:31:18.691 } 00:31:18.691 ]' 00:31:18.691 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:31:18.691 { 00:31:18.691 "nbd_device": "/dev/nbd0", 00:31:18.691 "bdev_name": "crypto_ram" 00:31:18.691 }, 00:31:18.691 { 00:31:18.691 "nbd_device": "/dev/nbd1", 00:31:18.691 "bdev_name": "crypto_ram3" 00:31:18.691 } 00:31:18.691 ]' 00:31:18.691 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:18.691 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:31:18.691 /dev/nbd1' 00:31:18.691 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:31:18.691 /dev/nbd1' 00:31:18.691 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:18.691 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=2 00:31:18.691 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 2 00:31:18.691 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=2 00:31:18.691 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 2 -ne 2 ']' 00:31:18.691 16:10:24 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' write 00:31:18.691 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:31:18.691 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:31:18.691 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:31:18.691 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:18.691 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:31:18.691 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:31:18.691 256+0 records in 00:31:18.691 256+0 records out 00:31:18.692 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0103249 s, 102 MB/s 00:31:18.692 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:18.692 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:31:18.951 256+0 records in 00:31:18.951 256+0 records out 00:31:18.951 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0217765 s, 48.2 MB/s 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:31:18.951 256+0 records in 00:31:18.951 256+0 records out 00:31:18.951 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0319685 s, 32.8 MB/s 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1' verify 
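The write phase above (`dd if=.../nbdrandtest of=/dev/nbdX`) is followed by a verify phase that byte-compares each device against the same random source with `cmp -b -n 1M`. A self-contained sketch of that round-trip, substituting plain temp files for `/dev/nbd0` and `/dev/nbd1` (an assumption so it runs without nbd devices; block sizes and counts mirror the log):

```shell
#!/usr/bin/env bash
# Sketch: write a 1 MiB random pattern to each "device", then verify
# each one byte-for-byte against the source, as nbd_dd_data_verify does.
set -euo pipefail
tmpdir=$(mktemp -d)
randfile="$tmpdir/nbdrandtest"
dd if=/dev/urandom of="$randfile" bs=4096 count=256 2>/dev/null

for dev in "$tmpdir/nbd0" "$tmpdir/nbd1"; do
    # write phase: copy the random pattern onto the stand-in device
    dd if="$randfile" of="$dev" bs=4096 count=256 oflag=direct 2>/dev/null \
        || dd if="$randfile" of="$dev" bs=4096 count=256 2>/dev/null
done
for dev in "$tmpdir/nbd0" "$tmpdir/nbd1"; do
    # verify phase: compare the first 1M; cmp exits non-zero on mismatch
    cmp -b -n 1M "$randfile" "$dev"
done
result=verified
echo "$result"
rm -rf "$tmpdir"
```

Because `cmp` exits non-zero on the first differing byte and the script runs under `set -e`, any corruption on the device surfaces immediately, which is the property the test relies on.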
00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:18.951 16:10:24 
blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:18.951 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:19.210 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:19.210 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:19.210 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:19.210 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:19.210 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:19.210 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:19.210 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:19.210 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:19.210 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:19.210 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:31:19.469 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:31:19.469 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:31:19.469 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:31:19.469 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:19.469 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:19.469 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:31:19.469 16:10:24 blockdev_crypto_sw.bdev_nbd 
-- bdev/nbd_common.sh@41 -- # break 00:31:19.469 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:19.469 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:31:19.469 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:19.469 16:10:24 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:31:19.728 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:31:19.728 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:31:19.728 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:31:19.728 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:31:19.728 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:31:19.728 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:31:19.728 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:31:19.728 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:31:19.728 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:31:19.728 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:31:19.728 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:31:19.728 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@109 -- # return 0 00:31:19.728 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1' 00:31:19.728 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:19.728 16:10:25 blockdev_crypto_sw.bdev_nbd -- 
bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1') 00:31:19.728 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:31:19.728 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:31:19.728 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:31:19.986 malloc_lvol_verify 00:31:19.987 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:31:20.244 00277093-9c11-4c12-ad2d-66b148d7557e 00:31:20.244 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:31:20.503 d2e17746-7f62-4144-949f-d6e79b5d95d4 00:31:20.503 16:10:25 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:31:20.762 /dev/nbd0 00:31:20.762 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:31:20.762 mke2fs 1.46.5 (30-Dec-2021) 00:31:20.762 Discarding device blocks: 0/4096 done 00:31:20.762 Creating filesystem with 4096 1k blocks and 1024 inodes 00:31:20.762 00:31:20.762 Allocating group tables: 0/1 done 00:31:20.762 Writing inode tables: 0/1 done 00:31:20.762 Creating journal (1024 blocks): done 00:31:20.762 Writing superblocks and filesystem accounting information: 0/1 done 00:31:20.762 00:31:20.762 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:31:20.762 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:31:20.762 
16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:31:20.762 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:31:20.762 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:31:20.762 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:31:20.762 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:31:20.762 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2865795 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@949 -- # '[' -z 2865795 ']' 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@953 -- # kill -0 2865795 00:31:21.021 16:10:26 
blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # uname 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2865795 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2865795' 00:31:21.021 killing process with pid 2865795 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@968 -- # kill 2865795 00:31:21.021 16:10:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@973 -- # wait 2865795 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:31:21.280 00:31:21.280 real 0m6.142s 00:31:21.280 user 0m9.380s 00:31:21.280 sys 0m1.900s 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:31:21.280 ************************************ 00:31:21.280 END TEST bdev_nbd 00:31:21.280 ************************************ 00:31:21.280 16:10:26 blockdev_crypto_sw -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:31:21.280 16:10:26 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = nvme ']' 00:31:21.280 16:10:26 blockdev_crypto_sw -- bdev/blockdev.sh@764 -- # '[' crypto_sw = gpt ']' 00:31:21.280 16:10:26 blockdev_crypto_sw -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:31:21.280 16:10:26 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:31:21.280 16:10:26 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 
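The `bdev_fio` prologue that follows builds a fio config by emitting one `[job_<bdev>]` section per bdev name with a matching `filename=` line, as seen in the `echo '[job_crypto_ram]'` / `echo filename=crypto_ram` records. A hedged sketch of that generation loop (the bdev names and temp path are illustrative; the real script also prepends a global section from `fio_config_gen`):

```shell
#!/usr/bin/env bash
# Sketch: append one fio job section per bdev, each pinned to its bdev
# via filename=, mirroring the bdev.fio assembly in the log.
set -euo pipefail
fio_cfg=$(mktemp)
bdevs_name=('crypto_ram' 'crypto_ram3')

for b in "${bdevs_name[@]}"; do
    echo "[job_${b}]"    >> "$fio_cfg"
    echo "filename=${b}" >> "$fio_cfg"
done
cat "$fio_cfg"
```

With the `spdk_bdev` ioengine, `filename=` names an SPDK bdev rather than a file path, so each job exercises exactly one of the crypto vbdevs created earlier in the run.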
00:31:21.280 16:10:26 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:21.280 ************************************ 00:31:21.280 START TEST bdev_fio 00:31:21.280 ************************************ 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1124 -- # fio_test_suite '' 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:31:21.280 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=verify 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type=AIO 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z verify ']' 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1312 -- # '[' verify == verify ']' 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1313 -- # cat 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1322 -- # '[' AIO == AIO ']' 00:31:21.280 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # /usr/src/fio/fio --version 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1323 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1324 -- # echo serialize_overlap=1 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:21.539 ************************************ 00:31:21.539 START TEST bdev_fio_rw_verify 00:31:21.539 ************************************ 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:21.539 16:10:26 
blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # local sanitizers 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # shift 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # local asan_lib= 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libasan 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 
-- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:31:21.539 16:10:26 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:21.798 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:21.798 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:21.798 fio-3.35 00:31:21.798 Starting 2 threads 00:31:34.007 00:31:34.007 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=2867178: Mon Jun 10 16:10:37 2024 00:31:34.007 read: IOPS=20.2k, BW=78.9MiB/s (82.7MB/s)(789MiB/10001msec) 00:31:34.007 slat (usec): min=15, max=416, avg=21.58, stdev= 3.72 00:31:34.007 clat (usec): min=8, max=612, avg=157.33, stdev=62.49 00:31:34.007 lat (usec): min=27, max=634, avg=178.91, stdev=63.99 00:31:34.007 clat percentiles (usec): 00:31:34.007 | 50.000th=[ 155], 99.000th=[ 297], 99.900th=[ 318], 99.990th=[ 367], 00:31:34.007 | 99.999th=[ 570] 00:31:34.007 write: IOPS=24.3k, BW=94.8MiB/s (99.4MB/s)(899MiB/9477msec); 0 
zone resets 00:31:34.007 slat (usec): min=15, max=294, avg=36.24, stdev= 4.38 00:31:34.007 clat (usec): min=28, max=863, avg=210.40, stdev=95.94 00:31:34.007 lat (usec): min=59, max=954, avg=246.64, stdev=97.61 00:31:34.007 clat percentiles (usec): 00:31:34.007 | 50.000th=[ 204], 99.000th=[ 416], 99.900th=[ 441], 99.990th=[ 644], 00:31:34.007 | 99.999th=[ 840] 00:31:34.007 bw ( KiB/s): min=87096, max=98128, per=94.82%, avg=92077.89, stdev=1470.23, samples=38 00:31:34.007 iops : min=21772, max=24532, avg=23019.47, stdev=367.73, samples=38 00:31:34.007 lat (usec) : 10=0.01%, 20=0.01%, 50=2.12%, 100=14.54%, 250=60.33% 00:31:34.007 lat (usec) : 500=22.99%, 750=0.02%, 1000=0.01% 00:31:34.007 cpu : usr=99.62%, sys=0.01%, ctx=26, majf=0, minf=467 00:31:34.007 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:31:34.007 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:34.007 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:34.007 issued rwts: total=202005,230072,0,0 short=0,0,0,0 dropped=0,0,0,0 00:31:34.007 latency : target=0, window=0, percentile=100.00%, depth=8 00:31:34.007 00:31:34.007 Run status group 0 (all jobs): 00:31:34.007 READ: bw=78.9MiB/s (82.7MB/s), 78.9MiB/s-78.9MiB/s (82.7MB/s-82.7MB/s), io=789MiB (827MB), run=10001-10001msec 00:31:34.007 WRITE: bw=94.8MiB/s (99.4MB/s), 94.8MiB/s-94.8MiB/s (99.4MB/s-99.4MB/s), io=899MiB (942MB), run=9477-9477msec 00:31:34.007 00:31:34.007 real 0m11.083s 00:31:34.007 user 0m28.441s 00:31:34.007 sys 0m0.332s 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:31:34.007 ************************************ 00:31:34.007 END TEST bdev_fio_rw_verify 00:31:34.007 ************************************ 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio -- 
bdev/blockdev.sh@350 -- # rm -f 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=trim 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type= 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z trim ']' 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1312 -- # '[' trim == verify ']' 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1327 -- # '[' trim == trim ']' 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1328 -- # echo rw=trimwrite 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio -- 
bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:31:34.007 16:10:37 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "55f85497-3aef-549e-8b5c-8a4955286de6"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "55f85497-3aef-549e-8b5c-8a4955286de6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "15b1d820-c7a0-5527-b746-97cdcd8ba3c2"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "15b1d820-c7a0-5527-b746-97cdcd8ba3c2",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": 
"SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:31:34.007 16:10:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:31:34.007 crypto_ram3 ]] 00:31:34.007 16:10:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:31:34.007 16:10:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "55f85497-3aef-549e-8b5c-8a4955286de6"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 32768,' ' "uuid": "55f85497-3aef-549e-8b5c-8a4955286de6",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_sw"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "15b1d820-c7a0-5527-b746-97cdcd8ba3c2"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 4096,' ' "uuid": "15b1d820-c7a0-5527-b746-97cdcd8ba3c2",' ' 
"assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "crypto_ram2",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_sw3"' ' }' ' }' '}' 00:31:34.007 16:10:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 
--spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:34.008 ************************************ 00:31:34.008 START TEST bdev_fio_trim 00:31:34.008 ************************************ 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local sanitizers 00:31:34.008 16:10:38 
blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # shift 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # local asan_lib= 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libasan 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:31:34.008 16:10:38 
blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:31:34.008 16:10:38 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:31:34.008 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:34.008 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:31:34.008 fio-3.35 00:31:34.008 Starting 2 threads 00:31:43.987 00:31:43.987 job_crypto_ram: (groupid=0, jobs=2): err= 0: pid=2869113: Mon Jun 10 16:10:48 2024 00:31:43.987 write: IOPS=41.8k, BW=163MiB/s (171MB/s)(1632MiB/10001msec); 0 zone resets 00:31:43.987 slat (usec): min=9, max=373, avg=20.72, stdev= 9.18 00:31:43.987 clat (usec): min=24, max=2069, avg=157.23, stdev=123.93 00:31:43.987 lat (usec): min=33, max=2091, avg=177.96, stdev=131.86 00:31:43.987 clat percentiles (usec): 00:31:43.987 | 50.000th=[ 97], 99.000th=[ 545], 99.900th=[ 644], 99.990th=[ 668], 00:31:43.987 | 99.999th=[ 750] 00:31:43.987 bw ( KiB/s): min=99976, max=193440, per=99.34%, avg=166001.68, stdev=20279.07, samples=38 00:31:43.987 iops : min=24994, max=48360, avg=41500.42, stdev=5069.77, samples=38 00:31:43.987 trim: IOPS=41.8k, BW=163MiB/s (171MB/s)(1632MiB/10001msec); 0 zone resets 00:31:43.987 slat (usec): min=3, max=327, avg= 9.76, stdev= 4.69 00:31:43.987 clat (usec): min=34, max=2091, avg=104.54, stdev=43.96 00:31:43.987 lat (usec): min=38, max=2101, avg=114.30, stdev=46.82 00:31:43.987 clat percentiles (usec): 00:31:43.987 | 
50.000th=[ 98], 99.000th=[ 249], 99.900th=[ 322], 99.990th=[ 416], 00:31:43.987 | 99.999th=[ 570] 00:31:43.987 bw ( KiB/s): min=99976, max=193440, per=99.34%, avg=166002.95, stdev=20279.22, samples=38 00:31:43.987 iops : min=24994, max=48360, avg=41500.74, stdev=5069.81, samples=38 00:31:43.987 lat (usec) : 50=10.93%, 100=41.30%, 250=32.53%, 500=14.41%, 750=0.83% 00:31:43.987 lat (msec) : 2=0.01%, 4=0.01% 00:31:43.987 cpu : usr=99.64%, sys=0.00%, ctx=36, majf=0, minf=298 00:31:43.987 IO depths : 1=8.5%, 2=18.9%, 4=58.1%, 8=14.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:31:43.987 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:43.987 complete : 0=0.0%, 4=87.3%, 8=12.7%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:31:43.987 issued rwts: total=0,417813,417813,0 short=0,0,0,0 dropped=0,0,0,0 00:31:43.987 latency : target=0, window=0, percentile=100.00%, depth=8 00:31:43.987 00:31:43.987 Run status group 0 (all jobs): 00:31:43.987 WRITE: bw=163MiB/s (171MB/s), 163MiB/s-163MiB/s (171MB/s-171MB/s), io=1632MiB (1711MB), run=10001-10001msec 00:31:43.987 TRIM: bw=163MiB/s (171MB/s), 163MiB/s-163MiB/s (171MB/s-171MB/s), io=1632MiB (1711MB), run=10001-10001msec 00:31:43.987 00:31:43.987 real 0m11.110s 00:31:43.987 user 0m27.893s 00:31:43.987 sys 0m0.300s 00:31:43.987 16:10:49 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:43.988 16:10:49 blockdev_crypto_sw.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:31:43.988 ************************************ 00:31:43.988 END TEST bdev_fio_trim 00:31:43.988 ************************************ 00:31:43.988 16:10:49 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:31:43.988 16:10:49 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:31:43.988 16:10:49 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:31:43.988 
/var/jenkins/workspace/crypto-phy-autotest/spdk 00:31:43.988 16:10:49 blockdev_crypto_sw.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:31:43.988 00:31:43.988 real 0m22.504s 00:31:43.988 user 0m56.513s 00:31:43.988 sys 0m0.778s 00:31:43.988 16:10:49 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:43.988 16:10:49 blockdev_crypto_sw.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:31:43.988 ************************************ 00:31:43.988 END TEST bdev_fio 00:31:43.988 ************************************ 00:31:43.988 16:10:49 blockdev_crypto_sw -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:31:43.988 16:10:49 blockdev_crypto_sw -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:31:43.988 16:10:49 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:31:43.988 16:10:49 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:31:43.988 16:10:49 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:43.988 ************************************ 00:31:43.988 START TEST bdev_verify 00:31:43.988 ************************************ 00:31:43.988 16:10:49 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:31:43.988 [2024-06-10 16:10:49.367421] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:31:43.988 [2024-06-10 16:10:49.367473] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2870723 ] 00:31:43.988 [2024-06-10 16:10:49.469141] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:44.247 [2024-06-10 16:10:49.562363] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:31:44.247 [2024-06-10 16:10:49.562369] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:44.247 [2024-06-10 16:10:49.727679] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:31:44.247 [2024-06-10 16:10:49.727744] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:44.247 [2024-06-10 16:10:49.727756] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:44.247 [2024-06-10 16:10:49.735700] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:31:44.247 [2024-06-10 16:10:49.735718] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:44.247 [2024-06-10 16:10:49.735727] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:44.247 [2024-06-10 16:10:49.743722] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:31:44.247 [2024-06-10 16:10:49.743740] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:31:44.247 [2024-06-10 16:10:49.743748] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:44.506 Running I/O for 5 seconds... 
00:31:49.819 00:31:49.819 Latency(us) 00:31:49.819 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:49.819 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:49.819 Verification LBA range: start 0x0 length 0x800 00:31:49.819 crypto_ram : 5.02 4819.56 18.83 0.00 0.00 26433.68 1685.21 34203.55 00:31:49.819 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:49.819 Verification LBA range: start 0x800 length 0x800 00:31:49.819 crypto_ram : 5.02 4820.84 18.83 0.00 0.00 26425.79 1981.68 34203.55 00:31:49.819 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:31:49.819 Verification LBA range: start 0x0 length 0x800 00:31:49.819 crypto_ram3 : 5.03 2418.68 9.45 0.00 0.00 52555.95 1872.46 39446.43 00:31:49.819 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:31:49.819 Verification LBA range: start 0x800 length 0x800 00:31:49.819 crypto_ram3 : 5.03 2419.30 9.45 0.00 0.00 52548.15 2168.93 39446.43 00:31:49.819 =================================================================================================================== 00:31:49.819 Total : 14478.39 56.56 0.00 0.00 35167.83 1685.21 39446.43 00:31:49.819 00:31:49.819 real 0m5.739s 00:31:49.819 user 0m10.874s 00:31:49.819 sys 0m0.193s 00:31:49.819 16:10:55 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:49.819 16:10:55 blockdev_crypto_sw.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:31:49.819 ************************************ 00:31:49.819 END TEST bdev_verify 00:31:49.819 ************************************ 00:31:49.819 16:10:55 blockdev_crypto_sw -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:31:49.819 16:10:55 
blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:31:49.819 16:10:55 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:31:49.819 16:10:55 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:49.819 ************************************ 00:31:49.819 START TEST bdev_verify_big_io 00:31:49.819 ************************************ 00:31:49.819 16:10:55 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:31:49.819 [2024-06-10 16:10:55.162794] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:31:49.819 [2024-06-10 16:10:55.162845] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2871648 ] 00:31:49.819 [2024-06-10 16:10:55.251458] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:31:50.077 [2024-06-10 16:10:55.343617] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:31:50.077 [2024-06-10 16:10:55.343623] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:50.077 [2024-06-10 16:10:55.501660] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:31:50.077 [2024-06-10 16:10:55.501724] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:50.077 [2024-06-10 16:10:55.501737] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:50.077 [2024-06-10 16:10:55.509683] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:31:50.077 [2024-06-10 16:10:55.509702] bdev.c:8114:bdev_open_ext: *NOTICE*: 
Currently unable to find bdev with name: Malloc1 00:31:50.077 [2024-06-10 16:10:55.509711] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:50.077 [2024-06-10 16:10:55.517707] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:31:50.077 [2024-06-10 16:10:55.517724] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:31:50.077 [2024-06-10 16:10:55.517733] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:50.077 Running I/O for 5 seconds... 00:31:56.643 00:31:56.643 Latency(us) 00:31:56.643 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:56.643 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:31:56.643 Verification LBA range: start 0x0 length 0x80 00:31:56.643 crypto_ram : 5.21 393.29 24.58 0.00 0.00 317058.51 6241.52 419430.40 00:31:56.643 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:31:56.643 Verification LBA range: start 0x80 length 0x80 00:31:56.643 crypto_ram : 5.20 393.73 24.61 0.00 0.00 316805.29 7021.71 417433.11 00:31:56.643 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536) 00:31:56.643 Verification LBA range: start 0x0 length 0x80 00:31:56.643 crypto_ram3 : 5.42 212.59 13.29 0.00 0.00 562764.49 6116.69 439403.28 00:31:56.643 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536) 00:31:56.643 Verification LBA range: start 0x80 length 0x80 00:31:56.643 crypto_ram3 : 5.41 212.84 13.30 0.00 0.00 562396.46 6740.85 453384.29 00:31:56.643 =================================================================================================================== 00:31:56.643 Total : 1212.45 75.78 0.00 0.00 405365.39 6116.69 453384.29 00:31:56.643 00:31:56.643 real 0m6.117s 00:31:56.643 user 0m11.655s 00:31:56.643 sys 0m0.189s 00:31:56.643 
16:11:01 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:56.643 16:11:01 blockdev_crypto_sw.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x 00:31:56.643 ************************************ 00:31:56.643 END TEST bdev_verify_big_io 00:31:56.643 ************************************ 00:31:56.643 16:11:01 blockdev_crypto_sw -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:56.643 16:11:01 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:31:56.643 16:11:01 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:31:56.643 16:11:01 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:56.643 ************************************ 00:31:56.643 START TEST bdev_write_zeroes 00:31:56.643 ************************************ 00:31:56.643 16:11:01 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:56.643 [2024-06-10 16:11:01.355599] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:31:56.643 [2024-06-10 16:11:01.355653] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2872609 ] 00:31:56.643 [2024-06-10 16:11:01.454690] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:56.643 [2024-06-10 16:11:01.545826] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:56.643 [2024-06-10 16:11:01.715422] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:31:56.643 [2024-06-10 16:11:01.715487] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:31:56.643 [2024-06-10 16:11:01.715499] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:56.643 [2024-06-10 16:11:01.723440] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw2" 00:31:56.643 [2024-06-10 16:11:01.723457] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:31:56.643 [2024-06-10 16:11:01.723465] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:56.643 [2024-06-10 16:11:01.731479] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw3" 00:31:56.643 [2024-06-10 16:11:01.731497] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: crypto_ram2 00:31:56.643 [2024-06-10 16:11:01.731506] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:31:56.643 Running I/O for 1 seconds... 
00:31:57.580 00:31:57.580 Latency(us) 00:31:57.580 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:31:57.580 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:31:57.580 crypto_ram : 1.01 24209.58 94.57 0.00 0.00 5274.42 2262.55 7302.58 00:31:57.580 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096) 00:31:57.580 crypto_ram3 : 1.01 12134.28 47.40 0.00 0.00 10468.80 3666.90 10860.25 00:31:57.580 =================================================================================================================== 00:31:57.580 Total : 36343.87 141.97 0.00 0.00 7011.91 2262.55 10860.25 00:31:57.580 00:31:57.580 real 0m1.701s 00:31:57.580 user 0m1.494s 00:31:57.580 sys 0m0.187s 00:31:57.580 16:11:02 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:57.580 16:11:02 blockdev_crypto_sw.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:31:57.580 ************************************ 00:31:57.580 END TEST bdev_write_zeroes 00:31:57.580 ************************************ 00:31:57.580 16:11:03 blockdev_crypto_sw -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:57.580 16:11:03 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:31:57.580 16:11:03 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:31:57.580 16:11:03 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:57.580 ************************************ 00:31:57.580 START TEST bdev_json_nonenclosed 00:31:57.580 ************************************ 00:31:57.580 16:11:03 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:57.840 [2024-06-10 16:11:03.100460] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:31:57.840 [2024-06-10 16:11:03.100497] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2873026 ] 00:31:57.840 [2024-06-10 16:11:03.187332] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:57.840 [2024-06-10 16:11:03.277920] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:57.840 [2024-06-10 16:11:03.277991] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 00:31:57.840 [2024-06-10 16:11:03.278008] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:31:57.840 [2024-06-10 16:11:03.278017] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:31:58.099 00:31:58.099 real 0m0.302s 00:31:58.099 user 0m0.192s 00:31:58.099 sys 0m0.108s 00:31:58.099 16:11:03 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:58.099 16:11:03 blockdev_crypto_sw.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:31:58.099 ************************************ 00:31:58.099 END TEST bdev_json_nonenclosed 00:31:58.099 ************************************ 00:31:58.100 16:11:03 blockdev_crypto_sw -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:58.100 16:11:03 blockdev_crypto_sw -- common/autotest_common.sh@1100 
-- # '[' 13 -le 1 ']' 00:31:58.100 16:11:03 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:31:58.100 16:11:03 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:58.100 ************************************ 00:31:58.100 START TEST bdev_json_nonarray 00:31:58.100 ************************************ 00:31:58.100 16:11:03 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:31:58.100 [2024-06-10 16:11:03.484441] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:31:58.100 [2024-06-10 16:11:03.484491] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2873055 ] 00:31:58.100 [2024-06-10 16:11:03.583163] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:58.359 [2024-06-10 16:11:03.673991] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:31:58.359 [2024-06-10 16:11:03.674063] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 
00:31:58.359 [2024-06-10 16:11:03.674080] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:31:58.359 [2024-06-10 16:11:03.674089] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:31:58.359 00:31:58.359 real 0m0.332s 00:31:58.359 user 0m0.221s 00:31:58.359 sys 0m0.109s 00:31:58.359 16:11:03 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # xtrace_disable 00:31:58.359 16:11:03 blockdev_crypto_sw.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:31:58.359 ************************************ 00:31:58.359 END TEST bdev_json_nonarray 00:31:58.359 ************************************ 00:31:58.359 16:11:03 blockdev_crypto_sw -- bdev/blockdev.sh@787 -- # [[ crypto_sw == bdev ]] 00:31:58.359 16:11:03 blockdev_crypto_sw -- bdev/blockdev.sh@794 -- # [[ crypto_sw == gpt ]] 00:31:58.359 16:11:03 blockdev_crypto_sw -- bdev/blockdev.sh@798 -- # [[ crypto_sw == crypto_sw ]] 00:31:58.359 16:11:03 blockdev_crypto_sw -- bdev/blockdev.sh@799 -- # run_test bdev_crypto_enomem bdev_crypto_enomem 00:31:58.359 16:11:03 blockdev_crypto_sw -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:31:58.359 16:11:03 blockdev_crypto_sw -- common/autotest_common.sh@1106 -- # xtrace_disable 00:31:58.359 16:11:03 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:31:58.359 ************************************ 00:31:58.359 START TEST bdev_crypto_enomem 00:31:58.359 ************************************ 00:31:58.359 16:11:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1124 -- # bdev_crypto_enomem 00:31:58.359 16:11:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@635 -- # local base_dev=base0 00:31:58.359 16:11:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@636 -- # local test_dev=crypt0 00:31:58.359 16:11:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@637 -- # local err_dev=EE_base0 00:31:58.359 16:11:03 
blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@638 -- # local qd=32 00:31:58.359 16:11:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@641 -- # ERR_PID=2873076 00:31:58.359 16:11:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@640 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -z -m 0x2 -q 32 -o 4096 -w randwrite -t 5 -f '' 00:31:58.359 16:11:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@642 -- # trap 'cleanup; killprocess $ERR_PID; exit 1' SIGINT SIGTERM EXIT 00:31:58.359 16:11:03 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@643 -- # waitforlisten 2873076 00:31:58.359 16:11:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@830 -- # '[' -z 2873076 ']' 00:31:58.359 16:11:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:31:58.359 16:11:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@835 -- # local max_retries=100 00:31:58.359 16:11:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:31:58.359 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:31:58.359 16:11:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@839 -- # xtrace_disable 00:31:58.359 16:11:03 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:31:58.618 [2024-06-10 16:11:03.877269] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:31:58.618 [2024-06-10 16:11:03.877321] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x2 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2873076 ] 00:31:58.618 [2024-06-10 16:11:03.966426] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:31:58.618 [2024-06-10 16:11:04.060221] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:31:58.876 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@863 -- # return 0 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@645 -- # rpc_cmd 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:31:58.877 true 00:31:58.877 base0 00:31:58.877 true 00:31:58.877 [2024-06-10 16:11:04.206009] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_sw" 00:31:58.877 crypt0 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@652 -- # waitforbdev crypt0 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@898 -- # local bdev_name=crypt0 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@899 -- # local bdev_timeout= 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@900 -- # local i 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@901 -- # [[ -z '' ]] 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@901 -- # 
bdev_timeout=2000 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@903 -- # rpc_cmd bdev_wait_for_examine 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@905 -- # rpc_cmd bdev_get_bdevs -b crypt0 -t 2000 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:31:58.877 [ 00:31:58.877 { 00:31:58.877 "name": "crypt0", 00:31:58.877 "aliases": [ 00:31:58.877 "3703ce3d-aa05-5f23-ac36-ca34ee2f2f0f" 00:31:58.877 ], 00:31:58.877 "product_name": "crypto", 00:31:58.877 "block_size": 512, 00:31:58.877 "num_blocks": 2097152, 00:31:58.877 "uuid": "3703ce3d-aa05-5f23-ac36-ca34ee2f2f0f", 00:31:58.877 "assigned_rate_limits": { 00:31:58.877 "rw_ios_per_sec": 0, 00:31:58.877 "rw_mbytes_per_sec": 0, 00:31:58.877 "r_mbytes_per_sec": 0, 00:31:58.877 "w_mbytes_per_sec": 0 00:31:58.877 }, 00:31:58.877 "claimed": false, 00:31:58.877 "zoned": false, 00:31:58.877 "supported_io_types": { 00:31:58.877 "read": true, 00:31:58.877 "write": true, 00:31:58.877 "unmap": false, 00:31:58.877 "write_zeroes": true, 00:31:58.877 "flush": false, 00:31:58.877 "reset": true, 00:31:58.877 "compare": false, 00:31:58.877 "compare_and_write": false, 00:31:58.877 "abort": false, 00:31:58.877 "nvme_admin": false, 00:31:58.877 "nvme_io": false 00:31:58.877 }, 00:31:58.877 "memory_domains": [ 00:31:58.877 { 00:31:58.877 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:31:58.877 "dma_device_type": 2 00:31:58.877 } 00:31:58.877 ], 00:31:58.877 
"driver_specific": { 00:31:58.877 "crypto": { 00:31:58.877 "base_bdev_name": "EE_base0", 00:31:58.877 "name": "crypt0", 00:31:58.877 "key_name": "test_dek_sw" 00:31:58.877 } 00:31:58.877 } 00:31:58.877 } 00:31:58.877 ] 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@906 -- # return 0 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@655 -- # rpcpid=2873162 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@657 -- # sleep 1 00:31:58.877 16:11:04 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@654 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:31:58.877 Running I/O for 5 seconds... 00:31:59.814 16:11:05 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@658 -- # rpc_cmd bdev_error_inject_error EE_base0 -n 5 -q 31 write nomem 00:31:59.814 16:11:05 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:31:59.814 16:11:05 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:31:59.814 16:11:05 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:31:59.814 16:11:05 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@660 -- # wait 2873162 00:32:04.006 00:32:04.006 Latency(us) 00:32:04.006 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:04.006 Job: crypt0 (Core Mask 0x2, workload: randwrite, depth: 32, IO size: 4096) 00:32:04.006 crypt0 : 5.00 33093.36 129.27 0.00 0.00 962.94 456.41 1256.11 00:32:04.006 =================================================================================================================== 00:32:04.006 Total : 33093.36 129.27 0.00 0.00 962.94 456.41 1256.11 00:32:04.006 0 00:32:04.006 16:11:09 
blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@662 -- # rpc_cmd bdev_crypto_delete crypt0 00:32:04.006 16:11:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@560 -- # xtrace_disable 00:32:04.006 16:11:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:04.006 16:11:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:32:04.006 16:11:09 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@664 -- # killprocess 2873076 00:32:04.006 16:11:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@949 -- # '[' -z 2873076 ']' 00:32:04.006 16:11:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@953 -- # kill -0 2873076 00:32:04.006 16:11:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # uname 00:32:04.007 16:11:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:32:04.007 16:11:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2873076 00:32:04.007 16:11:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:32:04.007 16:11:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:32:04.007 16:11:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2873076' 00:32:04.007 killing process with pid 2873076 00:32:04.007 16:11:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@968 -- # kill 2873076 00:32:04.007 Received shutdown signal, test time was about 5.000000 seconds 00:32:04.007 00:32:04.007 Latency(us) 00:32:04.007 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:32:04.007 =================================================================================================================== 
00:32:04.007 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:32:04.007 16:11:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@973 -- # wait 2873076 00:32:04.266 16:11:09 blockdev_crypto_sw.bdev_crypto_enomem -- bdev/blockdev.sh@665 -- # trap - SIGINT SIGTERM EXIT 00:32:04.266 00:32:04.266 real 0m5.798s 00:32:04.266 user 0m5.939s 00:32:04.266 sys 0m0.269s 00:32:04.266 16:11:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@1125 -- # xtrace_disable 00:32:04.266 16:11:09 blockdev_crypto_sw.bdev_crypto_enomem -- common/autotest_common.sh@10 -- # set +x 00:32:04.266 ************************************ 00:32:04.266 END TEST bdev_crypto_enomem 00:32:04.266 ************************************ 00:32:04.266 16:11:09 blockdev_crypto_sw -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:32:04.266 16:11:09 blockdev_crypto_sw -- bdev/blockdev.sh@811 -- # cleanup 00:32:04.266 16:11:09 blockdev_crypto_sw -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:32:04.266 16:11:09 blockdev_crypto_sw -- bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:04.266 16:11:09 blockdev_crypto_sw -- bdev/blockdev.sh@26 -- # [[ crypto_sw == rbd ]] 00:32:04.266 16:11:09 blockdev_crypto_sw -- bdev/blockdev.sh@30 -- # [[ crypto_sw == daos ]] 00:32:04.266 16:11:09 blockdev_crypto_sw -- bdev/blockdev.sh@34 -- # [[ crypto_sw = \g\p\t ]] 00:32:04.266 16:11:09 blockdev_crypto_sw -- bdev/blockdev.sh@40 -- # [[ crypto_sw == xnvme ]] 00:32:04.266 00:32:04.266 real 0m53.322s 00:32:04.266 user 1m42.997s 00:32:04.266 sys 0m5.184s 00:32:04.266 16:11:09 blockdev_crypto_sw -- common/autotest_common.sh@1125 -- # xtrace_disable 00:32:04.266 16:11:09 blockdev_crypto_sw -- common/autotest_common.sh@10 -- # set +x 00:32:04.266 ************************************ 00:32:04.266 END TEST blockdev_crypto_sw 00:32:04.266 ************************************ 00:32:04.266 
16:11:09 -- spdk/autotest.sh@359 -- # run_test blockdev_crypto_qat /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:32:04.266 16:11:09 -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:32:04.266 16:11:09 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:32:04.266 16:11:09 -- common/autotest_common.sh@10 -- # set +x 00:32:04.266 ************************************ 00:32:04.266 START TEST blockdev_crypto_qat 00:32:04.266 ************************************ 00:32:04.266 16:11:09 blockdev_crypto_qat -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/blockdev.sh crypto_qat 00:32:04.525 * Looking for test storage... 00:32:04.525 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@10 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbd_common.sh 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/nbd_common.sh@6 -- # set -e 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@12 -- # rpc_py=rpc_cmd 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@13 -- # conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@14 -- # nonenclosed_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@15 -- # nonarray_conf_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # export RPC_PIPE_TIMEOUT=30 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@17 -- # RPC_PIPE_TIMEOUT=30 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@20 -- # : 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@670 -- # QOS_DEV_1=Malloc_0 00:32:04.525 16:11:09 blockdev_crypto_qat -- 
bdev/blockdev.sh@671 -- # QOS_DEV_2=Null_1 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@672 -- # QOS_RUN_TIME=5 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # uname -s 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@674 -- # '[' Linux = Linux ']' 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@676 -- # PRE_RESERVED_MEM=0 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@682 -- # test_type=crypto_qat 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@683 -- # crypto_device= 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@684 -- # dek= 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@685 -- # env_ctx= 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@686 -- # wait_for_rpc= 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@687 -- # '[' -n '' ']' 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == bdev ]] 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@690 -- # [[ crypto_qat == crypto_* ]] 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@691 -- # wait_for_rpc=--wait-for-rpc 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@693 -- # start_spdk_tgt 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@47 -- # spdk_tgt_pid=2874063 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@48 -- # trap 'killprocess "$spdk_tgt_pid"; exit 1' SIGINT SIGTERM EXIT 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@46 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_tgt '' --wait-for-rpc 00:32:04.525 16:11:09 blockdev_crypto_qat -- bdev/blockdev.sh@49 -- # waitforlisten 2874063 00:32:04.525 16:11:09 blockdev_crypto_qat -- common/autotest_common.sh@830 -- # '[' -z 2874063 ']' 00:32:04.525 16:11:09 blockdev_crypto_qat -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:04.525 16:11:09 
blockdev_crypto_qat -- common/autotest_common.sh@835 -- # local max_retries=100 00:32:04.525 16:11:09 blockdev_crypto_qat -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:04.525 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:04.525 16:11:09 blockdev_crypto_qat -- common/autotest_common.sh@839 -- # xtrace_disable 00:32:04.525 16:11:09 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:04.525 [2024-06-10 16:11:09.911969] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:32:04.525 [2024-06-10 16:11:09.912030] [ DPDK EAL parameters: spdk_tgt --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2874063 ] 00:32:04.525 [2024-06-10 16:11:10.031357] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:04.785 [2024-06-10 16:11:10.140881] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:32:05.353 16:11:10 blockdev_crypto_qat -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:32:05.353 16:11:10 blockdev_crypto_qat -- common/autotest_common.sh@863 -- # return 0 00:32:05.353 16:11:10 blockdev_crypto_qat -- bdev/blockdev.sh@694 -- # case "$test_type" in 00:32:05.353 16:11:10 blockdev_crypto_qat -- bdev/blockdev.sh@708 -- # setup_crypto_qat_conf 00:32:05.353 16:11:10 blockdev_crypto_qat -- bdev/blockdev.sh@170 -- # rpc_cmd 00:32:05.353 16:11:10 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 00:32:05.353 16:11:10 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:05.353 [2024-06-10 16:11:10.790965] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:32:05.353 [2024-06-10 16:11:10.799009] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: 
Operation encrypt will be assigned to module dpdk_cryptodev 00:32:05.353 [2024-06-10 16:11:10.807018] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:05.612 [2024-06-10 16:11:10.878503] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:32:08.148 true 00:32:08.148 true 00:32:08.148 true 00:32:08.148 true 00:32:08.148 Malloc0 00:32:08.148 Malloc1 00:32:08.148 Malloc2 00:32:08.148 Malloc3 00:32:08.148 [2024-06-10 16:11:13.215928] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:32:08.148 crypto_ram 00:32:08.148 [2024-06-10 16:11:13.223949] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:32:08.148 crypto_ram1 00:32:08.148 [2024-06-10 16:11:13.231978] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:32:08.148 crypto_ram2 00:32:08.148 [2024-06-10 16:11:13.239999] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:32:08.148 crypto_ram3 00:32:08.148 [ 00:32:08.148 { 00:32:08.148 "name": "Malloc1", 00:32:08.148 "aliases": [ 00:32:08.148 "7ba11436-e97a-4ce9-a933-7e6c1770eea0" 00:32:08.148 ], 00:32:08.148 "product_name": "Malloc disk", 00:32:08.148 "block_size": 512, 00:32:08.148 "num_blocks": 65536, 00:32:08.148 "uuid": "7ba11436-e97a-4ce9-a933-7e6c1770eea0", 00:32:08.148 "assigned_rate_limits": { 00:32:08.148 "rw_ios_per_sec": 0, 00:32:08.148 "rw_mbytes_per_sec": 0, 00:32:08.148 "r_mbytes_per_sec": 0, 00:32:08.148 "w_mbytes_per_sec": 0 00:32:08.148 }, 00:32:08.148 "claimed": true, 00:32:08.148 "claim_type": "exclusive_write", 00:32:08.148 "zoned": false, 00:32:08.148 "supported_io_types": { 00:32:08.148 "read": true, 00:32:08.148 "write": true, 00:32:08.148 "unmap": true, 00:32:08.148 "write_zeroes": true, 00:32:08.148 "flush": true, 00:32:08.148 "reset": true, 00:32:08.148 "compare": false, 00:32:08.148 
"compare_and_write": false, 00:32:08.148 "abort": true, 00:32:08.148 "nvme_admin": false, 00:32:08.148 "nvme_io": false 00:32:08.148 }, 00:32:08.148 "memory_domains": [ 00:32:08.148 { 00:32:08.148 "dma_device_id": "system", 00:32:08.148 "dma_device_type": 1 00:32:08.148 }, 00:32:08.148 { 00:32:08.148 "dma_device_id": "SPDK_ACCEL_DMA_DEVICE", 00:32:08.148 "dma_device_type": 2 00:32:08.148 } 00:32:08.148 ], 00:32:08.148 "driver_specific": {} 00:32:08.148 } 00:32:08.148 ] 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:32:08.148 16:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@737 -- # rpc_cmd bdev_wait_for_examine 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:32:08.148 16:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # cat 00:32:08.148 16:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n accel 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:32:08.148 16:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n bdev 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:32:08.148 16:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@740 -- # rpc_cmd save_subsystem_config -n iobuf 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 
00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:32:08.148 16:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # mapfile -t bdevs 00:32:08.148 16:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # rpc_cmd bdev_get_bdevs 00:32:08.148 16:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@748 -- # jq -r '.[] | select(.claimed == false)' 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@560 -- # xtrace_disable 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:32:08.148 16:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # mapfile -t bdevs_name 00:32:08.148 16:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "793a42c6-5e21-5469-b370-883d1a31cdf7"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "793a42c6-5e21-5469-b370-883d1a31cdf7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": 
"crypto_ram1",' ' "aliases": [' ' "edf75366-0a8b-5051-86f8-abbf31831fa3"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "edf75366-0a8b-5051-86f8-abbf31831fa3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "3683e8ee-3546-5971-a425-c6af60ecfbed"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3683e8ee-3546-5971-a425-c6af60ecfbed",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' 
"base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "52946918-dc89-5eaf-a2ad-219e4430e0d9"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "52946918-dc89-5eaf-a2ad-219e4430e0d9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:32:08.148 16:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@749 -- # jq -r .name 00:32:08.148 16:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@750 -- # bdev_list=("${bdevs_name[@]}") 00:32:08.148 16:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@752 -- # hello_world_bdev=crypto_ram 00:32:08.148 16:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@753 -- # trap - SIGINT SIGTERM EXIT 00:32:08.148 16:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@754 -- # killprocess 2874063 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@949 -- # '[' -z 2874063 ']' 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@953 -- # kill -0 2874063 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # uname 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@954 -- # 
'[' Linux = Linux ']' 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2874063 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:32:08.148 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:32:08.149 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2874063' 00:32:08.149 killing process with pid 2874063 00:32:08.149 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@968 -- # kill 2874063 00:32:08.149 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@973 -- # wait 2874063 00:32:08.717 16:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@758 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:08.717 16:11:13 blockdev_crypto_qat -- bdev/blockdev.sh@760 -- # run_test bdev_hello_world /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:08.717 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 7 -le 1 ']' 00:32:08.717 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:32:08.717 16:11:13 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:08.717 ************************************ 00:32:08.717 START TEST bdev_hello_world 00:32:08.717 ************************************ 00:32:08.717 16:11:13 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/hello_bdev --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -b crypto_ram '' 00:32:08.717 [2024-06-10 16:11:14.015381] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:32:08.717 [2024-06-10 16:11:14.015433] [ DPDK EAL parameters: hello_bdev --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2874745 ] 00:32:08.717 [2024-06-10 16:11:14.112067] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:08.717 [2024-06-10 16:11:14.203442] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:32:08.717 [2024-06-10 16:11:14.224753] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:32:08.976 [2024-06-10 16:11:14.232790] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:08.976 [2024-06-10 16:11:14.240803] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:08.976 [2024-06-10 16:11:14.346780] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:32:11.512 [2024-06-10 16:11:16.521312] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:32:11.512 [2024-06-10 16:11:16.521376] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:11.512 [2024-06-10 16:11:16.521388] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:11.512 [2024-06-10 16:11:16.529332] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:32:11.512 [2024-06-10 16:11:16.529350] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:11.512 [2024-06-10 16:11:16.529359] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:11.512 [2024-06-10 16:11:16.537354] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:32:11.512 
[2024-06-10 16:11:16.537370] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:11.512 [2024-06-10 16:11:16.537379] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:11.512 [2024-06-10 16:11:16.545375] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:32:11.512 [2024-06-10 16:11:16.545394] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:11.512 [2024-06-10 16:11:16.545403] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:11.512 [2024-06-10 16:11:16.617988] hello_bdev.c: 222:hello_start: *NOTICE*: Successfully started the application 00:32:11.512 [2024-06-10 16:11:16.618030] hello_bdev.c: 231:hello_start: *NOTICE*: Opening the bdev crypto_ram 00:32:11.512 [2024-06-10 16:11:16.618046] hello_bdev.c: 244:hello_start: *NOTICE*: Opening io channel 00:32:11.512 [2024-06-10 16:11:16.619370] hello_bdev.c: 138:hello_write: *NOTICE*: Writing to the bdev 00:32:11.512 [2024-06-10 16:11:16.619443] hello_bdev.c: 117:write_complete: *NOTICE*: bdev io write completed successfully 00:32:11.512 [2024-06-10 16:11:16.619458] hello_bdev.c: 84:hello_read: *NOTICE*: Reading io 00:32:11.512 [2024-06-10 16:11:16.619501] hello_bdev.c: 65:read_complete: *NOTICE*: Read string from bdev : Hello World! 
00:32:11.512 00:32:11.512 [2024-06-10 16:11:16.619518] hello_bdev.c: 74:read_complete: *NOTICE*: Stopping app 00:32:11.512 00:32:11.512 real 0m2.987s 00:32:11.512 user 0m2.658s 00:32:11.512 sys 0m0.294s 00:32:11.512 16:11:16 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@1125 -- # xtrace_disable 00:32:11.512 16:11:16 blockdev_crypto_qat.bdev_hello_world -- common/autotest_common.sh@10 -- # set +x 00:32:11.512 ************************************ 00:32:11.512 END TEST bdev_hello_world 00:32:11.512 ************************************ 00:32:11.512 16:11:16 blockdev_crypto_qat -- bdev/blockdev.sh@761 -- # run_test bdev_bounds bdev_bounds '' 00:32:11.512 16:11:16 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:32:11.512 16:11:16 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:32:11.512 16:11:16 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:11.512 ************************************ 00:32:11.512 START TEST bdev_bounds 00:32:11.512 ************************************ 00:32:11.512 16:11:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1124 -- # bdev_bounds '' 00:32:11.512 16:11:17 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@290 -- # bdevio_pid=2875258 00:32:11.512 16:11:17 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@291 -- # trap 'cleanup; killprocess $bdevio_pid; exit 1' SIGINT SIGTERM EXIT 00:32:11.512 16:11:17 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@292 -- # echo 'Process bdevio pid: 2875258' 00:32:11.512 Process bdevio pid: 2875258 00:32:11.512 16:11:17 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@293 -- # waitforlisten 2875258 00:32:11.512 16:11:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@830 -- # '[' -z 2875258 ']' 00:32:11.512 16:11:17 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@289 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/bdevio -w -s 0 --json 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:11.512 16:11:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:32:11.512 16:11:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@835 -- # local max_retries=100 00:32:11.512 16:11:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:32:11.512 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:32:11.513 16:11:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@839 -- # xtrace_disable 00:32:11.513 16:11:17 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:11.771 [2024-06-10 16:11:17.068694] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:32:11.771 [2024-06-10 16:11:17.068748] [ DPDK EAL parameters: bdevio --no-shconf -c 0x7 -m 0 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2875258 ] 00:32:11.771 [2024-06-10 16:11:17.167756] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 3 00:32:11.771 [2024-06-10 16:11:17.264153] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:32:11.771 [2024-06-10 16:11:17.264249] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 2 00:32:11.771 [2024-06-10 16:11:17.264254] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:32:12.030 [2024-06-10 16:11:17.285606] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:32:12.030 [2024-06-10 16:11:17.293636] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:12.030 [2024-06-10 16:11:17.301651] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: 
Operation decrypt will be assigned to module dpdk_cryptodev 00:32:12.030 [2024-06-10 16:11:17.400335] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:32:14.566 [2024-06-10 16:11:19.567920] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:32:14.566 [2024-06-10 16:11:19.567999] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:14.566 [2024-06-10 16:11:19.568012] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:14.566 [2024-06-10 16:11:19.575936] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:32:14.566 [2024-06-10 16:11:19.575953] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:14.566 [2024-06-10 16:11:19.575967] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:14.566 [2024-06-10 16:11:19.583963] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:32:14.566 [2024-06-10 16:11:19.583979] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:14.566 [2024-06-10 16:11:19.583988] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:14.566 [2024-06-10 16:11:19.591985] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:32:14.566 [2024-06-10 16:11:19.592001] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:14.566 [2024-06-10 16:11:19.592010] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:14.566 16:11:19 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:32:14.566 16:11:19 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@863 -- # return 0 
00:32:14.566 16:11:19 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@294 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdevio/tests.py perform_tests 00:32:14.566 I/O targets: 00:32:14.566 crypto_ram: 65536 blocks of 512 bytes (32 MiB) 00:32:14.566 crypto_ram1: 65536 blocks of 512 bytes (32 MiB) 00:32:14.566 crypto_ram2: 8192 blocks of 4096 bytes (32 MiB) 00:32:14.566 crypto_ram3: 8192 blocks of 4096 bytes (32 MiB) 00:32:14.566 00:32:14.566 00:32:14.566 CUnit - A unit testing framework for C - Version 2.1-3 00:32:14.566 http://cunit.sourceforge.net/ 00:32:14.566 00:32:14.566 00:32:14.566 Suite: bdevio tests on: crypto_ram3 00:32:14.566 Test: blockdev write read block ...passed 00:32:14.566 Test: blockdev write zeroes read block ...passed 00:32:14.566 Test: blockdev write zeroes read no split ...passed 00:32:14.566 Test: blockdev write zeroes read split ...passed 00:32:14.566 Test: blockdev write zeroes read split partial ...passed 00:32:14.566 Test: blockdev reset ...passed 00:32:14.566 Test: blockdev write read 8 blocks ...passed 00:32:14.566 Test: blockdev write read size > 128k ...passed 00:32:14.566 Test: blockdev write read invalid size ...passed 00:32:14.566 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:14.566 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:14.566 Test: blockdev write read max offset ...passed 00:32:14.566 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:14.566 Test: blockdev writev readv 8 blocks ...passed 00:32:14.566 Test: blockdev writev readv 30 x 1block ...passed 00:32:14.566 Test: blockdev writev readv block ...passed 00:32:14.566 Test: blockdev writev readv size > 128k ...passed 00:32:14.566 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:14.566 Test: blockdev comparev and writev ...passed 00:32:14.566 Test: blockdev nvme passthru rw ...passed 00:32:14.566 Test: blockdev nvme passthru vendor specific 
...passed 00:32:14.567 Test: blockdev nvme admin passthru ...passed 00:32:14.567 Test: blockdev copy ...passed 00:32:14.567 Suite: bdevio tests on: crypto_ram2 00:32:14.567 Test: blockdev write read block ...passed 00:32:14.567 Test: blockdev write zeroes read block ...passed 00:32:14.567 Test: blockdev write zeroes read no split ...passed 00:32:14.567 Test: blockdev write zeroes read split ...passed 00:32:14.567 Test: blockdev write zeroes read split partial ...passed 00:32:14.567 Test: blockdev reset ...passed 00:32:14.567 Test: blockdev write read 8 blocks ...passed 00:32:14.567 Test: blockdev write read size > 128k ...passed 00:32:14.567 Test: blockdev write read invalid size ...passed 00:32:14.567 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:14.567 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:14.567 Test: blockdev write read max offset ...passed 00:32:14.567 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:14.567 Test: blockdev writev readv 8 blocks ...passed 00:32:14.567 Test: blockdev writev readv 30 x 1block ...passed 00:32:14.567 Test: blockdev writev readv block ...passed 00:32:14.567 Test: blockdev writev readv size > 128k ...passed 00:32:14.567 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:14.567 Test: blockdev comparev and writev ...passed 00:32:14.567 Test: blockdev nvme passthru rw ...passed 00:32:14.567 Test: blockdev nvme passthru vendor specific ...passed 00:32:14.567 Test: blockdev nvme admin passthru ...passed 00:32:14.567 Test: blockdev copy ...passed 00:32:14.567 Suite: bdevio tests on: crypto_ram1 00:32:14.567 Test: blockdev write read block ...passed 00:32:14.567 Test: blockdev write zeroes read block ...passed 00:32:14.567 Test: blockdev write zeroes read no split ...passed 00:32:14.567 Test: blockdev write zeroes read split ...passed 00:32:14.567 Test: blockdev write zeroes read split partial ...passed 00:32:14.567 Test: 
blockdev reset ...passed 00:32:14.567 Test: blockdev write read 8 blocks ...passed 00:32:14.567 Test: blockdev write read size > 128k ...passed 00:32:14.567 Test: blockdev write read invalid size ...passed 00:32:14.567 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:14.567 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:14.567 Test: blockdev write read max offset ...passed 00:32:14.567 Test: blockdev write read 2 blocks on overlapped address offset ...passed 00:32:14.567 Test: blockdev writev readv 8 blocks ...passed 00:32:14.567 Test: blockdev writev readv 30 x 1block ...passed 00:32:14.567 Test: blockdev writev readv block ...passed 00:32:14.567 Test: blockdev writev readv size > 128k ...passed 00:32:14.567 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:14.567 Test: blockdev comparev and writev ...passed 00:32:14.567 Test: blockdev nvme passthru rw ...passed 00:32:14.567 Test: blockdev nvme passthru vendor specific ...passed 00:32:14.567 Test: blockdev nvme admin passthru ...passed 00:32:14.567 Test: blockdev copy ...passed 00:32:14.567 Suite: bdevio tests on: crypto_ram 00:32:14.567 Test: blockdev write read block ...passed 00:32:14.567 Test: blockdev write zeroes read block ...passed 00:32:14.567 Test: blockdev write zeroes read no split ...passed 00:32:14.826 Test: blockdev write zeroes read split ...passed 00:32:14.826 Test: blockdev write zeroes read split partial ...passed 00:32:14.826 Test: blockdev reset ...passed 00:32:14.826 Test: blockdev write read 8 blocks ...passed 00:32:14.826 Test: blockdev write read size > 128k ...passed 00:32:14.826 Test: blockdev write read invalid size ...passed 00:32:14.826 Test: blockdev write read offset + nbytes == size of blockdev ...passed 00:32:14.826 Test: blockdev write read offset + nbytes > size of blockdev ...passed 00:32:14.826 Test: blockdev write read max offset ...passed 00:32:14.826 Test: blockdev write read 2 blocks on overlapped 
address offset ...passed 00:32:14.826 Test: blockdev writev readv 8 blocks ...passed 00:32:14.826 Test: blockdev writev readv 30 x 1block ...passed 00:32:14.826 Test: blockdev writev readv block ...passed 00:32:14.826 Test: blockdev writev readv size > 128k ...passed 00:32:14.826 Test: blockdev writev readv size > 128k in two iovs ...passed 00:32:14.826 Test: blockdev comparev and writev ...passed 00:32:14.826 Test: blockdev nvme passthru rw ...passed 00:32:14.826 Test: blockdev nvme passthru vendor specific ...passed 00:32:14.826 Test: blockdev nvme admin passthru ...passed 00:32:14.826 Test: blockdev copy ...passed 00:32:14.826 00:32:14.826 Run Summary: Type Total Ran Passed Failed Inactive 00:32:14.826 suites 4 4 n/a 0 0 00:32:14.826 tests 92 92 92 0 0 00:32:14.826 asserts 520 520 520 0 n/a 00:32:14.826 00:32:14.826 Elapsed time = 0.527 seconds 00:32:14.826 0 00:32:14.826 16:11:20 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@295 -- # killprocess 2875258 00:32:14.826 16:11:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@949 -- # '[' -z 2875258 ']' 00:32:14.826 16:11:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@953 -- # kill -0 2875258 00:32:14.826 16:11:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # uname 00:32:14.826 16:11:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:32:14.826 16:11:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2875258 00:32:14.826 16:11:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:32:14.826 16:11:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:32:14.826 16:11:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2875258' 00:32:14.826 killing process with pid 2875258 00:32:14.826 16:11:20 blockdev_crypto_qat.bdev_bounds 
-- common/autotest_common.sh@968 -- # kill 2875258 00:32:14.826 16:11:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@973 -- # wait 2875258 00:32:15.117 16:11:20 blockdev_crypto_qat.bdev_bounds -- bdev/blockdev.sh@296 -- # trap - SIGINT SIGTERM EXIT 00:32:15.117 00:32:15.117 real 0m3.540s 00:32:15.117 user 0m10.131s 00:32:15.117 sys 0m0.438s 00:32:15.117 16:11:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@1125 -- # xtrace_disable 00:32:15.117 16:11:20 blockdev_crypto_qat.bdev_bounds -- common/autotest_common.sh@10 -- # set +x 00:32:15.117 ************************************ 00:32:15.117 END TEST bdev_bounds 00:32:15.117 ************************************ 00:32:15.117 16:11:20 blockdev_crypto_qat -- bdev/blockdev.sh@762 -- # run_test bdev_nbd nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:32:15.117 16:11:20 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 5 -le 1 ']' 00:32:15.117 16:11:20 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:32:15.117 16:11:20 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:15.388 ************************************ 00:32:15.388 START TEST bdev_nbd 00:32:15.388 ************************************ 00:32:15.388 16:11:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1124 -- # nbd_function_test /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '' 00:32:15.388 16:11:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # uname -s 00:32:15.388 16:11:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@300 -- # [[ Linux == Linux ]] 00:32:15.388 16:11:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@302 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:15.388 16:11:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@303 -- # local 
conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:32:15.388 16:11:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # bdev_all=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:32:15.388 16:11:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@304 -- # local bdev_all 00:32:15.388 16:11:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@305 -- # local bdev_num=4 00:32:15.388 16:11:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@309 -- # [[ -e /sys/module/nbd ]] 00:32:15.388 16:11:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # nbd_all=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11' '/dev/nbd12' '/dev/nbd13' '/dev/nbd14' '/dev/nbd15' '/dev/nbd2' '/dev/nbd3' '/dev/nbd4' '/dev/nbd5' '/dev/nbd6' '/dev/nbd7' '/dev/nbd8' '/dev/nbd9') 00:32:15.388 16:11:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@311 -- # local nbd_all 00:32:15.388 16:11:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@312 -- # bdev_num=4 00:32:15.388 16:11:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:15.388 16:11:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@314 -- # local nbd_list 00:32:15.388 16:11:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:32:15.388 16:11:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@315 -- # local bdev_list 00:32:15.388 16:11:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@318 -- # nbd_pid=2875925 00:32:15.388 16:11:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@319 -- # trap 'cleanup; killprocess $nbd_pid' SIGINT SIGTERM EXIT 00:32:15.389 16:11:20 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@317 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/app/bdev_svc/bdev_svc -r /var/tmp/spdk-nbd.sock -i 0 --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json '' 00:32:15.389 16:11:20 
blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@320 -- # waitforlisten 2875925 /var/tmp/spdk-nbd.sock 00:32:15.389 16:11:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@830 -- # '[' -z 2875925 ']' 00:32:15.389 16:11:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk-nbd.sock 00:32:15.389 16:11:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@835 -- # local max_retries=100 00:32:15.389 16:11:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock...' 00:32:15.389 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk-nbd.sock... 00:32:15.389 16:11:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@839 -- # xtrace_disable 00:32:15.389 16:11:20 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:15.389 [2024-06-10 16:11:20.684740] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
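Editor's note: the `waitforlisten 2875925 /var/tmp/spdk-nbd.sock` step above blocks until the freshly launched bdev_svc is serving RPCs on its UNIX socket. A minimal sketch of that polling pattern, under stated assumptions: the function name is hypothetical, and a plain existence check stands in for the real helper, which also probes the socket with rpc.py before returning.

```shell
# Hedged sketch of the waitforlisten polling pattern (assumed simplification:
# the real helper in autotest_common.sh also issues an rpc.py probe; here we
# only wait for the path to appear, using -e so plain files also work).
wait_for_rpc_sock() {
    local sock=$1 max_retries=${2:-100}
    local i
    for ((i = 0; i < max_retries; i++)); do
        [ -e "$sock" ] && return 0
        sleep 0.1
    done
    echo "timed out waiting for $sock" >&2
    return 1
}
```

The `max_retries=100` default mirrors the `local max_retries=100` visible in the trace.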
00:32:15.389 [2024-06-10 16:11:20.684793] [ DPDK EAL parameters: bdev_svc -c 0x1 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:32:15.389 [2024-06-10 16:11:20.784636] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:32:15.389 [2024-06-10 16:11:20.880830] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:32:15.648 [2024-06-10 16:11:20.902181] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:32:15.648 [2024-06-10 16:11:20.910208] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:15.648 [2024-06-10 16:11:20.918223] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:15.648 [2024-06-10 16:11:21.023160] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:32:18.180 [2024-06-10 16:11:23.193828] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:32:18.180 [2024-06-10 16:11:23.193879] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:18.180 [2024-06-10 16:11:23.193891] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:18.180 [2024-06-10 16:11:23.201847] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:32:18.180 [2024-06-10 16:11:23.201866] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:18.180 [2024-06-10 16:11:23.201875] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:18.181 [2024-06-10 16:11:23.209868] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:32:18.181 [2024-06-10 
16:11:23.209884] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:18.181 [2024-06-10 16:11:23.209893] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:18.181 [2024-06-10 16:11:23.217888] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:32:18.181 [2024-06-10 16:11:23.217904] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:18.181 [2024-06-10 16:11:23.217913] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@863 -- # return 0 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@322 -- # nbd_rpc_start_stop_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@113 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@114 -- # local bdev_list 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@116 -- # nbd_start_disks_without_nbd_idx /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@22 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@23 -- # local bdev_list 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- 
bdev/nbd_common.sh@24 -- # local i 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@25 -- # local nbd_device 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i = 0 )) 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd0 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd0 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd0 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:32:18.181 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:18.181 1+0 records in 00:32:18.181 1+0 records out 00:32:18.181 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000283698 s, 14.4 MB/s 00:32:18.181 
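Editor's note: the `waitfornbd` helper traced above retries `grep -q -w <name> /proc/partitions` up to 20 times before treating the device as attached. A standalone sketch of that retry loop; the partitions table is taken as a file parameter here (an assumption for illustration) so the pattern runs without a real /dev/nbdX.

```shell
# Sketch of the waitfornbd retry loop. Assumption: the partitions table is
# passed in as a file so no actual NBD device needs to be attached.
wait_for_partition() {
    local nbd_name=$1 partitions_file=$2
    local i
    for ((i = 1; i <= 20; i++)); do
        # -w matches the whole word, so "nbd1" does not match "nbd11"
        if grep -q -w "$nbd_name" "$partitions_file"; then
            return 0
        fi
        sleep 0.1
    done
    return 1
}
```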
16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:18.439 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:32:18.439 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:18.439 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:32:18.439 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:32:18.439 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:18.439 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:18.439 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd1 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd1 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd1 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- 
common/autotest_common.sh@883 -- # (( i = 1 )) 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:18.698 1+0 records in 00:32:18.698 1+0 records out 00:32:18.698 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000263462 s, 15.5 MB/s 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:18.698 16:11:23 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd2 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd2 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd2 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd2 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # 
local i 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd2 /proc/partitions 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd2 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:18.957 1+0 records in 00:32:18.957 1+0 records out 00:32:18.957 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000282828 s, 14.5 MB/s 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:18.957 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 00:32:19.216 
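Editor's note: after each device appears, the harness reads one 4 KiB block with dd and checks the copy's size with `stat -c %s` (the `'[' 4096 '!=' 0 ']'` steps in the trace). A sketch of that verification, with one labeled deviation: `iflag=direct` from the real dd invocation is omitted so a regular file can stand in for the block device.

```shell
# Sketch of the per-device read check: copy one 4 KiB block, confirm the
# copy is non-empty, then clean up. Assumption: iflag=direct (used by the
# real test against /dev/nbdX) is dropped so plain files work here.
verify_one_block() {
    local dev=$1
    local out size
    out=$(mktemp)
    dd if="$dev" of="$out" bs=4096 count=1 2>/dev/null
    size=$(stat -c %s "$out")
    rm -f "$out"
    # mirrors the "'[' 4096 '!=' 0 ']'" check in the trace
    [ "$size" != 0 ]
}
```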
16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@28 -- # nbd_device=/dev/nbd3 00:32:19.216 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # basename /dev/nbd3 00:32:19.216 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@30 -- # waitfornbd nbd3 00:32:19.216 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd3 00:32:19.216 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:32:19.216 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:32:19.216 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:32:19.216 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd3 /proc/partitions 00:32:19.216 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:32:19.216 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:32:19.216 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:32:19.216 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd3 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:19.216 1+0 records in 00:32:19.216 1+0 records out 00:32:19.216 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000262591 s, 15.6 MB/s 00:32:19.216 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:19.216 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:32:19.216 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:19.216 16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:32:19.216 
16:11:24 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:32:19.216 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i++ )) 00:32:19.216 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@27 -- # (( i < 4 )) 00:32:19.216 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:19.475 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@118 -- # nbd_disks_json='[ 00:32:19.475 { 00:32:19.475 "nbd_device": "/dev/nbd0", 00:32:19.475 "bdev_name": "crypto_ram" 00:32:19.475 }, 00:32:19.475 { 00:32:19.475 "nbd_device": "/dev/nbd1", 00:32:19.475 "bdev_name": "crypto_ram1" 00:32:19.475 }, 00:32:19.475 { 00:32:19.475 "nbd_device": "/dev/nbd2", 00:32:19.475 "bdev_name": "crypto_ram2" 00:32:19.475 }, 00:32:19.475 { 00:32:19.475 "nbd_device": "/dev/nbd3", 00:32:19.475 "bdev_name": "crypto_ram3" 00:32:19.475 } 00:32:19.475 ]' 00:32:19.475 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # nbd_disks_name=($(echo "${nbd_disks_json}" | jq -r '.[] | .nbd_device')) 00:32:19.475 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # echo '[ 00:32:19.475 { 00:32:19.475 "nbd_device": "/dev/nbd0", 00:32:19.475 "bdev_name": "crypto_ram" 00:32:19.475 }, 00:32:19.475 { 00:32:19.475 "nbd_device": "/dev/nbd1", 00:32:19.475 "bdev_name": "crypto_ram1" 00:32:19.475 }, 00:32:19.475 { 00:32:19.475 "nbd_device": "/dev/nbd2", 00:32:19.475 "bdev_name": "crypto_ram2" 00:32:19.475 }, 00:32:19.475 { 00:32:19.475 "nbd_device": "/dev/nbd3", 00:32:19.475 "bdev_name": "crypto_ram3" 00:32:19.475 } 00:32:19.475 ]' 00:32:19.475 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@119 -- # jq -r '.[] | .nbd_device' 00:32:19.475 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@120 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd2 /dev/nbd3' 00:32:19.475 
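Editor's note: nbd_common.sh@119 above turns the nbd_get_disks JSON into a plain device list with `jq -r '.[] | .nbd_device'`. A self-contained sketch using a sample of that JSON shape; the device and bdev names in the literal are copied from the trace, not queried live.

```shell
# Extract the /dev/nbdX nodes from nbd_get_disks-style JSON, as the harness
# does. The JSON literal is a sample matching the trace above (jq required).
nbd_disks_json='[
  { "nbd_device": "/dev/nbd0", "bdev_name": "crypto_ram"  },
  { "nbd_device": "/dev/nbd1", "bdev_name": "crypto_ram1" }
]'
nbd_disks_name=$(echo "$nbd_disks_json" | jq -r '.[] | .nbd_device')
echo "$nbd_disks_name"
```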
16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:19.475 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd2' '/dev/nbd3') 00:32:19.475 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:19.475 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:19.475 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:19.475 16:11:24 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:19.734 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:19.734 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:19.734 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:19.734 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:19.734 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:19.734 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:19.734 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:19.734 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:19.734 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:19.734 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:19.992 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:19.992 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # 
waitfornbd_exit nbd1 00:32:19.992 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:19.992 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:19.992 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:19.992 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:19.992 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:19.992 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:19.992 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:19.992 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd2 00:32:20.251 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd2 00:32:20.251 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd2 00:32:20.251 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd2 00:32:20.251 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:20.251 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:20.251 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd2 /proc/partitions 00:32:20.251 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:20.251 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:20.251 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:20.251 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd3 00:32:20.509 
16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd3 00:32:20.509 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd3 00:32:20.509 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd3 00:32:20.509 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:20.509 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:20.509 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd3 /proc/partitions 00:32:20.509 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:20.509 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:20.509 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:20.509 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:20.509 16:11:25 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:20.766 16:11:26 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@122 -- # count=0 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@123 -- # '[' 0 -ne 0 ']' 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@127 -- # return 0 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@323 -- # nbd_rpc_data_verify /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@90 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@91 -- # local bdev_list 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@92 -- # local nbd_list 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@94 -- # nbd_start_disks /var/tmp/spdk-nbd.sock 'crypto_ram crypto_ram1 crypto_ram2 crypto_ram3' '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@9 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # bdev_list=('crypto_ram' 'crypto_ram1' 'crypto_ram2' 'crypto_ram3') 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@10 -- # local bdev_list 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@11 -- # local nbd_list 00:32:20.766 
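Editor's note: after the stop loop, `nbd_get_count` confirms zero devices remain by piping the (now empty) device-name list through `grep -c /dev/nbd`; the bare `# true` step in the trace is the guard against grep's nonzero exit status when nothing matches. A sketch of that counting step:

```shell
# Count attached nbd devices from a newline-separated name list.
# "|| true" keeps the exit status clean when the count is 0, matching the
# lone "true" step visible in the trace above.
count_nbd() {
    local names=$1
    echo "$names" | grep -c /dev/nbd || true
}
```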
16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@12 -- # local i 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i = 0 )) 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:20.766 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram /dev/nbd0 00:32:21.024 /dev/nbd0 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd0 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd0 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd0 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd0 /proc/partitions 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd0 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:21.024 1+0 records in 00:32:21.024 1+0 records out 00:32:21.024 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000277493 s, 14.8 MB/s 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:21.024 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram1 /dev/nbd1 00:32:21.282 /dev/nbd1 00:32:21.282 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd1 00:32:21.282 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd1 00:32:21.282 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd1 00:32:21.282 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:32:21.282 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:32:21.282 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:32:21.282 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd1 /proc/partitions 00:32:21.282 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:32:21.282 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:32:21.282 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:32:21.282 16:11:26 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd1 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:21.282 1+0 records in 00:32:21.282 1+0 records out 00:32:21.282 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000285926 s, 14.3 MB/s 00:32:21.282 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:21.282 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:32:21.282 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:21.282 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:32:21.282 16:11:26 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:32:21.282 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:21.282 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:21.282 16:11:26 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram2 /dev/nbd10 00:32:21.540 /dev/nbd10 00:32:21.540 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd10 00:32:21.540 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd10 00:32:21.540 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd10 00:32:21.540 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:32:21.540 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:32:21.540 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:32:21.540 16:11:27 
blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd10 /proc/partitions 00:32:21.540 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:32:21.540 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:32:21.540 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:32:21.540 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd10 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:21.540 1+0 records in 00:32:21.540 1+0 records out 00:32:21.540 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.000209106 s, 19.6 MB/s 00:32:21.540 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:21.540 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:32:21.540 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:21.540 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:32:21.540 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:32:21.540 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:21.540 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:21.540 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@15 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk crypto_ram3 /dev/nbd11 00:32:21.799 /dev/nbd11 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # basename /dev/nbd11 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@17 -- # waitfornbd nbd11 00:32:21.799 
16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@867 -- # local nbd_name=nbd11 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@868 -- # local i 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i = 1 )) 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@870 -- # (( i <= 20 )) 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@871 -- # grep -q -w nbd11 /proc/partitions 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@872 -- # break 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i = 1 )) 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@883 -- # (( i <= 20 )) 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@884 -- # dd if=/dev/nbd11 of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest bs=4096 count=1 iflag=direct 00:32:21.799 1+0 records in 00:32:21.799 1+0 records out 00:32:21.799 4096 bytes (4.1 kB, 4.0 KiB) copied, 0.00027418 s, 14.9 MB/s 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # stat -c %s /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@885 -- # size=4096 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@886 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdtest 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@887 -- # '[' 4096 '!=' 0 ']' 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@888 -- # return 0 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i++ )) 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@14 -- # (( i < 4 )) 00:32:21.799 16:11:27 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:21.799 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:22.058 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[ 00:32:22.058 { 00:32:22.058 "nbd_device": "/dev/nbd0", 00:32:22.058 "bdev_name": "crypto_ram" 00:32:22.058 }, 00:32:22.058 { 00:32:22.058 "nbd_device": "/dev/nbd1", 00:32:22.058 "bdev_name": "crypto_ram1" 00:32:22.058 }, 00:32:22.058 { 00:32:22.058 "nbd_device": "/dev/nbd10", 00:32:22.058 "bdev_name": "crypto_ram2" 00:32:22.058 }, 00:32:22.058 { 00:32:22.058 "nbd_device": "/dev/nbd11", 00:32:22.058 "bdev_name": "crypto_ram3" 00:32:22.058 } 00:32:22.058 ]' 00:32:22.058 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[ 00:32:22.058 { 00:32:22.058 "nbd_device": "/dev/nbd0", 00:32:22.058 "bdev_name": "crypto_ram" 00:32:22.058 }, 00:32:22.058 { 00:32:22.058 "nbd_device": "/dev/nbd1", 00:32:22.058 "bdev_name": "crypto_ram1" 00:32:22.058 }, 00:32:22.058 { 00:32:22.058 "nbd_device": "/dev/nbd10", 00:32:22.058 "bdev_name": "crypto_ram2" 00:32:22.058 }, 00:32:22.058 { 00:32:22.058 "nbd_device": "/dev/nbd11", 00:32:22.058 "bdev_name": "crypto_ram3" 00:32:22.058 } 00:32:22.058 ]' 00:32:22.058 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:22.317 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name='/dev/nbd0 00:32:22.317 /dev/nbd1 00:32:22.317 /dev/nbd10 00:32:22.317 /dev/nbd11' 00:32:22.317 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '/dev/nbd0 00:32:22.317 /dev/nbd1 00:32:22.317 /dev/nbd10 00:32:22.317 /dev/nbd11' 00:32:22.317 16:11:27 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:22.317 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=4 00:32:22.317 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 4 00:32:22.317 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@95 -- # count=4 00:32:22.317 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@96 -- # '[' 4 -ne 4 ']' 00:32:22.317 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@100 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' write 00:32:22.317 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:22.317 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=write 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' write = write ']' 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@76 -- # dd if=/dev/urandom of=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest bs=4096 count=256 00:32:22.318 256+0 records in 00:32:22.318 256+0 records out 00:32:22.318 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.00997026 s, 105 MB/s 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd0 bs=4096 count=256 oflag=direct 00:32:22.318 256+0 records in 00:32:22.318 256+0 records out 00:32:22.318 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0564597 s, 18.6 MB/s 00:32:22.318 16:11:27 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd1 bs=4096 count=256 oflag=direct 00:32:22.318 256+0 records in 00:32:22.318 256+0 records out 00:32:22.318 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0482628 s, 21.7 MB/s 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd10 bs=4096 count=256 oflag=direct 00:32:22.318 256+0 records in 00:32:22.318 256+0 records out 00:32:22.318 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0398456 s, 26.3 MB/s 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@77 -- # for i in "${nbd_list[@]}" 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@78 -- # dd if=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest of=/dev/nbd11 bs=4096 count=256 oflag=direct 00:32:22.318 256+0 records in 00:32:22.318 256+0 records out 00:32:22.318 1048576 bytes (1.0 MB, 1.0 MiB) copied, 0.0367432 s, 28.5 MB/s 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@101 -- # nbd_dd_data_verify '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' verify 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@70 -- # local nbd_list 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@71 -- # local operation=verify 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@72 -- # local tmp_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:22.318 16:11:27 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@74 -- # '[' verify = write ']' 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@80 -- # '[' verify = verify ']' 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:22.318 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd0 00:32:22.577 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:22.577 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd1 00:32:22.577 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:22.577 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd10 00:32:22.577 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@82 -- # for i in "${nbd_list[@]}" 00:32:22.577 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@83 -- # cmp -b -n 1M /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest /dev/nbd11 00:32:22.577 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@85 -- # rm /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nbdrandtest 00:32:22.577 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@103 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:22.577 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:22.577 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:22.577 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 
00:32:22.577 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:22.577 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:22.577 16:11:27 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:22.836 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:22.836 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:22.836 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:22.836 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:22.836 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:22.836 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:22.836 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:22.836 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:22.836 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:22.836 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd1 00:32:23.095 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd1 00:32:23.095 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd1 00:32:23.095 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd1 00:32:23.095 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:23.095 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:23.095 16:11:28 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd1 /proc/partitions 00:32:23.095 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:23.095 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:23.095 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:23.095 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd10 00:32:23.354 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd10 00:32:23.354 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd10 00:32:23.354 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd10 00:32:23.354 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:23.354 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:23.354 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd10 /proc/partitions 00:32:23.354 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:23.354 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:23.354 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:23.354 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd11 00:32:23.613 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd11 00:32:23.613 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd11 00:32:23.613 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd11 00:32:23.613 16:11:28 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:23.613 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:23.613 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd11 /proc/partitions 00:32:23.613 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:23.613 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:23.613 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # nbd_get_count /var/tmp/spdk-nbd.sock 00:32:23.613 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@61 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:23.613 16:11:28 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_get_disks 00:32:23.872 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@63 -- # nbd_disks_json='[]' 00:32:23.872 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # echo '[]' 00:32:23.872 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # jq -r '.[] | .nbd_device' 00:32:23.872 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@64 -- # nbd_disks_name= 00:32:23.872 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # echo '' 00:32:23.872 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # grep -c /dev/nbd 00:32:23.872 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # true 00:32:23.872 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@65 -- # count=0 00:32:23.872 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@66 -- # echo 0 00:32:23.872 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@104 -- # count=0 00:32:23.872 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@105 -- # '[' 0 -ne 0 ']' 00:32:23.872 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@109 -- # 
return 0 00:32:23.872 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@324 -- # nbd_with_lvol_verify /var/tmp/spdk-nbd.sock '/dev/nbd0 /dev/nbd1 /dev/nbd10 /dev/nbd11' 00:32:23.872 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@131 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:23.872 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # nbd_list=('/dev/nbd0' '/dev/nbd1' '/dev/nbd10' '/dev/nbd11') 00:32:23.872 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@132 -- # local nbd_list 00:32:23.872 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@133 -- # local mkfs_ret 00:32:23.872 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@135 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_malloc_create -b malloc_lvol_verify 16 512 00:32:24.130 malloc_lvol_verify 00:32:24.130 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@136 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create_lvstore malloc_lvol_verify lvs 00:32:24.388 76b7bc87-bdb0-4ca7-b2f6-c9e1d9b389e0 00:32:24.388 16:11:29 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@137 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock bdev_lvol_create lvol 4 -l lvs 00:32:24.647 4a198b7b-9464-4b85-aea4-8f5bedf5b764 00:32:24.647 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@138 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_start_disk lvs/lvol /dev/nbd0 00:32:24.905 /dev/nbd0 00:32:24.905 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@140 -- # mkfs.ext4 /dev/nbd0 00:32:24.905 mke2fs 1.46.5 (30-Dec-2021) 00:32:24.905 Discarding device blocks: 0/4096 done 00:32:24.905 Creating filesystem with 4096 1k blocks and 1024 inodes 00:32:24.905 00:32:24.905 Allocating group tables: 0/1 done 00:32:24.905 Writing inode tables: 0/1 
done 00:32:24.905 Creating journal (1024 blocks): done 00:32:24.905 Writing superblocks and filesystem accounting information: 0/1 done 00:32:24.905 00:32:24.905 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@141 -- # mkfs_ret=0 00:32:24.905 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@142 -- # nbd_stop_disks /var/tmp/spdk-nbd.sock /dev/nbd0 00:32:24.905 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@49 -- # local rpc_server=/var/tmp/spdk-nbd.sock 00:32:24.905 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # nbd_list=('/dev/nbd0') 00:32:24.905 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@50 -- # local nbd_list 00:32:24.905 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@51 -- # local i 00:32:24.905 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@53 -- # for i in "${nbd_list[@]}" 00:32:24.905 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@54 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/spdk-nbd.sock nbd_stop_disk /dev/nbd0 00:32:25.164 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # basename /dev/nbd0 00:32:25.164 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@55 -- # waitfornbd_exit nbd0 00:32:25.164 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@35 -- # local nbd_name=nbd0 00:32:25.164 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i = 1 )) 00:32:25.164 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@37 -- # (( i <= 20 )) 00:32:25.164 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@38 -- # grep -q -w nbd0 /proc/partitions 00:32:25.164 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@41 -- # break 00:32:25.164 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@45 -- # return 0 00:32:25.164 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@143 -- # '[' 0 -ne 0 ']' 00:32:25.164 16:11:30 
blockdev_crypto_qat.bdev_nbd -- bdev/nbd_common.sh@147 -- # return 0 00:32:25.164 16:11:30 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@326 -- # killprocess 2875925 00:32:25.164 16:11:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@949 -- # '[' -z 2875925 ']' 00:32:25.164 16:11:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@953 -- # kill -0 2875925 00:32:25.164 16:11:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # uname 00:32:25.164 16:11:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:32:25.164 16:11:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2875925 00:32:25.422 16:11:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:32:25.422 16:11:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:32:25.422 16:11:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2875925' 00:32:25.422 killing process with pid 2875925 00:32:25.422 16:11:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@968 -- # kill 2875925 00:32:25.422 16:11:30 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@973 -- # wait 2875925 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_nbd -- bdev/blockdev.sh@327 -- # trap - SIGINT SIGTERM EXIT 00:32:25.681 00:32:25.681 real 0m10.412s 00:32:25.681 user 0m14.684s 00:32:25.681 sys 0m3.205s 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@1125 -- # xtrace_disable 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_nbd -- common/autotest_common.sh@10 -- # set +x 00:32:25.681 ************************************ 00:32:25.681 END TEST bdev_nbd 00:32:25.681 ************************************ 00:32:25.681 16:11:31 blockdev_crypto_qat -- bdev/blockdev.sh@763 -- # [[ y == y ]] 00:32:25.681 16:11:31 blockdev_crypto_qat -- bdev/blockdev.sh@764 
-- # '[' crypto_qat = nvme ']' 00:32:25.681 16:11:31 blockdev_crypto_qat -- bdev/blockdev.sh@764 -- # '[' crypto_qat = gpt ']' 00:32:25.681 16:11:31 blockdev_crypto_qat -- bdev/blockdev.sh@768 -- # run_test bdev_fio fio_test_suite '' 00:32:25.681 16:11:31 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 3 -le 1 ']' 00:32:25.681 16:11:31 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:32:25.681 16:11:31 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:25.681 ************************************ 00:32:25.681 START TEST bdev_fio 00:32:25.681 ************************************ 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1124 -- # fio_test_suite '' 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@331 -- # local env_context 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@335 -- # pushd /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:32:25.681 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@336 -- # trap 'rm -f ./*.state; popd; exit 1' SIGINT SIGTERM EXIT 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # echo '' 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # sed s/--env-context=// 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@339 -- # env_context= 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@340 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio verify AIO '' 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1279 -- # local config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=verify 
00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type=AIO 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z verify ']' 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1312 -- # '[' verify == verify ']' 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1313 -- # cat 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1322 -- # '[' AIO == AIO ']' 00:32:25.681 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # /usr/src/fio/fio --version 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1323 -- # [[ fio-3.35 == *\f\i\o\-\3* ]] 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1324 -- # echo serialize_overlap=1 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram]' 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram 00:32:25.682 16:11:31 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram1]' 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram1 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram2]' 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram2 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@341 -- # for b in "${bdevs_name[@]}" 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@342 -- # echo '[job_crypto_ram3]' 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@343 -- # echo filename=crypto_ram3 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@347 -- # local 'fio_params=--ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json' 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@349 -- # run_test bdev_fio_rw_verify fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 
00:32:25.682 ************************************ 00:32:25.682 START TEST bdev_fio_rw_verify 00:32:25.682 ************************************ 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1338 -- # local sanitizers 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1340 -- # shift 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1342 -- # local asan_lib= 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- 
common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libasan 00:32:25.682 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:32:25.941 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:32:25.941 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:32:25.941 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:32:25.941 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:25.941 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:32:25.941 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:32:25.941 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1344 -- # asan_lib= 00:32:25.941 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:32:25.941 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:25.941 16:11:31 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 
/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --spdk_mem=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:26.200 job_crypto_ram: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:26.200 job_crypto_ram1: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:26.200 job_crypto_ram2: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:26.200 job_crypto_ram3: (g=0): rw=randwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:26.200 fio-3.35 00:32:26.200 Starting 4 threads 00:32:41.085 00:32:41.085 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2878281: Mon Jun 10 16:11:44 2024 00:32:41.085 read: IOPS=19.2k, BW=75.2MiB/s (78.9MB/s)(752MiB/10001msec) 00:32:41.085 slat (usec): min=18, max=453, avg=71.45, stdev=31.69 00:32:41.085 clat (usec): min=22, max=1841, avg=393.51, stdev=217.75 00:32:41.085 lat (usec): min=68, max=2029, avg=464.96, stdev=229.88 00:32:41.085 clat percentiles (usec): 00:32:41.085 | 50.000th=[ 347], 99.000th=[ 955], 99.900th=[ 1106], 99.990th=[ 1385], 00:32:41.085 | 99.999th=[ 1827] 00:32:41.085 write: IOPS=21.1k, BW=82.5MiB/s (86.5MB/s)(805MiB/9758msec); 0 zone resets 00:32:41.085 slat (usec): min=27, max=295, avg=83.35, stdev=29.50 00:32:41.085 clat (usec): min=28, max=1734, avg=435.77, stdev=229.58 00:32:41.085 lat (usec): min=81, max=1856, avg=519.11, stdev=239.64 00:32:41.085 clat percentiles (usec): 00:32:41.085 | 50.000th=[ 400], 99.000th=[ 1020], 99.900th=[ 1156], 99.990th=[ 1254], 00:32:41.085 | 99.999th=[ 1598] 00:32:41.085 bw ( KiB/s): min=64624, max=117000, per=98.13%, avg=82893.47, stdev=3224.99, samples=76 00:32:41.085 iops : min=16156, max=29250, avg=20723.37, 
stdev=806.25, samples=76 00:32:41.085 lat (usec) : 50=0.01%, 100=1.18%, 250=27.65%, 500=39.00%, 750=22.33% 00:32:41.085 lat (usec) : 1000=8.92% 00:32:41.085 lat (msec) : 2=0.91% 00:32:41.085 cpu : usr=99.61%, sys=0.00%, ctx=66, majf=0, minf=215 00:32:41.085 IO depths : 1=6.4%, 2=26.8%, 4=53.5%, 8=13.3%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:41.085 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:41.085 complete : 0=0.0%, 4=88.2%, 8=11.8%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:41.085 issued rwts: total=192527,206080,0,0 short=0,0,0,0 dropped=0,0,0,0 00:32:41.085 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:41.085 00:32:41.085 Run status group 0 (all jobs): 00:32:41.085 READ: bw=75.2MiB/s (78.9MB/s), 75.2MiB/s-75.2MiB/s (78.9MB/s-78.9MB/s), io=752MiB (789MB), run=10001-10001msec 00:32:41.085 WRITE: bw=82.5MiB/s (86.5MB/s), 82.5MiB/s-82.5MiB/s (86.5MB/s-86.5MB/s), io=805MiB (844MB), run=9758-9758msec 00:32:41.085 00:32:41.085 real 0m13.394s 00:32:41.085 user 0m49.996s 00:32:41.085 sys 0m0.408s 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_rw_verify -- common/autotest_common.sh@10 -- # set +x 00:32:41.085 ************************************ 00:32:41.085 END TEST bdev_fio_rw_verify 00:32:41.085 ************************************ 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@350 -- # rm -f 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@351 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@354 -- # fio_config_gen /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio trim '' '' 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1279 -- # local 
config_file=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1280 -- # local workload=trim 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1281 -- # local bdev_type= 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1282 -- # local env_context= 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1283 -- # local fio_dir=/usr/src/fio 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1285 -- # '[' -e /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio ']' 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1290 -- # '[' -z trim ']' 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1294 -- # '[' -n '' ']' 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1298 -- # touch /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1300 -- # cat 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1312 -- # '[' trim == verify ']' 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1327 -- # '[' trim == trim ']' 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1328 -- # echo rw=trimwrite 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "793a42c6-5e21-5469-b370-883d1a31cdf7"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "793a42c6-5e21-5469-b370-883d1a31cdf7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' 
"rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "edf75366-0a8b-5051-86f8-abbf31831fa3"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "edf75366-0a8b-5051-86f8-abbf31831fa3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "3683e8ee-3546-5971-a425-c6af60ecfbed"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 
8192,' ' "uuid": "3683e8ee-3546-5971-a425-c6af60ecfbed",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "52946918-dc89-5eaf-a2ad-219e4430e0d9"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "52946918-dc89-5eaf-a2ad-219e4430e0d9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:32:41.085 16:11:44 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@355 -- # [[ -n crypto_ram 00:32:41.085 crypto_ram1 00:32:41.085 crypto_ram2 00:32:41.085 crypto_ram3 ]] 00:32:41.085 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # jq -r 'select(.supported_io_types.unmap == true) | .name' 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # printf '%s\n' '{' ' "name": "crypto_ram",' ' "aliases": [' ' "793a42c6-5e21-5469-b370-883d1a31cdf7"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "793a42c6-5e21-5469-b370-883d1a31cdf7",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc0",' ' "name": "crypto_ram",' ' "key_name": "test_dek_qat_cbc"' ' }' ' }' '}' '{' ' "name": "crypto_ram1",' ' "aliases": [' ' "edf75366-0a8b-5051-86f8-abbf31831fa3"' ' ],' ' "product_name": "crypto",' ' "block_size": 512,' ' "num_blocks": 65536,' ' "uuid": "edf75366-0a8b-5051-86f8-abbf31831fa3",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' 
' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc1",' ' "name": "crypto_ram1",' ' "key_name": "test_dek_qat_xts"' ' }' ' }' '}' '{' ' "name": "crypto_ram2",' ' "aliases": [' ' "3683e8ee-3546-5971-a425-c6af60ecfbed"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "3683e8ee-3546-5971-a425-c6af60ecfbed",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' "write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc2",' ' "name": "crypto_ram2",' ' "key_name": "test_dek_qat_cbc2"' ' }' ' }' '}' '{' ' "name": "crypto_ram3",' ' "aliases": [' ' "52946918-dc89-5eaf-a2ad-219e4430e0d9"' ' ],' ' "product_name": "crypto",' ' "block_size": 4096,' ' "num_blocks": 8192,' ' "uuid": "52946918-dc89-5eaf-a2ad-219e4430e0d9",' ' "assigned_rate_limits": {' ' "rw_ios_per_sec": 0,' ' "rw_mbytes_per_sec": 0,' ' "r_mbytes_per_sec": 0,' ' "w_mbytes_per_sec": 0' ' },' ' "claimed": false,' ' "zoned": false,' ' "supported_io_types": {' ' "read": true,' ' 
"write": true,' ' "unmap": true,' ' "write_zeroes": true,' ' "flush": true,' ' "reset": true,' ' "compare": false,' ' "compare_and_write": false,' ' "abort": false,' ' "nvme_admin": false,' ' "nvme_io": false' ' },' ' "memory_domains": [' ' {' ' "dma_device_id": "system",' ' "dma_device_type": 1' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' },' ' {' ' "dma_device_id": "SPDK_ACCEL_DMA_DEVICE",' ' "dma_device_type": 2' ' }' ' ],' ' "driver_specific": {' ' "crypto": {' ' "base_bdev_name": "Malloc3",' ' "name": "crypto_ram3",' ' "key_name": "test_dek_qat_xts2"' ' }' ' }' '}' 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram]' 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram1]' 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram1 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram2]' 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram2 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@356 -- # for b in $(printf '%s\n' "${bdevs[@]}" | jq -r 'select(.supported_io_types.unmap == true) | .name') 00:32:41.086 16:11:44 
blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@357 -- # echo '[job_crypto_ram3]' 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@358 -- # echo filename=crypto_ram3 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@367 -- # run_test bdev_fio_trim fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1100 -- # '[' 11 -le 1 ']' 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1106 -- # xtrace_disable 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:41.086 ************************************ 00:32:41.086 START TEST bdev_fio_trim 00:32:41.086 ************************************ 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1124 -- # fio_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1355 -- # fio_plugin /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 
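The trace above (bdev/blockdev.sh@355-358) shows how the test builds per-bdev fio job sections: it filters the bdev JSON for devices that support unmap, then emits a `[job_<name>]` / `filename=<name>` pair for each. A minimal sketch of that logic follows; the inline JSON is a stand-in for the real `bdev_get_bdevs` RPC output, not taken from this run:

```shell
#!/usr/bin/env bash
# Stand-in for the JSON stream that blockdev.sh gets from the bdev_get_bdevs
# RPC (here: one unmap-capable bdev, one that is not).
bdevs='{"name":"crypto_ram","supported_io_types":{"unmap":true}}
{"name":"crypto_ram4","supported_io_types":{"unmap":false}}'

# Same jq filter as in the trace: keep only bdevs that support unmap,
# print their names one per line.
for b in $(printf '%s\n' "$bdevs" | jq -r 'select(.supported_io_types.unmap == true) | .name'); do
  # One fio job section per qualifying bdev, appended to the config.
  echo "[job_${b}]"
  echo "filename=${b}"
done
```

Only `crypto_ram` survives the filter, so a single `[job_crypto_ram]` section is emitted; `crypto_ram4` (unmap=false) is skipped, which is why trim tests never see non-unmap bdevs.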
00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1336 -- # local fio_dir=/usr/src/fio 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # sanitizers=('libasan' 'libclang_rt.asan') 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1338 -- # local sanitizers 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1339 -- # local plugin=/var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1340 -- # shift 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1342 -- # local asan_lib= 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # grep libasan 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1343 -- # for sanitizer in "${sanitizers[@]}" 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # ldd /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- 
common/autotest_common.sh@1344 -- # grep libclang_rt.asan 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # awk '{print $3}' 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1344 -- # asan_lib= 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1345 -- # [[ -n '' ]] 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # LD_PRELOAD=' /var/jenkins/workspace/crypto-phy-autotest/spdk/build/fio/spdk_bdev' 00:32:41.086 16:11:44 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1351 -- # /usr/src/fio/fio --ioengine=spdk_bdev --iodepth=8 --bs=4k --runtime=10 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio --verify_state_save=0 --spdk_json_conf=/var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json --verify_state_save=0 --aux-path=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:32:41.086 job_crypto_ram: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:41.086 job_crypto_ram1: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:41.086 job_crypto_ram2: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:41.086 job_crypto_ram3: (g=0): rw=trimwrite, bs=(R) 4096B-4096B, (W) 4096B-4096B, (T) 4096B-4096B, ioengine=spdk_bdev, iodepth=8 00:32:41.086 fio-3.35 00:32:41.086 Starting 4 threads 00:32:53.324 00:32:53.324 job_crypto_ram: (groupid=0, jobs=4): err= 0: pid=2880664: Mon Jun 10 16:11:57 2024 00:32:53.324 write: IOPS=36.5k, BW=142MiB/s (149MB/s)(1424MiB/10001msec); 0 zone resets 00:32:53.324 slat (usec): min=19, max=1429, avg=63.90, stdev=39.54 00:32:53.324 clat (usec): min=49, max=1915, avg=231.20, stdev=137.78 00:32:53.324 lat (usec): min=77, 
max=1973, avg=295.10, stdev=161.83 00:32:53.324 clat percentiles (usec): 00:32:53.324 | 50.000th=[ 204], 99.000th=[ 742], 99.900th=[ 857], 99.990th=[ 938], 00:32:53.324 | 99.999th=[ 1139] 00:32:53.324 bw ( KiB/s): min=139328, max=191654, per=100.00%, avg=146226.84, stdev=3136.33, samples=76 00:32:53.324 iops : min=34832, max=47913, avg=36556.68, stdev=784.05, samples=76 00:32:53.324 trim: IOPS=36.5k, BW=142MiB/s (149MB/s)(1424MiB/10001msec); 0 zone resets 00:32:53.324 slat (usec): min=7, max=328, avg=17.30, stdev= 7.37 00:32:53.324 clat (usec): min=8, max=1973, avg=295.32, stdev=161.84 00:32:53.324 lat (usec): min=26, max=1988, avg=312.62, stdev=165.10 00:32:53.324 clat percentiles (usec): 00:32:53.324 | 50.000th=[ 258], 99.000th=[ 889], 99.900th=[ 1029], 99.990th=[ 1123], 00:32:53.324 | 99.999th=[ 1369] 00:32:53.324 bw ( KiB/s): min=139328, max=191654, per=100.00%, avg=146226.84, stdev=3136.33, samples=76 00:32:53.324 iops : min=34832, max=47913, avg=36556.68, stdev=784.05, samples=76 00:32:53.324 lat (usec) : 10=0.01%, 50=0.01%, 100=7.76%, 250=49.03%, 500=35.88% 00:32:53.324 lat (usec) : 750=5.45%, 1000=1.79% 00:32:53.324 lat (msec) : 2=0.09% 00:32:53.324 cpu : usr=99.62%, sys=0.00%, ctx=71, majf=0, minf=130 00:32:53.324 IO depths : 1=12.5%, 2=25.0%, 4=50.0%, 8=12.5%, 16=0.0%, 32=0.0%, >=64=0.0% 00:32:53.324 submit : 0=0.0%, 4=100.0%, 8=0.0%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:53.324 complete : 0=0.0%, 4=88.9%, 8=11.1%, 16=0.0%, 32=0.0%, 64=0.0%, >=64=0.0% 00:32:53.324 issued rwts: total=0,364610,364611,0 short=0,0,0,0 dropped=0,0,0,0 00:32:53.324 latency : target=0, window=0, percentile=100.00%, depth=8 00:32:53.324 00:32:53.324 Run status group 0 (all jobs): 00:32:53.324 WRITE: bw=142MiB/s (149MB/s), 142MiB/s-142MiB/s (149MB/s-149MB/s), io=1424MiB (1493MB), run=10001-10001msec 00:32:53.324 TRIM: bw=142MiB/s (149MB/s), 142MiB/s-142MiB/s (149MB/s-149MB/s), io=1424MiB (1493MB), run=10001-10001msec 00:32:53.324 00:32:53.324 real 0m13.386s 00:32:53.324 user 
0m50.163s 00:32:53.324 sys 0m0.426s 00:32:53.324 16:11:58 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@1125 -- # xtrace_disable 00:32:53.324 16:11:58 blockdev_crypto_qat.bdev_fio.bdev_fio_trim -- common/autotest_common.sh@10 -- # set +x 00:32:53.324 ************************************ 00:32:53.324 END TEST bdev_fio_trim 00:32:53.324 ************************************ 00:32:53.324 16:11:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@368 -- # rm -f 00:32:53.324 16:11:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@369 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.fio 00:32:53.324 16:11:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@370 -- # popd 00:32:53.324 /var/jenkins/workspace/crypto-phy-autotest/spdk 00:32:53.324 16:11:58 blockdev_crypto_qat.bdev_fio -- bdev/blockdev.sh@371 -- # trap - SIGINT SIGTERM EXIT 00:32:53.324 00:32:53.324 real 0m27.080s 00:32:53.324 user 1m40.332s 00:32:53.324 sys 0m0.975s 00:32:53.324 16:11:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@1125 -- # xtrace_disable 00:32:53.324 16:11:58 blockdev_crypto_qat.bdev_fio -- common/autotest_common.sh@10 -- # set +x 00:32:53.324 ************************************ 00:32:53.324 END TEST bdev_fio 00:32:53.324 ************************************ 00:32:53.324 16:11:58 blockdev_crypto_qat -- bdev/blockdev.sh@775 -- # trap cleanup SIGINT SIGTERM EXIT 00:32:53.324 16:11:58 blockdev_crypto_qat -- bdev/blockdev.sh@777 -- # run_test bdev_verify /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:53.324 16:11:58 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:32:53.324 16:11:58 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:32:53.324 16:11:58 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:32:53.324 
************************************ 00:32:53.324 START TEST bdev_verify 00:32:53.324 ************************************ 00:32:53.324 16:11:58 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w verify -t 5 -C -m 0x3 '' 00:32:53.324 [2024-06-10 16:11:58.302838] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:32:53.324 [2024-06-10 16:11:58.302889] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2882283 ] 00:32:53.324 [2024-06-10 16:11:58.400074] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:32:53.324 [2024-06-10 16:11:58.499980] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:32:53.324 [2024-06-10 16:11:58.499985] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:32:53.324 [2024-06-10 16:11:58.521438] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:32:53.324 [2024-06-10 16:11:58.529468] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:32:53.324 [2024-06-10 16:11:58.537488] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:32:53.324 [2024-06-10 16:11:58.640210] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:32:55.857 [2024-06-10 16:12:00.813531] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:32:55.857 [2024-06-10 16:12:00.813613] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:32:55.857 [2024-06-10 16:12:00.813626] 
vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:55.857 [2024-06-10 16:12:00.821542] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:32:55.857 [2024-06-10 16:12:00.821561] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:32:55.857 [2024-06-10 16:12:00.821571] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:55.857 [2024-06-10 16:12:00.829568] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:32:55.857 [2024-06-10 16:12:00.829583] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:32:55.857 [2024-06-10 16:12:00.829592] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:55.857 [2024-06-10 16:12:00.837589] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:32:55.857 [2024-06-10 16:12:00.837605] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:32:55.857 [2024-06-10 16:12:00.837614] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:32:55.857 Running I/O for 5 seconds... 
00:33:01.121 00:33:01.121 Latency(us) 00:33:01.121 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:01.121 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:01.121 Verification LBA range: start 0x0 length 0x1000 00:33:01.121 crypto_ram : 5.07 437.03 1.71 0.00 0.00 291519.91 2137.72 184749.10 00:33:01.121 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:01.121 Verification LBA range: start 0x1000 length 0x1000 00:33:01.121 crypto_ram : 5.07 437.15 1.71 0.00 0.00 291439.06 2324.97 184749.10 00:33:01.121 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:01.121 Verification LBA range: start 0x0 length 0x1000 00:33:01.121 crypto_ram1 : 5.07 438.28 1.71 0.00 0.00 289847.90 3744.91 167772.16 00:33:01.121 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:01.121 Verification LBA range: start 0x1000 length 0x1000 00:33:01.121 crypto_ram1 : 5.07 438.60 1.71 0.00 0.00 289711.21 2527.82 167772.16 00:33:01.121 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:01.121 Verification LBA range: start 0x0 length 0x1000 00:33:01.121 crypto_ram2 : 5.06 3417.67 13.35 0.00 0.00 37122.01 4899.60 32705.58 00:33:01.121 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:01.121 Verification LBA range: start 0x1000 length 0x1000 00:33:01.121 crypto_ram2 : 5.05 3418.72 13.35 0.00 0.00 37106.21 5430.13 32705.58 00:33:01.121 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 4096) 00:33:01.121 Verification LBA range: start 0x0 length 0x1000 00:33:01.121 crypto_ram3 : 5.06 3415.09 13.34 0.00 0.00 37021.31 6553.60 32705.58 00:33:01.121 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 4096) 00:33:01.121 Verification LBA range: start 0x1000 length 0x1000 00:33:01.121 crypto_ram3 : 5.06 3417.48 13.35 0.00 0.00 36999.43 4712.35 
32955.25 00:33:01.121 =================================================================================================================== 00:33:01.121 Total : 15420.02 60.23 0.00 0.00 65929.73 2137.72 184749.10 00:33:01.121 00:33:01.121 real 0m8.118s 00:33:01.121 user 0m15.500s 00:33:01.121 sys 0m0.294s 00:33:01.121 16:12:06 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@1125 -- # xtrace_disable 00:33:01.121 16:12:06 blockdev_crypto_qat.bdev_verify -- common/autotest_common.sh@10 -- # set +x 00:33:01.121 ************************************ 00:33:01.121 END TEST bdev_verify 00:33:01.121 ************************************ 00:33:01.121 16:12:06 blockdev_crypto_qat -- bdev/blockdev.sh@778 -- # run_test bdev_verify_big_io /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:01.121 16:12:06 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 16 -le 1 ']' 00:33:01.121 16:12:06 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:33:01.122 16:12:06 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:01.122 ************************************ 00:33:01.122 START TEST bdev_verify_big_io 00:33:01.122 ************************************ 00:33:01.122 16:12:06 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 65536 -w verify -t 5 -C -m 0x3 '' 00:33:01.122 [2024-06-10 16:12:06.491545] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:33:01.122 [2024-06-10 16:12:06.491596] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x3 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2883772 ] 00:33:01.122 [2024-06-10 16:12:06.588017] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 2 00:33:01.379 [2024-06-10 16:12:06.680439] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:33:01.379 [2024-06-10 16:12:06.680444] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:33:01.379 [2024-06-10 16:12:06.701842] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:01.379 [2024-06-10 16:12:06.709870] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:01.379 [2024-06-10 16:12:06.717887] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:01.379 [2024-06-10 16:12:06.817131] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:03.907 [2024-06-10 16:12:08.992404] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:03.907 [2024-06-10 16:12:08.992478] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:03.907 [2024-06-10 16:12:08.992491] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:03.907 [2024-06-10 16:12:09.000424] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:03.907 [2024-06-10 16:12:09.000442] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:03.907 [2024-06-10 16:12:09.000451] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:03.907 [2024-06-10 
16:12:09.008447] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:03.907 [2024-06-10 16:12:09.008464] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2 00:33:03.907 [2024-06-10 16:12:09.008473] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:03.907 [2024-06-10 16:12:09.016486] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2" 00:33:03.907 [2024-06-10 16:12:09.016504] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3 00:33:03.907 [2024-06-10 16:12:09.016512] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:03.907 Running I/O for 5 seconds... 00:33:04.473 [2024-06-10 16:12:09.943874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.944350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.944762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.945168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.945236] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.945280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.945323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.945382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.473 [2024-06-10 16:12:09.945868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.945886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.945898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.945918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.949637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.949686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.949727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.949768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.950210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.950255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.950296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.950337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.950787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.473 [2024-06-10 16:12:09.950803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.950816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.950829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.954390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.954451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.954493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.954534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.955003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.955048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.955091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.955133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.955596] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.955612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.473 [2024-06-10 16:12:09.955624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.955637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.959334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.959396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.959437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.959479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.960031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.960078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.960124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.960167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.960632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.960647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.473 [2024-06-10 16:12:09.960660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.473 [2024-06-10 16:12:09.960672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.964010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.964057] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.964097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.964137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.964621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.964667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.964709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.964751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.965210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.965226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.965239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.965252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.474 [2024-06-10 16:12:09.968785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.968832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.968877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.968917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.969415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.969460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.969502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.969543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.969977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.969993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.970005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.970017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.973473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.474 [2024-06-10 16:12:09.973523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.973564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.973606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.974097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.974142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.974183] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.974226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.974606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.974621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.974634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.974645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.978067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.978117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.474 [2024-06-10 16:12:09.978158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.978202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.978723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.978773] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.978819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.978883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.979291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.979307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.979319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.474 [2024-06-10 16:12:09.979330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.982846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.982893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.982934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.735 [2024-06-10 16:12:09.982982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.983474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.983533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.983575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.983622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.984016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.984032] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.984043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.984055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.987647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.987694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.987753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.987795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.735 [2024-06-10 16:12:09.988227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.988285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.988327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.988387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.988799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.988813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.988825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.988837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.992385] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.735 [2024-06-10 16:12:09.992437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.992501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.992554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.993067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.736 [2024-06-10 16:12:09.993125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.993178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.993220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.993644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.993659] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.993671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.993683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.997137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.997188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.997235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.997277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.997739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.997784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.736 [2024-06-10 16:12:09.997831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.997886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.998386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.998403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.998415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:09.998430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.001737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.001785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.001839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.001881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.002485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.002531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.002573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.736 [2024-06-10 16:12:10.002614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.003074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.003090] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.003105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.003117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.006417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.006476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.006517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.006560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.007018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.007063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.007105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.007147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.736 [2024-06-10 16:12:10.007604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.007620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.007633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.007646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.010767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.010817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.010858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.010915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.011467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.011513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.011555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.011598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.012062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.736 [2024-06-10 16:12:10.012078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.012092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.012105] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.015273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.015320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.015360] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.015402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.015905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.015951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.016001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.016044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.016504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.736 [2024-06-10 16:12:10.016520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.736 [2024-06-10 16:12:10.016533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [message repeated with timestamps through 2024-06-10 16:12:10.165620]
00:33:04.739 [2024-06-10 16:12:10.166047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.166342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.166357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.166368] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.166379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.169645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.171288] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.172930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.173938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.174872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.175303] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.175726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.177050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.739 [2024-06-10 16:12:10.177380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.177395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.177406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.177418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.181082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.182820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.184455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.184879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.185756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.186184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.186769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.188161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.188455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.739 [2024-06-10 16:12:10.188470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.188480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.188491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.192014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.193645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.194512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.739 [2024-06-10 16:12:10.194934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.195829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.196257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.197643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.198991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.199292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.199306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.740 [2024-06-10 16:12:10.199318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.199330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.203004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.204732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.205162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.205583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.206421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.206944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.208402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.210038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.210332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.210346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.210357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.740 [2024-06-10 16:12:10.210369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.213899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.214857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.215307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.215728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.216638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.218083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.219448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.221093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.221390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.221405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.221416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.221428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.740 [2024-06-10 16:12:10.225007] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.225434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.225859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.226293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.227448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.228819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.230447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.232078] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.232371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.232386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.232400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.232412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.235148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:04.740 [2024-06-10 16:12:10.235577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.236005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.236427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.238425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.239805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:04.740 [2024-06-10 16:12:10.241426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.000 [2024-06-10 16:12:10.243045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.000 [2024-06-10 16:12:10.243492] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.000 [2024-06-10 16:12:10.243507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.000 [2024-06-10 16:12:10.243519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.000 [2024-06-10 16:12:10.243532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.245712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.246145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.001 [2024-06-10 16:12:10.246567] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.246996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.248737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.250374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.252003] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.253204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.253502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.253522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.253534] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.253546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.255862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.256295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.256718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.001 [2024-06-10 16:12:10.257386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.259245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.260928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.262699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.263736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.264072] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.264106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.264118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.264130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.266569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.267001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.267421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.269130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.001 [2024-06-10 16:12:10.271115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.272743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.273550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.275149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.275447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.275461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.275472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.275484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.278193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.278618] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.279615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.281002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.282926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.001 [2024-06-10 16:12:10.284396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.285734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.287125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.287420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.287434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.287446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.287458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.290372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.290797] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.292422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.294209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.296138] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.296897] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.001 [2024-06-10 16:12:10.298232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.299856] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.300160] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.300176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.300187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.300199] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.303047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.304343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.305728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.307357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.308861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.310506] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.311973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.001 [2024-06-10 16:12:10.313573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.313868] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.313883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.313894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.313910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.317197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.318573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.320202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.321831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.323159] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.324553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.326190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.001 [2024-06-10 16:12:10.327813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.001 [2024-06-10 16:12:10.328164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:05.004 [... previous message repeated ~270 times, timestamps 16:12:10.328164 through 16:12:10.495470 ...]
00:33:05.004 [2024-06-10 16:12:10.495518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.495562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.495606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.495648] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.496073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.496089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.496102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.496114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.498588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.498635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.498681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.498722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.499191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.004 [2024-06-10 16:12:10.499240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.499283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.499345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.499399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.499791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.499807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.499818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.499830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.502365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.502412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.502457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.502498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.502971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.004 [2024-06-10 16:12:10.503047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.503104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.503149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.503190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.503609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.503625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.503638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.503651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.506233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.506280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.506323] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.506364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.506776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.004 [2024-06-10 16:12:10.506837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.004 [2024-06-10 16:12:10.506879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.005 [2024-06-10 16:12:10.506940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.005 [2024-06-10 16:12:10.506988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.005 [2024-06-10 16:12:10.507392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.005 [2024-06-10 16:12:10.507407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.005 [2024-06-10 16:12:10.507418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.005 [2024-06-10 16:12:10.507430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.510327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.510375] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.510429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.510511] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.510931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.265 [2024-06-10 16:12:10.511014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.511070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.511128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.511170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.511599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.511614] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.511626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.511638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.514237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.514284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.514330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.514370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.514735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.265 [2024-06-10 16:12:10.514795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.514838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.514880] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.514924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.515428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.515444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.515456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.515469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.517927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.517980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.518036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.518092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.518581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.265 [2024-06-10 16:12:10.518645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.518686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.518728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.518768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.519226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.519244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.519261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.519275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.521728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.521775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.521818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.521859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.522350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.265 [2024-06-10 16:12:10.522402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.522445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.522487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.522531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.523016] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.523033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.523045] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.523058] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.525715] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.525761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.525802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.525843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.526322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.265 [2024-06-10 16:12:10.526372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.526417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.526458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.526500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.265 [2024-06-10 16:12:10.526953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.526976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.526987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.527000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.529451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.529498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.529543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.529590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.530060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.266 [2024-06-10 16:12:10.530110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.530153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.530196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.530241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.530740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.530756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.530767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.530781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.533173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.533220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.533261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.533302] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.533732] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.266 [2024-06-10 16:12:10.533786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.533828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.533869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.533910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.534372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.534392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.534405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.534417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.536818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.536865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.536905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.536945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.537444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.266 [2024-06-10 16:12:10.537495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.537538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.537580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.537628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.538112] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.538128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.538139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.538151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.540745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.540790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.540830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.540870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.541345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.266 [2024-06-10 16:12:10.541395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.541437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.541480] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.541524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.541924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.541939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.541950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.541969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.544402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.544447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.544495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.544537] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.266 [2024-06-10 16:12:10.545005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.266 [2024-06-10 16:12:10.545055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:05.269 [... the same *ERROR* line repeated continuously from 16:12:10.545055 through 16:12:10.738493 ...]
00:33:05.269 [2024-06-10 16:12:10.738493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:05.269 [2024-06-10 16:12:10.739616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.269 [2024-06-10 16:12:10.740044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.269 [2024-06-10 16:12:10.740464] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.269 [2024-06-10 16:12:10.740882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.269 [2024-06-10 16:12:10.741363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.269 [2024-06-10 16:12:10.741378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.269 [2024-06-10 16:12:10.741390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.269 [2024-06-10 16:12:10.741402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.269 [2024-06-10 16:12:10.743801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.269 [2024-06-10 16:12:10.745197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.269 [2024-06-10 16:12:10.746822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.269 [2024-06-10 16:12:10.748447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.269 [2024-06-10 16:12:10.748738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.270 [2024-06-10 16:12:10.749179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.749599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.750022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.750444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.750778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.750793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.750805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.750817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.753865] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.755251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.756875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.758505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.758895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.270 [2024-06-10 16:12:10.759339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.759761] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.760186] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.760888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.761180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.761196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.761208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.761220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.764325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.765948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.767585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.768833] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.769289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.270 [2024-06-10 16:12:10.769721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.770150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.770571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.270 [2024-06-10 16:12:10.772284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.772622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.772638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.772649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.772661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.775860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.777498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.779188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.779619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.780098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.531 [2024-06-10 16:12:10.780530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.780951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.781954] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.783350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.783642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.783656] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.783667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.783680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.787026] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.788676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.789584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.790013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.790493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.531 [2024-06-10 16:12:10.790923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.791349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.792967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.794764] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.795059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.795079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.795091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.795102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.798603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.800277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.800700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.801125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.801554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.531 [2024-06-10 16:12:10.801986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.803188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.804557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.806172] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.806463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.806477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.806489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.806500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.810038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.810538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.810969] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.811390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.811861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.531 [2024-06-10 16:12:10.812994] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.814366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.816000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.817630] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.817973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.817988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.818000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.818014] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.820550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.820982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.821406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.821828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.822276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.531 [2024-06-10 16:12:10.823841] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.825566] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.827197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.828746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.829116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.829130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.829141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.829153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.831232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.831657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.832083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.832505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.832798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.531 [2024-06-10 16:12:10.834173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.835800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.837428] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.838238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.838530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.838544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.838556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.838568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.840714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.841145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.841568] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.842573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.842909] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.531 [2024-06-10 16:12:10.844578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.846196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.847699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.849028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.531 [2024-06-10 16:12:10.849355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.849369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.849380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.849392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.851685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.852119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.852542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.854240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.854532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.532 [2024-06-10 16:12:10.856174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.857802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.858585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.859972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.860263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.860277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.860289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.860300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.862774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.863204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.864353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.865730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.866025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.532 [2024-06-10 16:12:10.867668] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.869000] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.870497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.871873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.872176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.872191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.872206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.872218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.875037] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.875495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.877022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.878712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.532 [2024-06-10 16:12:10.879008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.532 [2024-06-10 16:12:10.880637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:05.534 [2024-06-10 16:12:10.987952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.988015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.988070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.988125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.988526] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.988541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.988553] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.988565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.991115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.991165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.991211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.991254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.991638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.534 [2024-06-10 16:12:10.991699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.991742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.991784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.991842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.992365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.992390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.992405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.992417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.994926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.994988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.995031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.995089] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.995486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.534 [2024-06-10 16:12:10.995557] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.995601] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.995642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.534 [2024-06-10 16:12:10.995683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:10.996139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:10.996156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:10.996169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:10.996181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:10.998377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:10.998423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:10.998466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:10.998507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:10.998860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.535 [2024-06-10 16:12:10.998918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:10.998967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:10.999009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:10.999049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:10.999411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:10.999426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:10.999437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:10.999450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.002249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.002307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.002348] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.002402] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.002838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.535 [2024-06-10 16:12:11.002898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.002959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.003024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.003065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.003457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.003472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.003483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.003495] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.005959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.006004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.006049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.006088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.006419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.535 [2024-06-10 16:12:11.006475] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.006517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.006559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.006599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.006886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.006901] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.006912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.006924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.008712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.008757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.008796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.008860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.009152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.535 [2024-06-10 16:12:11.009206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.009255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.009300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.009341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.009628] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.009643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.009655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.009666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.012243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.012294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.012335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.012376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.012664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.535 [2024-06-10 16:12:11.012716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.012757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.012807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.012850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.013141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.013156] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.013168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.013179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.014945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.015002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.015042] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.015082] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.015367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.535 [2024-06-10 16:12:11.015425] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.015466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.015507] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.015548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.015832] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.015846] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.015858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.015870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.018262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.018309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.018354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.018396] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.018859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.535 [2024-06-10 16:12:11.018915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.018963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.019010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.019051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.019339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.019354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.019365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.019380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.021121] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.021165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.021205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.021245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.021573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.535 [2024-06-10 16:12:11.021634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.021675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.021716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.021757] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.022048] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.022064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.022075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.022087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.024261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.024306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.024364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.024406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.024932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.535 [2024-06-10 16:12:11.024989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.025033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.025074] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.025116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.025543] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.025563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.535 [2024-06-10 16:12:11.025575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.536 [2024-06-10 16:12:11.025586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.536 [2024-06-10 16:12:11.027242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.536 [2024-06-10 16:12:11.027294] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.536 [2024-06-10 16:12:11.027335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.536 [2024-06-10 16:12:11.027376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.536 [2024-06-10 16:12:11.027753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.536 [2024-06-10 16:12:11.027806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.536 [2024-06-10 16:12:11.027847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.536 [2024-06-10 16:12:11.027888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.536 [2024-06-10 16:12:11.027928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.536 [2024-06-10 16:12:11.028261] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.536 [2024-06-10 16:12:11.028276] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.536 [2024-06-10 16:12:11.028287] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.536 [2024-06-10 16:12:11.028299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.536 [2024-06-10 16:12:11.030251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.536 [2024-06-10 16:12:11.030297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.536 [2024-06-10 16:12:11.030342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.536 [2024-06-10 16:12:11.030383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.536 [2024-06-10 16:12:11.030849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.536 [2024-06-10 16:12:11.030918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:05.798 [last message repeated continuously through 2024-06-10 16:12:11.233253]
00:33:05.798 [2024-06-10 16:12:11.234314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.798 [2024-06-10 16:12:11.235689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.798 [2024-06-10 16:12:11.237307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.798 [2024-06-10 16:12:11.238923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.798 [2024-06-10 16:12:11.239296] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.798 [2024-06-10 16:12:11.239312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.239324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.239337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.243215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.244599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.246229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.247853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.248250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.799 [2024-06-10 16:12:11.249995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.251562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.253203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.254914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.255407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.255422] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.255433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.255444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.259202] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.260835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.262472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.263638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.263934] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.799 [2024-06-10 16:12:11.265312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.266931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.268573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.269359] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.269864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.269882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.269895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.269908] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.273529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.275166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.276800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.277602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.277900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.799 [2024-06-10 16:12:11.279647] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.281439] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.283119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.283550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.284024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.284040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.284053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.284065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.287830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.289456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.290376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.292059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.292354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:05.799 [2024-06-10 16:12:11.293979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.295589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.296201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.296625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.297084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.297099] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.297111] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.297123] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.300970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.302620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:05.799 [2024-06-10 16:12:11.303515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.304890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.305190] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.063 [2024-06-10 16:12:11.306861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.308427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.308850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.309275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.309726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.309741] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.309753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.309766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.313390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.314200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.315716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.317403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.317699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.063 [2024-06-10 16:12:11.319339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.319770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.320197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.320622] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.321085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.321104] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.321117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.321129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.323791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.325188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.326707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.327914] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.328343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.063 [2024-06-10 16:12:11.328774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.063 [2024-06-10 16:12:11.329200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.329621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.331327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.331663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.331677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.331688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.331700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.334905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.336561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.338185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.338645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.339115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.064 [2024-06-10 16:12:11.339550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.339976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.340398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.340825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.341230] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.341246] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.341258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.341273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.344315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.344749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.345176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.345599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.346083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.064 [2024-06-10 16:12:11.346518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.346949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.347379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.347799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.348227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.348243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.348255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.348267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.351153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.351581] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.352008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.352431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.352816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.064 [2024-06-10 16:12:11.353256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.353680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.354103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.354524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.354899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.354915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.354926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.354938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.357759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.358195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.358627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.359069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.359536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.064 [2024-06-10 16:12:11.359970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.360391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.360821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.361260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.361758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.361774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.361786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.361800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.364658] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.365093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.365514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.365937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.366381] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.064 [2024-06-10 16:12:11.366817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.367251] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.367675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.368102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.368573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.368589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.368602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.368615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.371469] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.371899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.372328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.372754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.064 [2024-06-10 16:12:11.373168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.064 [2024-06-10 16:12:11.373604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:06.067 [... identical *ERROR* record repeated continuously from 16:12:11.373604 through 16:12:11.457311 ...]
00:33:06.067 [2024-06-10 16:12:11.457366] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.457408] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.457455] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.457496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.457787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.457801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.457812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.457824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.459598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.459644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.459688] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.459728] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.460023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.067 [2024-06-10 16:12:11.460080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.460122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.460162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.460203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.460493] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.460508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.460519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.460531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.462997] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.067 [2024-06-10 16:12:11.463043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.463084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.463125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.463536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.068 [2024-06-10 16:12:11.463593] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.463635] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.463680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.463721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.464092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.464108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.464119] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.464131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.465878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.465922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.465968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.466011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.466328] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.068 [2024-06-10 16:12:11.466383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.466426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.466467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.466510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.466802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.466816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.466827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.466839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.469135] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.469180] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.469221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.469260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.469722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.068 [2024-06-10 16:12:11.469772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.469815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.469857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.469898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.470212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.470227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.470238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.470254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.472006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.472052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.472101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.472142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.472431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.068 [2024-06-10 16:12:11.472486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.472529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.472575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.472616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.472907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.472922] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.472935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.472946] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.475047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.475094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.475137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.475181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.475598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.068 [2024-06-10 16:12:11.475664] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.475710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.475752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.475794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.476267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.476283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.476295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.476309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.477976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.478021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.478061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.478116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.478503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.068 [2024-06-10 16:12:11.478555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.478598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.478639] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.478679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.479010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.068 [2024-06-10 16:12:11.479025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.479036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.479047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.481006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.481053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.481095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.481136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.481594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.069 [2024-06-10 16:12:11.481642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.481701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.481756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.481798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.482322] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.482338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.482350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.482363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.484198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.484252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.484299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.484340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.484627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.069 [2024-06-10 16:12:11.484682] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.484726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.484767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.484813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.485109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.485124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.485136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.485148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.487028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.487087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.487128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.487169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.487657] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.069 [2024-06-10 16:12:11.487711] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.487755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.487796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.487838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.488226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.488241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.488253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.488264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.490256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.490301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.490340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.490380] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.490666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.069 [2024-06-10 16:12:11.490721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.490763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.490804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.490844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.491198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.491213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.491226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.491243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.492964] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.493009] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.493435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.493483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.493965] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.069 [2024-06-10 16:12:11.494017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.494060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.494102] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.494146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.494536] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.494551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.494564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.494576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.496621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.496667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.496707] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.498319] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.069 [2024-06-10 16:12:11.498734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.069 [2024-06-10 16:12:11.498791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:06.332 [identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 repeated continuously through 2024-06-10 16:12:11.746055; duplicate log lines elided]
00:33:06.332 [2024-06-10 16:12:11.747698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.749338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.750697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.751838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.752139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.752155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.752166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.752177] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.754893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.755329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.757116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.758743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.759043] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.332 [2024-06-10 16:12:11.760685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.761306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.763039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.764719] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.765021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.765036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.765047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.765059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.767888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.768329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.768777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.769211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.769674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.332 [2024-06-10 16:12:11.770116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.770541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.770970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.771397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.771787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.771804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.771816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.771829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.774748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.775197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.775624] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.776050] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.332 [2024-06-10 16:12:11.776513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.332 [2024-06-10 16:12:11.776945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.777378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.777804] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.778240] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.778646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.778661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.778678] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.778690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.781605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.782046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.782471] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.782913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.783343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.333 [2024-06-10 16:12:11.783779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.784211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.784644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.785073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.785474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.785490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.785503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.785515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.788427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.788861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.789301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.789726] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.790197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.333 [2024-06-10 16:12:11.790641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.791075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.791500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.791938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.792434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.792450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.792463] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.792477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.795383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.795813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.796249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.796673] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.797103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.333 [2024-06-10 16:12:11.797538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.797975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.798399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.798821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.799315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.799332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.799344] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.799358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.802369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.802803] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.803235] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.803663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.804091] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.333 [2024-06-10 16:12:11.804523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.804947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.805379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.805807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.806250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.806266] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.806277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.806289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.809279] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.809714] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.810146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.810828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.811349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.333 [2024-06-10 16:12:11.811778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.812215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.812649] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.814206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.814734] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.814751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.814763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.814777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.817552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.818071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.819552] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.819982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.820435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.333 [2024-06-10 16:12:11.822157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.822586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.823017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.823451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.823853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.823867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.333 [2024-06-10 16:12:11.823878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.334 [2024-06-10 16:12:11.823890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.334 [2024-06-10 16:12:11.826431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.334 [2024-06-10 16:12:11.826860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.334 [2024-06-10 16:12:11.826926] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.334 [2024-06-10 16:12:11.827361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.334 [2024-06-10 16:12:11.827759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.334 [2024-06-10 16:12:11.829255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.334 [2024-06-10 16:12:11.829680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.334 [2024-06-10 16:12:11.830460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.334 [2024-06-10 16:12:11.831685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.334 [2024-06-10 16:12:11.832175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.334 [2024-06-10 16:12:11.832192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.334 [2024-06-10 16:12:11.832213] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.334 [2024-06-10 16:12:11.832226] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.334 [2024-06-10 16:12:11.834812] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.334 [2024-06-10 16:12:11.835775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.334 [2024-06-10 16:12:11.836816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.334 [2024-06-10 16:12:11.836866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.334 [2024-06-10 16:12:11.837337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.594 [2024-06-10 16:12:11.837770] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.838208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.839373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.840212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.840681] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.840696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.840709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.840725] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.842989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.843039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.843079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.843141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.843580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.594 [2024-06-10 16:12:11.843642] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.843704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.843751] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.843791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.844086] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.844101] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.844113] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.844127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.846679] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.846727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.846769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.846815] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.594 [2024-06-10 16:12:11.847242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.594 [2024-06-10 16:12:11.847307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:06.596 [... previous *ERROR* message repeated for timestamps 16:12:11.847361 through 16:12:11.934320; duplicates elided ...]
00:33:06.596 [2024-06-10 16:12:11.934378] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:11.934419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:11.934460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:11.934512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:11.934805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:11.934821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:11.934831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:11.934844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:11.939145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:11.939197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:11.939690] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:11.939733] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:11.939776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.596 [2024-06-10 16:12:11.940109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:11.940125] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:11.940136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.018274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.018351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.018744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.022979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.023049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.024747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.025738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.027126] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.027416] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.027481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.596 [2024-06-10 16:12:12.029075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.029130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.030427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.030483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.030874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.030932] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.031330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.031802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.031817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.031829] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.031842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.035346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.036166] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.596 [2024-06-10 16:12:12.037546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.039164] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.039452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.041193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.041616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.042040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.042460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.042924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.042940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.042952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.042971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.045775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.047573] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.596 [2024-06-10 16:12:12.049187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.050879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.051175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.051884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.052310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.052729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.053151] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.053576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.053590] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.053606] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.053617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.056448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.057824] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.596 [2024-06-10 16:12:12.059456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.061083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.061437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.061874] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.062300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.062736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.063161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.063452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.063466] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.063478] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.063490] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.066652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.068284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.596 [2024-06-10 16:12:12.069912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.071271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.071735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.072168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.072589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.073013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.074487] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.074825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.074839] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.074850] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.074862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.596 [2024-06-10 16:12:12.078129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.079742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.597 [2024-06-10 16:12:12.081390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.081822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.082301] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.082731] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.083157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.084077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.085472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.085762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.085777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.085789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.085801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.089449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.090938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.597 [2024-06-10 16:12:12.091365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.091786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.092263] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.092692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.094080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.095447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.097083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.097373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.097388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.097399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.097411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.100857] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.597 [2024-06-10 16:12:12.101298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.597 [2024-06-10 16:12:12.101720] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.102145] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.102604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.103419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.104782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.106417] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.108046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.108337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.108352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.108364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.108377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.111073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.111520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.858 [2024-06-10 16:12:12.111941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.112365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.112822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.114571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.116262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.118025] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.119795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.120176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.120192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.120204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.120215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.122389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.122816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.858 [2024-06-10 16:12:12.123241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.123662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.123950] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.125329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.126952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.128583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.129388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.129677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.129692] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.858 [2024-06-10 16:12:12.129703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.129722] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.131991] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.132418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.859 [2024-06-10 16:12:12.132837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.134148] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.134474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.136124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.137767] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.138872] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.140616] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.140952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.140972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.140984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.140996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.143497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.143923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.859 [2024-06-10 16:12:12.144813] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.146192] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.146481] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.148171] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.149743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.151002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.152383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.152672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.152686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.152697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.152709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.155318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.859 [2024-06-10 16:12:12.155747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.862 [2024-06-10 16:12:12.320051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.320094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.320565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.321015] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.321061] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.321710] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.321756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.322067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.322083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.322095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.322107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.323871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.323915] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.862 [2024-06-10 16:12:12.323962] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.324002] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.324331] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.324386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.324432] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.324473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.324515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.324802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.324817] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.324828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.324840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.327304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.327351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.862 [2024-06-10 16:12:12.327391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.327433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.327903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.327952] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.328005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.328046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.328088] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.328447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.328462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.328473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.328485] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.330212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.330259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.862 [2024-06-10 16:12:12.330300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.330342] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.330632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.330683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.330724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.330772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.330814] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.331109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.331129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.331140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.331152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.333291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.333339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.862 [2024-06-10 16:12:12.333383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.333426] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.333859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.333906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.333949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.333996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.334038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.334498] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.334515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.334527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.334539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.336233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.336278] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.862 [2024-06-10 16:12:12.336318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.336358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.336806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.336875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.336918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.336967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.337008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.337361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.337377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.337388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.337401] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.339245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.339290] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.862 [2024-06-10 16:12:12.339334] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.339377] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.862 [2024-06-10 16:12:12.339844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.863 [2024-06-10 16:12:12.339893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.863 [2024-06-10 16:12:12.339937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.863 [2024-06-10 16:12:12.339987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.340039] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.340504] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.340520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.340533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.340546] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.342412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.342457] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.864 [2024-06-10 16:12:12.342497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.342541] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.342831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.342882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.342924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.342980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.343022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.343373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.343388] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.343400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.343412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.345149] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.345204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.864 [2024-06-10 16:12:12.345249] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.345307] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.345784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.345834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.345882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.345924] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.345973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.346427] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.346442] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.346453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.346467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.348676] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.348737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.864 [2024-06-10 16:12:12.348777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.348818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.349108] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.349170] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.349212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.349253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.349293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.349605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.349620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.349638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.349650] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.351326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.351371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.864 [2024-06-10 16:12:12.351411] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.351451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.351894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.351970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.352013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.352055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.352096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.352561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.352577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.352594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.352608] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.354888] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.354933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.864 [2024-06-10 16:12:12.354978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.355018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.355306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.355363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.355404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.355445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.355486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.355772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.355786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.355798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.355810] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.357592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.357637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:06.864 [2024-06-10 16:12:12.358038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.358083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.358133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.358505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.358520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:06.864 [2024-06-10 16:12:12.358532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.124 [2024-06-10 16:12:12.509260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.124 [2024-06-10 16:12:12.509339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.124 [2024-06-10 16:12:12.509736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.124 [2024-06-10 16:12:12.509788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.124 [2024-06-10 16:12:12.510187] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.124 [2024-06-10 16:12:12.510605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.124 [2024-06-10 16:12:12.512615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.124 [2024-06-10 16:12:12.514224] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
[... identical "Failed to get src_mbufs!" errors from accel_dpdk_cryptodev.c:468 repeated continuously from 16:12:12.514224 through 16:12:12.688607 (log timestamps 00:33:07.124-00:33:07.389) ...]
00:33:07.389 [2024-06-10 16:12:12.688621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.688633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.688644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.691130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.691554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.691985] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.692040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.692556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.692995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.693419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.693467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.693885] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.694367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.389 [2024-06-10 16:12:12.694382] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.694394] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.694407] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.697161] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.697586] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.697633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.698062] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.698459] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.698890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.698939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.699361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.699794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.700293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.389 [2024-06-10 16:12:12.700308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.700324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.700337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.703076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.703501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.703545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.703976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.704472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.704531] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.704977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.705027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.705446] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.705871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.389 [2024-06-10 16:12:12.705886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.705898] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.705910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.708289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.708712] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.708756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.709179] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.709636] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.709687] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.710578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.710627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.711544] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.711870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.389 [2024-06-10 16:12:12.711884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.389 [2024-06-10 16:12:12.711896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.711907] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.714358] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.714786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.714837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.715291] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.715774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.715835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.716260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.716305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.716723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.717200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.390 [2024-06-10 16:12:12.717216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.717228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.717242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.719587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.720018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.720063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.720482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.720847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.720906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.721335] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.721395] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.723155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.723660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.390 [2024-06-10 16:12:12.723675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.723686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.723698] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.726109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.727774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.727822] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.729486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.729779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.729831] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.731587] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.731641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.732819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.733181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.390 [2024-06-10 16:12:12.733196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.733208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.733219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.735093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.735519] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.735564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.735995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.736460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.736510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.738229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.738282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.739986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.740280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.390 [2024-06-10 16:12:12.740295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.740306] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.740318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.742107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.743748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.743795] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.745176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.745558] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.745612] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.746548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.746594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.747017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.747314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.390 [2024-06-10 16:12:12.747329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.747340] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.747352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.749470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.751107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.751155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.752775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.753262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.753327] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.754875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.754925] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.756548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.756840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.390 [2024-06-10 16:12:12.756854] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.756866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.756878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.759153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.759578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.759621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.761051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.761392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.761451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.763084] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.763131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.764752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.765128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.390 [2024-06-10 16:12:12.765143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.765155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.390 [2024-06-10 16:12:12.765168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.766821] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.766867] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.766906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.766961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.767351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.767404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.768667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.768713] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.768752] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.769209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.391 [2024-06-10 16:12:12.769228] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.769241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.769253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.771400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.771445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.771484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.771524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.771844] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.771900] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.771941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.771987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.772027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.391 [2024-06-10 16:12:12.772314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.391 [2024-06-10 16:12:12.772329] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:07.391 [... same *ERROR* line repeated continuously from 16:12:12.772340 through 16:12:12.925080 ...]
00:33:07.654 [2024-06-10 16:12:12.926716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:07.654 [2024-06-10 16:12:12.926765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.654 [2024-06-10 16:12:12.927060] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.927080] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.927092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.931638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.931694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.933320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.933367] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.933762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.933776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.935583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.935638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.937453] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.655 [2024-06-10 16:12:12.937503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.937789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.937805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.937816] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.940643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.940694] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.941755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.941800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.942280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.942297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.943415] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.943462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.944842] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.655 [2024-06-10 16:12:12.944889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.945182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.945198] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.945209] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.950819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.951255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.951677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.952106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.952518] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.952533] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.953905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.953953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.955559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.655 [2024-06-10 16:12:12.957184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.957474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.957489] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.957501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.960522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.961935] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.962522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.962942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.963237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.963252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.963945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.964400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.966132] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.655 [2024-06-10 16:12:12.967819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.968118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.968142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.968154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.973669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.974107] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.974528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.974951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.975257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.975273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.976644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.978281] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.979905] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.655 [2024-06-10 16:12:12.980708] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.981005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.981020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.981031] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.984522] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.984949] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.985517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.986948] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.987477] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.987494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.988109] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.989483] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.991117] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.655 [2024-06-10 16:12:12.992742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.993038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.993059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.993071] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.998267] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.998696] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:12.999124] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:13.000771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:13.001070] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:13.001085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:13.002703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:13.004326] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:13.005118] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.655 [2024-06-10 16:12:13.006494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:13.006785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.655 [2024-06-10 16:12:13.006800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.006811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.009036] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.009917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.011046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.011470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.011837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.011852] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.013229] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.014743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.016065] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.656 [2024-06-10 16:12:13.017583] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.017913] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.017928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.017939] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.024225] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.025884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.027666] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.029333] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.029743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.029758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.031153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.032759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.034405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.656 [2024-06-10 16:12:13.035651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.035961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.035976] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.035987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.038520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.039896] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.041275] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.042902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.043201] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.043216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.044262] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.046053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.047684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.656 [2024-06-10 16:12:13.049424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.049716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.049730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.049742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.054609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.056254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.057521] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.058748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.059094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.059110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.060785] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.062424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.063571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.656 [2024-06-10 16:12:13.064786] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.065168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.065184] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.065195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.067968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.068399] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.068826] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.069252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.069758] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.069774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.070210] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.070637] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.071211] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.656 [2024-06-10 16:12:13.072634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.073122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.073140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.073152] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.077021] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.077452] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.077873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.077921] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.078299] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.078315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.078748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.080047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.656 [2024-06-10 16:12:13.080748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.656 [2024-06-10 16:12:13.081175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:07.922 [... same *ERROR* message repeated for entries timestamped 2024-06-10 16:12:13.081473 through 16:12:13.240209 ...]
00:33:07.922 [2024-06-10 16:12:13.241574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.241864] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.241879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.241890] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.246670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.246724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.246765] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.246805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.247237] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.247254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.247309] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.247730] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.247774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.922 [2024-06-10 16:12:13.247820] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.248115] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.248131] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.248142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.249892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.249937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.249982] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.250023] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.250404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.250419] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.250472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.250514] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.250555] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.922 [2024-06-10 16:12:13.250595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.250884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.250899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.250910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.255316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.255365] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.255405] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.255447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.255931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.255947] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.256012] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.256055] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.256097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.922 [2024-06-10 16:12:13.256142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.256579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.256594] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.256605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.258260] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.258313] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.258354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.258397] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.258729] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.258743] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.258791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.258834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.258875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.922 [2024-06-10 16:12:13.258916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.259244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.259259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.259271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.264923] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.264987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.265028] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.265068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.265355] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.265371] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.265424] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.265467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.265509] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.922 [2024-06-10 16:12:13.265550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.266017] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.266035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.266047] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.267802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.267848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.267887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.267927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.268282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.268298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.268349] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.268393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.268434] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.922 [2024-06-10 16:12:13.268479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.268768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.268783] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.268794] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.272975] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.273027] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.922 [2024-06-10 16:12:13.273068] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.273110] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.273484] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.273499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.273548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.273591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.273631] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.923 [2024-06-10 16:12:13.273672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.274075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.274092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.274103] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.276087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.276133] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.276181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.276222] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.276512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.276527] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.276578] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.276619] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.276660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.923 [2024-06-10 16:12:13.276701] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.277114] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.277129] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.277140] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.282083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.282134] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.282175] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.282215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.282689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.282705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.282755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.282798] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.282840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.923 [2024-06-10 16:12:13.282882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.283181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.283197] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.283208] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.285376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.285421] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.285461] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.285501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.285790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.285805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.285860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.285902] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.285968] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.923 [2024-06-10 16:12:13.286013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.286304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.286318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.286330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.290881] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.290931] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.290978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.291450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.291929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.291945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.292001] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.292044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.292097] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.923 [2024-06-10 16:12:13.292141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.292435] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.292450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.292462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.294759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.294807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.296540] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.296588] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.296877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.296892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.296944] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.296995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.298227] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.923 [2024-06-10 16:12:13.298274] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.298621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.298640] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.298651] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.304169] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.305882] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.305940] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.305990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.306517] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.306532] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.306579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.307205] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:07.923 [2024-06-10 16:12:13.307253] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:07.923 [2024-06-10 16:12:13.307293] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:08.187 [2024-06-10 16:12:13.510929] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:08.187 [2024-06-10 16:12:13.512561] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.512903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.512918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.512930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.520282] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.521705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.523343] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.524970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.525379] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.525393] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.527013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.528791] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.530472] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.187 [2024-06-10 16:12:13.532073] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.532482] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.532497] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.532508] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.538443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.540063] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.541672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.542792] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.543100] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.543116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.544494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.546106] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.547724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.187 [2024-06-10 16:12:13.548450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.548745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.548759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.548771] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.553269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.554899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.555361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.556988] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.557284] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.557298] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.559044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.559918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.561053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.187 [2024-06-10 16:12:13.561473] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.561836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.561851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.561862] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.187 [2024-06-10 16:12:13.567680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.569011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.570646] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.572277] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.572574] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.572589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.573414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.574580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.575004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.188 [2024-06-10 16:12:13.576167] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.576530] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.576545] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.576556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.581884] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.583269] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.584895] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.586523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.587005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.587019] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.588838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.589271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.589782] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.188 [2024-06-10 16:12:13.591270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.591759] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.591775] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.591788] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.597474] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.599128] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.600755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.602120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.602505] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.602520] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.603967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.604389] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.605247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.188 [2024-06-10 16:12:13.606373] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.606828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.606843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.606855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.610989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.611423] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.611849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.612285] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.612584] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.612599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.613336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.613760] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.615346] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.188 [2024-06-10 16:12:13.615776] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.616254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.616270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.616283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.619836] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.620271] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.620704] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.621137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.621430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.621445] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.621945] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.622370] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.624173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.188 [2024-06-10 16:12:13.624595] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.625067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.625083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.625096] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.628312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.628744] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.629176] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.629245] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.629662] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.629683] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.631384] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.631806] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.632448] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.188 [2024-06-10 16:12:13.633811] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.634310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.634325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.634337] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.639906] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.640338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.640387] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.640807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.641292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.641308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.642059] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.643315] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.643361] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.188 [2024-06-10 16:12:13.643778] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.644139] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.644154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.644165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.648195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.188 [2024-06-10 16:12:13.648250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.649500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.650241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.650703] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.650718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.651154] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.651206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.651627] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.189 [2024-06-10 16:12:13.653241] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.653753] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.653768] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.653781] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.658256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.658312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.658739] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.658784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.659254] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.659270] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.659324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.660966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.661013] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.189 [2024-06-10 16:12:13.661430] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.661851] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.661866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.661878] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.665087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.665143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.665564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.665620] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.666130] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.666146] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.666912] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.666967] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.189 [2024-06-10 16:12:13.668049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.189 [2024-06-10 16:12:13.668093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:08.453 [... previous *ERROR* message repeated continuously from 16:12:13.668557 through 16:12:13.847776; entries are identical except for timestamps ...]
00:33:08.453 [2024-06-10 16:12:13.847818] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.848158] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.848174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.848185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.853899] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.853951] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.853996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.854046] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.854338] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.854353] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.854404] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.854447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.854488] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.453 [2024-06-10 16:12:13.854528] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.854990] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.855006] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.855018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.859689] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.859738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.859789] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.859830] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.860127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.860142] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.860191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.860232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.860280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.453 [2024-06-10 16:12:13.860325] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.860617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.860632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.860643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.864700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.864749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.864790] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.864834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.865308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.865324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.865369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.865412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.865458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.453 [2024-06-10 16:12:13.865502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.865793] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.865807] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.865819] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.869746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.869805] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.869853] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.869893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.870189] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.870204] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.453 [2024-06-10 16:12:13.870258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.870300] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.870341] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.454 [2024-06-10 16:12:13.870383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.870745] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.870762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.870774] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.876621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.876675] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.876716] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.876755] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.877053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.877069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.877127] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.877174] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.877214] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.454 [2024-06-10 16:12:13.877255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.877550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.877565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.877577] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.882386] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.882436] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.882476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.882516] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.882917] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.882933] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.882992] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.883034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.883075] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.454 [2024-06-10 16:12:13.883116] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.883414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.883429] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.883441] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.886223] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.886272] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.886312] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.887938] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.888239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.888255] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.888310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.888351] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.888392] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.454 [2024-06-10 16:12:13.888433] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.888843] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.888859] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.888870] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.893674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.893723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.894150] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.894194] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.894500] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.894515] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.894563] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.894604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.895195] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.454 [2024-06-10 16:12:13.895242] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.895709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.895724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.895738] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.900959] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.902585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.902632] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.902672] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.902970] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.902986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.903040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.904440] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.904501] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.454 [2024-06-10 16:12:13.904542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.904840] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.904855] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.904866] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.907972] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.909512] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.909559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.911339] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.911638] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.911652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.912479] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.912529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.913911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.454 [2024-06-10 16:12:13.913966] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.914258] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.914273] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.454 [2024-06-10 16:12:13.914283] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.918599] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.919232] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.919280] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.919699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.919996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.920011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.920064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.921597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.921643] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.455 [2024-06-10 16:12:13.923412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.923709] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.923723] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.923735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.928155] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.928206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.928247] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.928727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.929196] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.929212] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.929259] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.930762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.930808] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.455 [2024-06-10 16:12:13.930858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.931345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.931363] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.931376] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.935554] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.935604] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.935644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.937010] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.937305] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.937320] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.937374] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.939004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.455 [2024-06-10 16:12:13.939053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.455 [2024-06-10 16:12:13.939093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [... message above repeated for every allocation attempt from 16:12:13.939093 through 16:12:14.135933; only the timestamps differ ...]
00:33:08.719 [2024-06-10 16:12:14.136357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.136834] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.136849] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.136861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.140165] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.140597] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.141022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.141462] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.141879] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.141894] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.142332] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.142754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.143200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.719 [2024-06-10 16:12:14.143625] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.144049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.144064] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.144076] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.146993] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.147431] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.147861] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.148289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.148721] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.148737] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.149168] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.149602] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.150035] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.719 [2024-06-10 16:12:14.150467] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.150941] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.150963] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.150979] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.153769] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.154200] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.154621] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.719 [2024-06-10 16:12:14.155044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.155444] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.155460] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.155891] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.156324] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.156746] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.720 [2024-06-10 16:12:14.157173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.157644] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.157660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.157674] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.160503] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.160928] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.161357] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.161800] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.162292] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.162308] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.162736] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.163162] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.163585] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.720 [2024-06-10 16:12:14.164020] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.164418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.164437] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.164449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.167973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.168406] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.168828] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.168875] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.169314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.169330] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.169756] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.170185] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.170633] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.720 [2024-06-10 16:12:14.171079] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.171560] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.171576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.171589] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.174418] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.174847] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.174893] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.175318] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.175784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.175799] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.176243] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.176669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.176718] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.720 [2024-06-10 16:12:14.177143] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.177626] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.177641] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.177654] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.180499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.180550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.180971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.181412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.181903] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.181918] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.182350] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.182400] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.182825] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.720 [2024-06-10 16:12:14.183257] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.183727] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.183742] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.183754] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.186699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.186750] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.187173] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.187216] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.187685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.187700] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.187749] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.188181] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.188239] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.720 [2024-06-10 16:12:14.188661] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.189120] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.189136] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.189147] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.191998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.192051] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.192470] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.192513] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.192981] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.192998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.193420] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.193465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.193886] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.720 [2024-06-10 16:12:14.193930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.194356] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.194372] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.194383] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.197215] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.198580] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.199207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.720 [2024-06-10 16:12:14.199256] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.199550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.199564] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.200193] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.200244] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.200663] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.721 [2024-06-10 16:12:14.201087] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.201562] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.201579] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.201591] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.204548] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.204977] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.205398] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.205447] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.205845] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.205860] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.206297] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.206345] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.206762] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.721 [2024-06-10 16:12:14.207188] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.207652] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.207667] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.207685] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.210414] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.210837] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.211264] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.211321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.211748] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.211763] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.212203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.212252] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.212669] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.721 [2024-06-10 16:12:14.213093] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.213550] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.213565] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.213576] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.215986] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.217369] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.218996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.219044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.219336] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.219352] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.220987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.221033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [2024-06-10 16:12:14.221450] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.721 [2024-06-10 16:12:14.221869] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.721 [... last message repeated through 16:12:14.354449 ...] 00:33:08.987 [2024-06-10 16:12:14.354449] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.987 [2024-06-10 16:12:14.354494] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.354858] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.354873] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.354883] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.356559] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.356605] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.356645] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.356686] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.356980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.356995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.357053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.357094] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.357141] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.987 [2024-06-10 16:12:14.357182] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.357476] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.357491] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.357502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.359502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.359551] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.359592] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.359634] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.360034] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.360049] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.360095] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.360137] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.360178] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.987 [2024-06-10 16:12:14.360220] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.360677] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.360693] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.360705] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.362412] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.362456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.362496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.363390] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.363684] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.363699] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.363747] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.363796] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.363838] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.987 [2024-06-10 16:12:14.363887] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.364191] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.364207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.364218] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.366410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.366456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.366877] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.366920] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.367391] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.367410] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.367456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.367499] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.368910] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.987 [2024-06-10 16:12:14.368961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.987 [2024-06-10 16:12:14.369289] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.369304] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.369316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.371053] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.372529] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.372575] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.372623] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.372916] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.372930] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.372984] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.374680] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.374735] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.988 [2024-06-10 16:12:14.374777] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.375206] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.375221] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.375233] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.377615] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.378989] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.379038] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.380660] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.380953] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.380973] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.382157] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.382203] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.383961] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.988 [2024-06-10 16:12:14.384004] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.384295] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.384310] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.384321] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.386314] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.386740] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.386784] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.387207] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.387603] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.387617] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.387671] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.389044] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.389092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.988 [2024-06-10 16:12:14.390697] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.390996] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.391011] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.391022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.392827] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.392871] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.392911] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.394539] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.394927] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.394942] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.395022] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.395451] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.395496] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.988 [2024-06-10 16:12:14.395538] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.395971] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.395987] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.395998] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.398077] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.398122] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.398163] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.399779] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.400083] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.400098] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.400153] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.401523] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.401569] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.988 [2024-06-10 16:12:14.401609] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.401980] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.401995] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.402005] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.403978] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.404024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.404066] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.404486] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.405018] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.405033] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.405081] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.405802] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.405848] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.988 [2024-06-10 16:12:14.405889] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.406219] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.406238] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.406250] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.408008] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.408052] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.408092] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.409468] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.988 [2024-06-10 16:12:14.409766] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.409780] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.409835] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.411456] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.411502] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.989 [2024-06-10 16:12:14.411542] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.412040] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.412056] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.412067] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.414892] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.414937] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.416316] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.416364] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.416655] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.416670] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.416724] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.418354] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.418403] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.989 [2024-06-10 16:12:14.418443] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.418772] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.418787] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.418801] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.420465] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.421458] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.421524] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.421571] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.422069] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.422085] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.422510] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.422556] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 00:33:08.989 [2024-06-10 16:12:14.422598] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs! 
00:33:08.989 [2024-06-10 16:12:14.423024] accel_dpdk_cryptodev.c: 468:accel_dpdk_cryptodev_task_alloc_resources: *ERROR*: Failed to get src_mbufs!
00:33:09.253 [... previous message repeated for timestamps 2024-06-10 16:12:14.423433 through 16:12:14.564679 ...]
00:33:10.223
00:33:10.223 Latency(us)
00:33:10.223 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:33:10.223 Job: crypto_ram (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:10.223 Verification LBA range: start 0x0 length 0x100
00:33:10.223 crypto_ram : 6.31 40.24 2.51 0.00 0.00 3075182.07 319566.02 2812180.97
00:33:10.223 Job: crypto_ram (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:10.223 Verification LBA range: start 0x100 length 0x100
00:33:10.223 crypto_ram : 6.28 40.74 2.55 0.00 0.00 3043702.49 337541.61 2684354.56
00:33:10.223 Job: crypto_ram1 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:10.223 Verification LBA range: start 0x0 length 0x100
00:33:10.223 crypto_ram1 : 6.32 40.53 2.53 0.00 0.00 2961177.84 313574.16 2604463.06
00:33:10.223 Job: crypto_ram1 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:10.223 Verification LBA range: start 0x100 length 0x100
00:33:10.223 crypto_ram1 : 6.29 40.73 2.55 0.00 0.00 2942211.41 337541.61 2476636.65
00:33:10.223 Job: crypto_ram2 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:10.223 Verification LBA range: start 0x0 length 0x100
00:33:10.223 crypto_ram2 : 5.64 238.32 14.89 0.00 0.00 476763.51 51679.82 707039.82
00:33:10.223 Job: crypto_ram2 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:10.223 Verification LBA range: start 0x100 length 0x100
00:33:10.223 crypto_ram2 : 5.65 249.06 15.57 0.00 0.00 457875.53 92374.55 691061.52
00:33:10.223 Job: crypto_ram3 (Core Mask 0x1, workload: verify, depth: 128, IO size: 65536)
00:33:10.223 Verification LBA range: start 0x0 length 0x100
00:33:10.223 crypto_ram3 : 5.75 247.89 15.49 0.00 0.00 444587.55 23218.47 369498.21
00:33:10.223 Job: crypto_ram3 (Core Mask 0x2, workload: verify, depth: 128, IO size: 65536)
00:33:10.223 Verification LBA range: start 0x100 length 0x100
00:33:10.223 crypto_ram3 : 5.81 261.46 16.34 0.00 0.00 422125.55 71403.03 535273.08
00:33:10.223 ===================================================================================================================
00:33:10.223 Total : 1158.96 72.44 0.00 0.00 838252.73 23218.47 2812180.97
00:33:10.481
00:33:10.481 real 0m9.365s
00:33:10.481 user 0m17.908s
00:33:10.481 sys 0m0.381s
00:33:10.481 16:12:15 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@1125 -- # xtrace_disable
00:33:10.481 16:12:15 blockdev_crypto_qat.bdev_verify_big_io -- common/autotest_common.sh@10 -- # set +x
00:33:10.481 ************************************
00:33:10.481 END TEST bdev_verify_big_io
00:33:10.481 ************************************
00:33:10.482 16:12:15 blockdev_crypto_qat -- bdev/blockdev.sh@779 -- # run_test bdev_write_zeroes /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:10.482 16:12:15 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']'
00:33:10.482 16:12:15 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable
00:33:10.482 16:12:15 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x
00:33:10.482 ************************************
00:33:10.482 START TEST bdev_write_zeroes
00:33:10.482 ************************************
00:33:10.482 16:12:15 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json -q 128 -o 4096 -w write_zeroes -t 1 ''
00:33:10.482 [2024-06-10 16:12:15.919530] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization...
00:33:10.482 [2024-06-10 16:12:15.919583] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2885554 ] 00:33:10.740 [2024-06-10 16:12:16.017837] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:10.740 [2024-06-10 16:12:16.108389] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:33:10.740 [2024-06-10 16:12:16.129684] accel_dpdk_cryptodev.c: 223:accel_dpdk_cryptodev_set_driver: *NOTICE*: Using driver crypto_qat 00:33:10.740 [2024-06-10 16:12:16.137720] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation encrypt will be assigned to module dpdk_cryptodev 00:33:10.740 [2024-06-10 16:12:16.145732] accel_rpc.c: 167:rpc_accel_assign_opc: *NOTICE*: Operation decrypt will be assigned to module dpdk_cryptodev 00:33:10.999 [2024-06-10 16:12:16.252345] accel_dpdk_cryptodev.c:1178:accel_dpdk_cryptodev_init: *NOTICE*: Found crypto devices: 96 00:33:13.529 [2024-06-10 16:12:18.427081] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc" 00:33:13.529 [2024-06-10 16:12:18.427151] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc0 00:33:13.529 [2024-06-10 16:12:18.427163] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:13.529 [2024-06-10 16:12:18.435101] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts" 00:33:13.529 [2024-06-10 16:12:18.435119] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc1 00:33:13.529 [2024-06-10 16:12:18.435128] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival 00:33:13.529 [2024-06-10 16:12:18.443122] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_cbc2" 00:33:13.529 
[2024-06-10 16:12:18.443138] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc2
00:33:13.529 [2024-06-10 16:12:18.443147] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:13.529 [2024-06-10 16:12:18.451142] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "test_dek_qat_xts2"
00:33:13.529 [2024-06-10 16:12:18.451158] bdev.c:8114:bdev_open_ext: *NOTICE*: Currently unable to find bdev with name: Malloc3
00:33:13.529 [2024-06-10 16:12:18.451166] vbdev_crypto.c: 617:create_crypto_disk: *NOTICE*: vbdev creation deferred pending base bdev arrival
00:33:13.529 Running I/O for 1 seconds...
00:33:14.095
00:33:14.095 Latency(us)
00:33:14.095 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max
00:33:14.095 Job: crypto_ram (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:14.095 crypto_ram : 1.03 1839.77 7.19 0.00 0.00 68964.94 6116.69 83886.08
00:33:14.095 Job: crypto_ram1 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:14.095 crypto_ram1 : 1.03 1852.90 7.24 0.00 0.00 68120.05 6085.49 77894.22
00:33:14.095 Job: crypto_ram2 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:14.095 crypto_ram2 : 1.02 14112.04 55.13 0.00 0.00 8917.57 2683.86 11921.31
00:33:14.095 Job: crypto_ram3 (Core Mask 0x1, workload: write_zeroes, depth: 128, IO size: 4096)
00:33:14.095 crypto_ram3 : 1.02 14144.81 55.25 0.00 0.00 8865.27 2683.86 9299.87
00:33:14.095 ===================================================================================================================
00:33:14.095 Total : 31949.52 124.80 0.00 0.00 15817.82 2683.86 83886.08
00:33:14.661
00:33:14.661 real 0m4.045s
00:33:14.661 user 0m3.704s
00:33:14.661 sys 0m0.297s
00:33:14.661 16:12:19 blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@1125 -- # xtrace_disable
00:33:14.661 16:12:19
blockdev_crypto_qat.bdev_write_zeroes -- common/autotest_common.sh@10 -- # set +x 00:33:14.661 ************************************ 00:33:14.661 END TEST bdev_write_zeroes 00:33:14.661 ************************************ 00:33:14.661 16:12:19 blockdev_crypto_qat -- bdev/blockdev.sh@782 -- # run_test bdev_json_nonenclosed /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:14.661 16:12:19 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:33:14.661 16:12:19 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:33:14.661 16:12:19 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:14.661 ************************************ 00:33:14.661 START TEST bdev_json_nonenclosed 00:33:14.661 ************************************ 00:33:14.661 16:12:19 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonenclosed.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:14.661 [2024-06-10 16:12:20.032141] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:33:14.661 [2024-06-10 16:12:20.032195] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2886244 ] 00:33:14.661 [2024-06-10 16:12:20.130351] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:14.919 [2024-06-10 16:12:20.223493] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:33:14.919 [2024-06-10 16:12:20.223564] json_config.c: 608:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: not enclosed in {}. 
00:33:14.919 [2024-06-10 16:12:20.223581] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:14.919 [2024-06-10 16:12:20.223590] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:14.919 00:33:14.919 real 0m0.341s 00:33:14.919 user 0m0.213s 00:33:14.919 sys 0m0.125s 00:33:14.919 16:12:20 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@1125 -- # xtrace_disable 00:33:14.919 16:12:20 blockdev_crypto_qat.bdev_json_nonenclosed -- common/autotest_common.sh@10 -- # set +x 00:33:14.919 ************************************ 00:33:14.919 END TEST bdev_json_nonenclosed 00:33:14.919 ************************************ 00:33:14.919 16:12:20 blockdev_crypto_qat -- bdev/blockdev.sh@785 -- # run_test bdev_json_nonarray /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:14.919 16:12:20 blockdev_crypto_qat -- common/autotest_common.sh@1100 -- # '[' 13 -le 1 ']' 00:33:14.919 16:12:20 blockdev_crypto_qat -- common/autotest_common.sh@1106 -- # xtrace_disable 00:33:14.919 16:12:20 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:14.919 ************************************ 00:33:14.919 START TEST bdev_json_nonarray 00:33:14.919 ************************************ 00:33:14.919 16:12:20 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf --json /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/nonarray.json -q 128 -o 4096 -w write_zeroes -t 1 '' 00:33:15.178 [2024-06-10 16:12:20.441952] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:33:15.178 [2024-06-10 16:12:20.442014] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2886366 ] 00:33:15.178 [2024-06-10 16:12:20.541575] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:15.178 [2024-06-10 16:12:20.639003] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:33:15.178 [2024-06-10 16:12:20.639078] json_config.c: 614:json_config_prepare_ctx: *ERROR*: Invalid JSON configuration: 'subsystems' should be an array. 00:33:15.178 [2024-06-10 16:12:20.639096] rpc.c: 190:spdk_rpc_server_finish: *ERROR*: No server listening on provided address: 00:33:15.178 [2024-06-10 16:12:20.639105] app.c:1053:spdk_app_stop: *WARNING*: spdk_app_stop'd on non-zero 00:33:15.436 00:33:15.436 real 0m0.350s 00:33:15.436 user 0m0.228s 00:33:15.436 sys 0m0.119s 00:33:15.436 16:12:20 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@1125 -- # xtrace_disable 00:33:15.436 16:12:20 blockdev_crypto_qat.bdev_json_nonarray -- common/autotest_common.sh@10 -- # set +x 00:33:15.436 ************************************ 00:33:15.436 END TEST bdev_json_nonarray 00:33:15.436 ************************************ 00:33:15.436 16:12:20 blockdev_crypto_qat -- bdev/blockdev.sh@787 -- # [[ crypto_qat == bdev ]] 00:33:15.436 16:12:20 blockdev_crypto_qat -- bdev/blockdev.sh@794 -- # [[ crypto_qat == gpt ]] 00:33:15.436 16:12:20 blockdev_crypto_qat -- bdev/blockdev.sh@798 -- # [[ crypto_qat == crypto_sw ]] 00:33:15.436 16:12:20 blockdev_crypto_qat -- bdev/blockdev.sh@810 -- # trap - SIGINT SIGTERM EXIT 00:33:15.436 16:12:20 blockdev_crypto_qat -- bdev/blockdev.sh@811 -- # cleanup 00:33:15.436 16:12:20 blockdev_crypto_qat -- bdev/blockdev.sh@23 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/aiofile 00:33:15.436 16:12:20 blockdev_crypto_qat -- 
bdev/blockdev.sh@24 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/bdev.json 00:33:15.436 16:12:20 blockdev_crypto_qat -- bdev/blockdev.sh@26 -- # [[ crypto_qat == rbd ]] 00:33:15.436 16:12:20 blockdev_crypto_qat -- bdev/blockdev.sh@30 -- # [[ crypto_qat == daos ]] 00:33:15.436 16:12:20 blockdev_crypto_qat -- bdev/blockdev.sh@34 -- # [[ crypto_qat = \g\p\t ]] 00:33:15.436 16:12:20 blockdev_crypto_qat -- bdev/blockdev.sh@40 -- # [[ crypto_qat == xnvme ]] 00:33:15.436 00:33:15.436 real 1m11.040s 00:33:15.436 user 2m49.758s 00:33:15.436 sys 0m7.156s 00:33:15.436 16:12:20 blockdev_crypto_qat -- common/autotest_common.sh@1125 -- # xtrace_disable 00:33:15.436 16:12:20 blockdev_crypto_qat -- common/autotest_common.sh@10 -- # set +x 00:33:15.436 ************************************ 00:33:15.436 END TEST blockdev_crypto_qat 00:33:15.436 ************************************ 00:33:15.436 16:12:20 -- spdk/autotest.sh@360 -- # run_test chaining /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:33:15.436 16:12:20 -- common/autotest_common.sh@1100 -- # '[' 2 -le 1 ']' 00:33:15.436 16:12:20 -- common/autotest_common.sh@1106 -- # xtrace_disable 00:33:15.436 16:12:20 -- common/autotest_common.sh@10 -- # set +x 00:33:15.436 ************************************ 00:33:15.436 START TEST chaining 00:33:15.436 ************************************ 00:33:15.437 16:12:20 chaining -- common/autotest_common.sh@1124 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev/chaining.sh 00:33:15.437 * Looking for test storage... 
00:33:15.437 * Found test storage at /var/jenkins/workspace/crypto-phy-autotest/spdk/test/bdev 00:33:15.437 16:12:20 chaining -- bdev/chaining.sh@14 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/test/nvmf/common.sh 00:33:15.437 16:12:20 chaining -- nvmf/common.sh@7 -- # uname -s 00:33:15.437 16:12:20 chaining -- nvmf/common.sh@7 -- # [[ Linux == FreeBSD ]] 00:33:15.437 16:12:20 chaining -- nvmf/common.sh@9 -- # NVMF_PORT=4420 00:33:15.437 16:12:20 chaining -- nvmf/common.sh@10 -- # NVMF_SECOND_PORT=4421 00:33:15.437 16:12:20 chaining -- nvmf/common.sh@11 -- # NVMF_THIRD_PORT=4422 00:33:15.437 16:12:20 chaining -- nvmf/common.sh@12 -- # NVMF_IP_PREFIX=192.168.100 00:33:15.437 16:12:20 chaining -- nvmf/common.sh@13 -- # NVMF_IP_LEAST_ADDR=8 00:33:15.437 16:12:20 chaining -- nvmf/common.sh@14 -- # NVMF_TCP_IP_ADDRESS=127.0.0.1 00:33:15.437 16:12:20 chaining -- nvmf/common.sh@15 -- # NVMF_TRANSPORT_OPTS= 00:33:15.437 16:12:20 chaining -- nvmf/common.sh@16 -- # NVMF_SERIAL=SPDKISFASTANDAWESOME 00:33:15.437 16:12:20 chaining -- nvmf/common.sh@17 -- # nvme gen-hostnqn 00:33:15.437 16:12:20 chaining -- nvmf/common.sh@17 -- # NVME_HOSTNQN=nqn.2014-08.org.nvmexpress:uuid:801347e8-3fd0-e911-906e-0017a4403562 00:33:15.437 16:12:20 chaining -- nvmf/common.sh@18 -- # NVME_HOSTID=801347e8-3fd0-e911-906e-0017a4403562 00:33:15.437 16:12:20 chaining -- nvmf/common.sh@19 -- # NVME_HOST=("--hostnqn=$NVME_HOSTNQN" "--hostid=$NVME_HOSTID") 00:33:15.437 16:12:20 chaining -- nvmf/common.sh@20 -- # NVME_CONNECT='nvme connect' 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@21 -- # NET_TYPE=phy-fallback 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@22 -- # NVME_SUBNQN=nqn.2016-06.io.spdk:testnqn 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@45 -- # source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:33:15.695 16:12:20 chaining -- scripts/common.sh@508 -- # [[ -e /bin/wpdk_common.sh ]] 00:33:15.695 16:12:20 chaining -- scripts/common.sh@516 -- # 
[[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:33:15.695 16:12:20 chaining -- scripts/common.sh@517 -- # source /etc/opt/spdk-pkgdep/paths/export.sh 00:33:15.695 16:12:20 chaining -- paths/export.sh@2 -- # PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:15.695 16:12:20 chaining -- paths/export.sh@3 -- # PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:15.695 16:12:20 chaining -- paths/export.sh@4 -- # PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:15.695 16:12:20 chaining -- paths/export.sh@5 -- # export PATH 00:33:15.695 16:12:20 chaining -- paths/export.sh@6 -- # echo 
/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@47 -- # : 0 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@48 -- # export NVMF_APP_SHM_ID 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@49 -- # build_nvmf_app_args 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@25 -- # '[' 0 -eq 1 ']' 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@29 -- # NVMF_APP+=(-i "$NVMF_APP_SHM_ID" -e 0xFFFF) 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@31 -- # NVMF_APP+=("${NO_HUGE[@]}") 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@33 -- # '[' -n '' ']' 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@35 -- # '[' 0 -eq 1 ']' 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@51 -- # have_pci_nics=0 00:33:15.695 16:12:20 chaining -- bdev/chaining.sh@16 -- # nqn=nqn.2016-06.io.spdk:cnode0 00:33:15.695 16:12:20 chaining -- bdev/chaining.sh@17 -- # key0=(00112233445566778899001122334455 11223344556677889900112233445500) 00:33:15.695 16:12:20 chaining -- bdev/chaining.sh@18 -- # key1=(22334455667788990011223344550011 33445566778899001122334455001122) 00:33:15.695 16:12:20 chaining -- bdev/chaining.sh@19 -- # bperfsock=/var/tmp/bperf.sock 00:33:15.695 16:12:20 chaining -- bdev/chaining.sh@20 -- # declare -A stats 00:33:15.695 16:12:20 chaining -- bdev/chaining.sh@66 -- # nvmftestinit 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@410 -- # local 
-g is_hw=no 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:33:15.695 16:12:20 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:33:15.695 16:12:20 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:33:15.695 16:12:20 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:33:15.695 16:12:20 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@296 -- # e810=() 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@297 -- # x722=() 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@298 -- # mlx=() 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@302 -- # 
e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:af:00.0 (0x8086 - 0x159b)' 00:33:22.259 Found 0000:af:00.0 (0x8086 - 0x159b) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:33:22.259 
16:12:27 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:af:00.1 (0x8086 - 0x159b)' 00:33:22.259 Found 0000:af:00.1 (0x8086 - 0x159b) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:af:00.0: cvl_0_0' 00:33:22.259 Found net devices under 0000:af:00.0: cvl_0_0 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 
00:33:22.259 16:12:27 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:af:00.1: cvl_0_1' 00:33:22.259 Found net devices under 0000:af:00.1: cvl_0_1 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@242 -- # NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@248 -- # ip 
netns add cvl_0_0_ns_spdk 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:33:22.259 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:33:22.259 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.166 ms 00:33:22.259 00:33:22.259 --- 10.0.0.2 ping statistics --- 00:33:22.259 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:33:22.259 rtt min/avg/max/mdev = 0.166/0.166/0.166/0.000 ms 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:33:22.259 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 
00:33:22.259 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.139 ms 00:33:22.259 00:33:22.259 --- 10.0.0.1 ping statistics --- 00:33:22.259 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:33:22.259 rtt min/avg/max/mdev = 0.139/0.139/0.139/0.000 ms 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@422 -- # return 0 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:33:22.259 16:12:27 chaining -- bdev/chaining.sh@67 -- # nvmfappstart -m 0x2 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:33:22.259 16:12:27 chaining -- common/autotest_common.sh@723 -- # xtrace_disable 00:33:22.259 16:12:27 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:22.259 16:12:27 chaining -- nvmf/common.sh@481 -- # nvmfpid=2889913 00:33:22.260 16:12:27 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:33:22.260 16:12:27 chaining -- nvmf/common.sh@482 -- # waitforlisten 2889913 00:33:22.260 16:12:27 chaining -- common/autotest_common.sh@830 -- # '[' -z 2889913 ']' 00:33:22.260 16:12:27 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:22.260 16:12:27 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:33:22.260 16:12:27 chaining -- 
common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:22.260 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:22.260 16:12:27 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:33:22.260 16:12:27 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:22.260 [2024-06-10 16:12:27.650927] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:33:22.260 [2024-06-10 16:12:27.650996] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:22.260 [2024-06-10 16:12:27.749778] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:22.518 [2024-06-10 16:12:27.849093] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:33:22.518 [2024-06-10 16:12:27.849138] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 00:33:22.518 [2024-06-10 16:12:27.849148] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:33:22.518 [2024-06-10 16:12:27.849157] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:33:22.518 [2024-06-10 16:12:27.849165] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 
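The `waitforlisten 2889913` step above launches `nvmf_tgt` inside the namespace and then blocks until the UNIX-domain RPC socket at `/var/tmp/spdk.sock` is usable (the log shows `max_retries=100`). Below is a generic poll helper in that shape; it is a sketch only — SPDK's real helper also checks that the target pid is still alive between retries and probes the socket with an actual RPC rather than just testing that the path exists.

```shell
# wait_for_path <path> [tries]: return 0 once <path> exists, polling every
# 0.1 s, up to <tries> attempts (default 100, mirroring max_retries in the log).
wait_for_path() {
    local path=$1 tries=${2:-100} i
    for ((i = 0; i < tries; i++)); do
        [ -e "$path" ] && return 0
        sleep 0.1
    done
    return 1
}

# In the harness (sketch; requires root and an SPDK build tree):
#   sudo ip netns exec cvl_0_0_ns_spdk ./build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 &
#   wait_for_path /var/tmp/spdk.sock
```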
00:33:22.518 [2024-06-10 16:12:27.849189] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:33:23.453 16:12:28 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:33:23.453 16:12:28 chaining -- common/autotest_common.sh@863 -- # return 0 00:33:23.453 16:12:28 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:33:23.453 16:12:28 chaining -- common/autotest_common.sh@729 -- # xtrace_disable 00:33:23.453 16:12:28 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:23.453 16:12:28 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:33:23.453 16:12:28 chaining -- bdev/chaining.sh@69 -- # mktemp 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@69 -- # input=/tmp/tmp.orpzzKwSPk 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@69 -- # mktemp 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@69 -- # output=/tmp/tmp.Vh2PO13sou 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@70 -- # trap 'tgtcleanup; exit 1' SIGINT SIGTERM EXIT 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@72 -- # rpc_cmd 00:33:23.454 16:12:28 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:23.454 16:12:28 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:23.454 malloc0 00:33:23.454 true 00:33:23.454 true 00:33:23.454 [2024-06-10 16:12:28.682926] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:33:23.454 crypto0 00:33:23.454 [2024-06-10 16:12:28.690953] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:33:23.454 crypto1 00:33:23.454 [2024-06-10 16:12:28.699074] tcp.c: 724:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:33:23.454 [2024-06-10 16:12:28.715261] tcp.c:1053:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:33:23.454 16:12:28 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@85 -- # 
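The `rpc_cmd` block at `chaining.sh@72` feeds several RPCs to the target over one connection; the log only shows their results (`malloc0`, then `crypto0` bound to `key0` and `crypto1` bound to `key1`, i.e. a two-deep crypto chain over a malloc bdev). The reconstruction below is heavily hedged: the key material, malloc size, and exact flag spellings are invented for illustration — only the bdev names, key names, and chain order come from the log.

```shell
# Hedged reconstruction of the RPC sequence behind chaining.sh@72.
# Key hex strings and sizes are placeholders, not values from the run.
rpc_sequence() {
    cat <<'EOF'
bdev_malloc_create -b malloc0 16 4096
accel_crypto_key_create -c AES_XTS -n key0 -k 00112233445566778899aabbccddeeff -e ffeeddccbbaa99887766554433221100
accel_crypto_key_create -c AES_XTS -n key1 -k a0a1a2a3a4a5a6a7a8a9aaabacadaeaf -e b0b1b2b3b4b5b6b7b8b9babbbcbdbebf
bdev_crypto_create malloc0 crypto0 -n key0
bdev_crypto_create crypto0 crypto1 -n key1
EOF
}
# In the harness these lines are piped to scripts/rpc.py as one batch, after
# which the target exposes crypto1 (encrypt twice on write, decrypt twice on read).
```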
update_stats 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@39 -- # opcode= 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:33:23.454 16:12:28 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:23.454 16:12:28 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:23.454 16:12:28 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=12 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:23.454 16:12:28 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:23.454 16:12:28 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:33:23.454 16:12:28 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]= 00:33:23.454 16:12:28 chaining -- 
bdev/chaining.sh@53 -- # get_stat executed decrypt 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:23.454 16:12:28 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:23.454 16:12:28 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:33:23.454 16:12:28 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:33:23.454 16:12:28 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:23.454 16:12:28 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:23.454 16:12:28 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@88 -- # 
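The repeated `get_stat` calls above all follow one pattern: run `accel_get_stats` over RPC and extract either the top-level `sequence_executed` counter or a per-opcode `executed` count with jq (the two jq filters are visible verbatim in the log). The sketch below reproduces that helper against an inline sample document; the JSON shape is an assumption inferred from the fields those filters reference, and jq is assumed to be installed.

```shell
# Assumed minimal shape of the accel_get_stats reply, with counters matching
# the values the log reports at this point (12 sequences, 12 decrypts, 4 copies).
sample_stats='{
  "sequence_executed": 12,
  "operations": [
    {"opcode": "decrypt", "executed": 12},
    {"opcode": "copy",    "executed": 4}
  ]
}'

# get_stat <event> [opcode]: reads the stats JSON on stdin, like the helper
# at chaining.sh@37..44 (which calls rpc_cmd accel_get_stats instead).
get_stat() {
    local event=$1 opcode=${2:-}
    if [ -z "$opcode" ]; then
        jq -r ".${event}"
    else
        jq -r ".operations[] | select(.opcode == \"${opcode}\").executed"
    fi
}

echo "$sample_stats" | get_stat sequence_executed   # 12
echo "$sample_stats" | get_stat executed decrypt    # 12
echo "$sample_stats" | get_stat executed copy       # 4
```

Note that an opcode absent from `operations` (as `encrypt` is at this point in the run) yields empty output, which is why the log stores `stats["encrypt_executed"]=` with no value.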
dd if=/dev/urandom of=/tmp/tmp.orpzzKwSPk bs=1K count=64 00:33:23.454 64+0 records in 00:33:23.454 64+0 records out 00:33:23.454 65536 bytes (66 kB, 64 KiB) copied, 0.000845644 s, 77.5 MB/s 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@89 -- # spdk_dd --if /tmp/tmp.orpzzKwSPk --ob Nvme0n1 --bs 65536 --count 1 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@25 -- # local config 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:33:23.454 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:33:23.454 16:12:28 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:33:23.712 16:12:28 chaining -- bdev/chaining.sh@31 -- # config='{ 00:33:23.712 "subsystems": [ 00:33:23.712 { 00:33:23.712 "subsystem": "bdev", 00:33:23.712 "config": [ 00:33:23.712 { 00:33:23.712 "method": "bdev_nvme_attach_controller", 00:33:23.712 "params": { 00:33:23.712 "trtype": "tcp", 00:33:23.712 "adrfam": "IPv4", 00:33:23.712 "name": "Nvme0", 00:33:23.712 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:33:23.712 "traddr": "10.0.0.2", 00:33:23.712 "trsvcid": "4420" 00:33:23.712 } 00:33:23.712 }, 00:33:23.712 { 00:33:23.712 "method": "bdev_set_options", 00:33:23.712 "params": { 00:33:23.712 "bdev_auto_examine": false 00:33:23.712 } 00:33:23.712 } 00:33:23.712 ] 00:33:23.712 } 00:33:23.712 ] 00:33:23.712 }' 00:33:23.712 16:12:28 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.orpzzKwSPk --ob Nvme0n1 --bs 65536 --count 1 00:33:23.712 16:12:28 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:33:23.712 "subsystems": [ 00:33:23.712 { 00:33:23.712 "subsystem": "bdev", 00:33:23.712 "config": [ 00:33:23.712 { 00:33:23.712 "method": "bdev_nvme_attach_controller", 00:33:23.712 "params": { 
00:33:23.712 "trtype": "tcp", 00:33:23.712 "adrfam": "IPv4", 00:33:23.712 "name": "Nvme0", 00:33:23.712 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:33:23.712 "traddr": "10.0.0.2", 00:33:23.712 "trsvcid": "4420" 00:33:23.712 } 00:33:23.712 }, 00:33:23.712 { 00:33:23.712 "method": "bdev_set_options", 00:33:23.712 "params": { 00:33:23.712 "bdev_auto_examine": false 00:33:23.712 } 00:33:23.712 } 00:33:23.712 ] 00:33:23.712 } 00:33:23.712 ] 00:33:23.712 }' 00:33:23.712 [2024-06-10 16:12:29.024290] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:33:23.712 [2024-06-10 16:12:29.024330] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2890170 ] 00:33:23.712 [2024-06-10 16:12:29.111173] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:23.712 [2024-06-10 16:12:29.202602] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:33:24.278  Copying: 64/64 [kB] (average 15 MBps) 00:33:24.278 00:33:24.278 16:12:29 chaining -- bdev/chaining.sh@90 -- # get_stat sequence_executed 00:33:24.278 16:12:29 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:24.278 16:12:29 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:33:24.278 16:12:29 chaining -- bdev/chaining.sh@39 -- # opcode= 00:33:24.278 16:12:29 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:24.278 16:12:29 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:33:24.278 16:12:29 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:33:24.278 16:12:29 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:33:24.278 16:12:29 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:24.278 16:12:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:24.278 16:12:29 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 
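The config plumbing for `spdk_dd` above works in two steps: `gen_nvme.sh --mode=remote` emits a JSON config that attaches `Nvme0` over TCP, and the jq filter shown in the log appends a `bdev_set_options` entry disabling auto-examine, so `spdk_dd` sees the raw `Nvme0n1` namespace. The resulting document is then echoed into `spdk_dd -c /dev/fd/62`. The sketch below applies that exact jq filter to an inlined copy of the base config (inlined here so it runs without an SPDK tree):

```shell
# Base config as gen_nvme.sh produces it for tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0.
base='{ "subsystems": [ { "subsystem": "bdev", "config": [
  { "method": "bdev_nvme_attach_controller",
    "params": { "trtype": "tcp", "adrfam": "IPv4", "name": "Nvme0",
                "subnqn": "nqn.2016-06.io.spdk:cnode0",
                "traddr": "10.0.0.2", "trsvcid": "4420" } } ] } ] }'

# The filter from chaining.sh@32: index [config | length] is one past the end,
# so the |= update appends the bdev_set_options entry to the bdev config array.
config=$(echo "$base" | jq '.subsystems[0].config[.subsystems[0].config | length] |=
  {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}')

# The harness then runs (sketch):
#   spdk_dd -c <(echo "$config") --if /tmp/input --ob Nvme0n1 --bs 65536 --count 1
echo "$config" | jq -r '.subsystems[0].config[1].method'   # bdev_set_options
```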
00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@90 -- # (( 13 == stats[sequence_executed] + 1 )) 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@91 -- # get_stat executed encrypt 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:33:24.537 16:12:29 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:24.537 16:12:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:24.537 16:12:29 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@91 -- # (( 2 == stats[encrypt_executed] + 2 )) 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@92 -- # get_stat executed decrypt 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:33:24.537 16:12:29 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:24.537 16:12:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:24.537 16:12:29 chaining -- common/autotest_common.sh@588 -- # 
[[ 0 == 0 ]] 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@92 -- # (( 12 == stats[decrypt_executed] )) 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@95 -- # get_stat executed copy 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:24.537 16:12:29 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:33:24.537 16:12:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:24.537 16:12:29 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@95 -- # (( 4 == stats[copy_executed] )) 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@96 -- # update_stats 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@39 -- # opcode= 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:33:24.537 16:12:29 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:24.537 16:12:29 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:33:24.537 16:12:29 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:24.537 16:12:29 chaining -- 
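The arithmetic guards in this stretch, such as `(( 13 == stats[sequence_executed] + 1 ))`, compare a freshly read counter against the snapshot `update_stats` stored before the I/O: one 64 KiB `spdk_dd` pass should add exactly one executed sequence, and each write through the two-bdev chain adds two encrypts. The sketch below shows that snapshot-then-delta pattern with simulated counter values standing in for the RPC output (bash associative arrays, as in the script):

```shell
# Snapshot taken by update_stats before the write (values from the log).
declare -A stats
stats[sequence_executed]=12
stats[encrypt_executed]=0

# Counters read back after one 64 KiB write through the crypto chain.
seq_after=13
enc_after=2

# Assert the expected deltas, as chaining.sh@90..91 does.
(( seq_after == stats[sequence_executed] + 1 )) && echo "sequence delta OK"
(( enc_after == stats[encrypt_executed] + 2 )) && echo "encrypt delta OK"
```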
common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:24.537 16:12:30 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=13 00:33:24.537 16:12:30 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:33:24.537 16:12:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:24.537 16:12:30 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:24.537 16:12:30 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:33:24.537 16:12:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:24.537 16:12:30 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:33:24.537 16:12:30 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:24.537 16:12:30 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:33:24.537 16:12:30 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:24.537 16:12:30 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:24.537 16:12:30 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=2 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:33:24.796 16:12:30 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:24.796 16:12:30 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:24.796 16:12:30 chaining -- 
common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=12 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:24.796 16:12:30 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:24.796 16:12:30 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:33:24.796 16:12:30 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@99 -- # spdk_dd --of /tmp/tmp.Vh2PO13sou --ib Nvme0n1 --bs 65536 --count 1 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@25 -- # local config 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:33:24.796 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@31 -- # config='{ 00:33:24.796 "subsystems": [ 00:33:24.796 { 00:33:24.796 "subsystem": "bdev", 00:33:24.796 "config": [ 00:33:24.796 { 00:33:24.796 "method": "bdev_nvme_attach_controller", 00:33:24.796 "params": { 
00:33:24.796 "trtype": "tcp", 00:33:24.796 "adrfam": "IPv4", 00:33:24.796 "name": "Nvme0", 00:33:24.796 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:33:24.796 "traddr": "10.0.0.2", 00:33:24.796 "trsvcid": "4420" 00:33:24.796 } 00:33:24.796 }, 00:33:24.796 { 00:33:24.796 "method": "bdev_set_options", 00:33:24.796 "params": { 00:33:24.796 "bdev_auto_examine": false 00:33:24.796 } 00:33:24.796 } 00:33:24.796 ] 00:33:24.796 } 00:33:24.796 ] 00:33:24.796 }' 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.Vh2PO13sou --ib Nvme0n1 --bs 65536 --count 1 00:33:24.796 16:12:30 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:33:24.796 "subsystems": [ 00:33:24.796 { 00:33:24.796 "subsystem": "bdev", 00:33:24.796 "config": [ 00:33:24.796 { 00:33:24.796 "method": "bdev_nvme_attach_controller", 00:33:24.796 "params": { 00:33:24.796 "trtype": "tcp", 00:33:24.796 "adrfam": "IPv4", 00:33:24.796 "name": "Nvme0", 00:33:24.796 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:33:24.796 "traddr": "10.0.0.2", 00:33:24.796 "trsvcid": "4420" 00:33:24.796 } 00:33:24.796 }, 00:33:24.796 { 00:33:24.796 "method": "bdev_set_options", 00:33:24.796 "params": { 00:33:24.796 "bdev_auto_examine": false 00:33:24.796 } 00:33:24.796 } 00:33:24.796 ] 00:33:24.796 } 00:33:24.796 ] 00:33:24.796 }' 00:33:24.796 [2024-06-10 16:12:30.249112] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:33:24.796 [2024-06-10 16:12:30.249172] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2890431 ] 00:33:25.054 [2024-06-10 16:12:30.350177] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:25.054 [2024-06-10 16:12:30.440392] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:33:25.571  Copying: 64/64 [kB] (average 20 MBps) 00:33:25.571 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@100 -- # get_stat sequence_executed 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@39 -- # opcode= 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:33:25.571 16:12:30 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:25.571 16:12:30 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:25.571 16:12:30 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@100 -- # (( 14 == stats[sequence_executed] + 1 )) 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@101 -- # get_stat executed encrypt 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:33:25.571 
16:12:30 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:33:25.571 16:12:30 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:25.571 16:12:30 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:25.571 16:12:30 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@101 -- # (( 2 == stats[encrypt_executed] )) 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@102 -- # get_stat executed decrypt 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:25.571 16:12:30 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:33:25.571 16:12:30 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:25.571 16:12:30 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:25.571 16:12:31 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:25.571 16:12:31 chaining -- bdev/chaining.sh@102 -- # (( 14 == stats[decrypt_executed] + 2 )) 00:33:25.571 16:12:31 chaining -- bdev/chaining.sh@103 -- # get_stat executed copy 00:33:25.571 16:12:31 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:25.571 16:12:31 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:25.571 16:12:31 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:33:25.571 16:12:31 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:25.571 16:12:31 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 
00:33:25.571 16:12:31 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:33:25.571 16:12:31 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:25.571 16:12:31 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:25.571 16:12:31 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:25.571 16:12:31 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:25.571 16:12:31 chaining -- bdev/chaining.sh@103 -- # (( 4 == stats[copy_executed] )) 00:33:25.571 16:12:31 chaining -- bdev/chaining.sh@104 -- # cmp /tmp/tmp.orpzzKwSPk /tmp/tmp.Vh2PO13sou 00:33:25.830 16:12:31 chaining -- bdev/chaining.sh@105 -- # spdk_dd --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:33:25.830 16:12:31 chaining -- bdev/chaining.sh@25 -- # local config 00:33:25.830 16:12:31 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:33:25.830 16:12:31 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:33:25.830 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:33:25.830 16:12:31 chaining -- bdev/chaining.sh@31 -- # config='{ 00:33:25.830 "subsystems": [ 00:33:25.830 { 00:33:25.830 "subsystem": "bdev", 00:33:25.830 "config": [ 00:33:25.830 { 00:33:25.830 "method": "bdev_nvme_attach_controller", 00:33:25.830 "params": { 00:33:25.830 "trtype": "tcp", 00:33:25.830 "adrfam": "IPv4", 00:33:25.830 "name": "Nvme0", 00:33:25.830 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:33:25.830 "traddr": "10.0.0.2", 00:33:25.830 "trsvcid": "4420" 00:33:25.830 } 00:33:25.830 }, 00:33:25.830 { 00:33:25.830 "method": "bdev_set_options", 00:33:25.830 "params": { 00:33:25.830 "bdev_auto_examine": false 00:33:25.830 } 00:33:25.830 } 00:33:25.830 ] 00:33:25.830 } 00:33:25.830 ] 00:33:25.830 }' 00:33:25.830 16:12:31 chaining 
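The `cmp /tmp/tmp.orpzzKwSPk /tmp/tmp.Vh2PO13sou` above is the payoff of the whole sequence: 64 KiB of random data is written through the crypto chain and read back, and the two files must match byte for byte. The sketch below keeps that shape but substitutes a plain file for the `Nvme0n1` bdev so it runs anywhere; in the real test both `dd` passes go through `spdk_dd` with the JSON config shown earlier.

```shell
# Roundtrip data-integrity check in the shape of chaining.sh@88..104, with a
# plain backing file standing in for the NVMe-oF namespace.
input=$(mktemp); output=$(mktemp); backing=$(mktemp)

dd if=/dev/urandom of="$input" bs=1K count=64 status=none   # make the test pattern
dd if="$input" of="$backing" bs=65536 count=1 status=none   # "write to the bdev"
dd if="$backing" of="$output" bs=65536 count=1 status=none  # "read it back"

if cmp -s "$input" "$output"; then roundtrip=ok; else roundtrip=FAIL; fi
rm -f "$input" "$output" "$backing"
echo "roundtrip: $roundtrip"
```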
-- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /dev/zero --ob Nvme0n1 --bs 65536 --count 1 00:33:25.830 16:12:31 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:33:25.830 "subsystems": [ 00:33:25.830 { 00:33:25.830 "subsystem": "bdev", 00:33:25.830 "config": [ 00:33:25.830 { 00:33:25.830 "method": "bdev_nvme_attach_controller", 00:33:25.830 "params": { 00:33:25.830 "trtype": "tcp", 00:33:25.830 "adrfam": "IPv4", 00:33:25.830 "name": "Nvme0", 00:33:25.830 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:33:25.830 "traddr": "10.0.0.2", 00:33:25.830 "trsvcid": "4420" 00:33:25.830 } 00:33:25.830 }, 00:33:25.830 { 00:33:25.830 "method": "bdev_set_options", 00:33:25.830 "params": { 00:33:25.830 "bdev_auto_examine": false 00:33:25.830 } 00:33:25.830 } 00:33:25.830 ] 00:33:25.830 } 00:33:25.830 ] 00:33:25.830 }' 00:33:25.830 [2024-06-10 16:12:31.183983] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:33:25.830 [2024-06-10 16:12:31.184045] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2890675 ] 00:33:25.830 [2024-06-10 16:12:31.283473] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:26.088 [2024-06-10 16:12:31.370991] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:33:26.605  Copying: 64/64 [kB] (average 62 MBps) 00:33:26.605 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@106 -- # update_stats 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@39 -- # opcode= 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@39 -- # 
rpc=rpc_cmd 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:33:26.605 16:12:31 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:26.605 16:12:31 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:26.605 16:12:31 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=15 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:26.605 16:12:31 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:26.605 16:12:31 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:26.605 16:12:31 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=4 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:26.605 16:12:31 chaining -- 
bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:26.605 16:12:31 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:26.605 16:12:31 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:26.605 16:12:31 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:33:26.605 16:12:32 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:26.605 16:12:32 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:33:26.605 16:12:32 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:33:26.605 16:12:32 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:26.605 16:12:32 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:26.605 16:12:32 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:33:26.605 16:12:32 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:26.605 16:12:32 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 00:33:26.605 16:12:32 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:26.605 16:12:32 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:33:26.605 16:12:32 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:26.605 16:12:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:26.605 16:12:32 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:26.606 16:12:32 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:33:26.606 16:12:32 chaining -- bdev/chaining.sh@109 -- # spdk_dd --if /tmp/tmp.orpzzKwSPk --ob Nvme0n1 --bs 4096 --count 16 00:33:26.606 16:12:32 chaining -- bdev/chaining.sh@25 -- # local config 00:33:26.606 16:12:32 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:33:26.606 16:12:32 chaining 
-- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:33:26.606 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:33:26.863 16:12:32 chaining -- bdev/chaining.sh@31 -- # config='{ 00:33:26.863 "subsystems": [ 00:33:26.863 { 00:33:26.863 "subsystem": "bdev", 00:33:26.863 "config": [ 00:33:26.863 { 00:33:26.863 "method": "bdev_nvme_attach_controller", 00:33:26.863 "params": { 00:33:26.863 "trtype": "tcp", 00:33:26.863 "adrfam": "IPv4", 00:33:26.863 "name": "Nvme0", 00:33:26.863 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:33:26.863 "traddr": "10.0.0.2", 00:33:26.863 "trsvcid": "4420" 00:33:26.863 } 00:33:26.863 }, 00:33:26.863 { 00:33:26.863 "method": "bdev_set_options", 00:33:26.863 "params": { 00:33:26.863 "bdev_auto_examine": false 00:33:26.863 } 00:33:26.863 } 00:33:26.863 ] 00:33:26.863 } 00:33:26.863 ] 00:33:26.863 }' 00:33:26.863 16:12:32 chaining -- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --if /tmp/tmp.orpzzKwSPk --ob Nvme0n1 --bs 4096 --count 16 00:33:26.863 16:12:32 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:33:26.863 "subsystems": [ 00:33:26.863 { 00:33:26.863 "subsystem": "bdev", 00:33:26.863 "config": [ 00:33:26.863 { 00:33:26.863 "method": "bdev_nvme_attach_controller", 00:33:26.863 "params": { 00:33:26.863 "trtype": "tcp", 00:33:26.863 "adrfam": "IPv4", 00:33:26.863 "name": "Nvme0", 00:33:26.863 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:33:26.863 "traddr": "10.0.0.2", 00:33:26.863 "trsvcid": "4420" 00:33:26.863 } 00:33:26.863 }, 00:33:26.863 { 00:33:26.863 "method": "bdev_set_options", 00:33:26.863 "params": { 00:33:26.863 "bdev_auto_examine": false 00:33:26.863 } 00:33:26.863 } 00:33:26.863 ] 00:33:26.863 } 00:33:26.863 ] 00:33:26.863 }' 00:33:26.863 [2024-06-10 16:12:32.192496] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:33:26.863 [2024-06-10 16:12:32.192552] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2890830 ] 00:33:26.863 [2024-06-10 16:12:32.294244] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:27.121 [2024-06-10 16:12:32.386580] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:33:27.380  Copying: 64/64 [kB] (average 9142 kBps) 00:33:27.380 00:33:27.380 16:12:32 chaining -- bdev/chaining.sh@110 -- # get_stat sequence_executed 00:33:27.380 16:12:32 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:27.380 16:12:32 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:33:27.380 16:12:32 chaining -- bdev/chaining.sh@39 -- # opcode= 00:33:27.380 16:12:32 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:27.380 16:12:32 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:33:27.380 16:12:32 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:33:27.380 16:12:32 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:33:27.380 16:12:32 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:27.380 16:12:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:27.380 16:12:32 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:27.380 16:12:32 chaining -- bdev/chaining.sh@110 -- # (( 31 == stats[sequence_executed] + 16 )) 00:33:27.380 16:12:32 chaining -- bdev/chaining.sh@111 -- # get_stat executed encrypt 00:33:27.380 16:12:32 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:27.380 16:12:32 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:27.380 16:12:32 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:33:27.380 16:12:32 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:27.380 16:12:32 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:33:27.380 
16:12:32 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:27.380 16:12:32 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:27.380 16:12:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:27.380 16:12:32 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:33:27.380 16:12:32 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:27.638 16:12:32 chaining -- bdev/chaining.sh@111 -- # (( 36 == stats[encrypt_executed] + 32 )) 00:33:27.638 16:12:32 chaining -- bdev/chaining.sh@112 -- # get_stat executed decrypt 00:33:27.638 16:12:32 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:27.638 16:12:32 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:27.638 16:12:32 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:33:27.638 16:12:32 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:27.638 16:12:32 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:33:27.638 16:12:32 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:27.638 16:12:32 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:27.638 16:12:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:27.638 16:12:32 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:33:27.638 16:12:32 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:27.638 16:12:32 chaining -- bdev/chaining.sh@112 -- # (( 14 == stats[decrypt_executed] )) 00:33:27.638 16:12:32 chaining -- bdev/chaining.sh@113 -- # get_stat executed copy 00:33:27.638 16:12:32 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:27.638 16:12:32 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:27.638 16:12:32 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:33:27.638 16:12:32 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:27.638 16:12:32 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 
00:33:27.638 16:12:32 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:27.638 16:12:32 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:33:27.638 16:12:32 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:27.638 16:12:32 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:27.638 16:12:32 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@113 -- # (( 4 == stats[copy_executed] )) 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@114 -- # update_stats 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@51 -- # get_stat sequence_executed 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@39 -- # opcode= 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:33:27.638 16:12:33 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:27.638 16:12:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:27.638 16:12:33 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@51 -- # stats["sequence_executed"]=31 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@52 -- # get_stat executed encrypt 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:27.638 16:12:33 chaining -- 
bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:27.638 16:12:33 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:27.638 16:12:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:27.638 16:12:33 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@52 -- # stats["encrypt_executed"]=36 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@53 -- # get_stat executed decrypt 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:27.638 16:12:33 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:33:27.638 16:12:33 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:27.638 16:12:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:27.638 16:12:33 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:27.896 16:12:33 chaining -- bdev/chaining.sh@53 -- # stats["decrypt_executed"]=14 00:33:27.896 16:12:33 chaining -- bdev/chaining.sh@54 -- # get_stat executed copy 00:33:27.896 16:12:33 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:27.896 16:12:33 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:27.896 16:12:33 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:33:27.896 16:12:33 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:27.896 16:12:33 chaining -- 
bdev/chaining.sh@40 -- # [[ -z copy ]] 00:33:27.896 16:12:33 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:27.896 16:12:33 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:33:27.896 16:12:33 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:27.896 16:12:33 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:27.896 16:12:33 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:27.896 16:12:33 chaining -- bdev/chaining.sh@54 -- # stats["copy_executed"]=4 00:33:27.896 16:12:33 chaining -- bdev/chaining.sh@117 -- # : 00:33:27.896 16:12:33 chaining -- bdev/chaining.sh@118 -- # spdk_dd --of /tmp/tmp.Vh2PO13sou --ib Nvme0n1 --bs 4096 --count 16 00:33:27.897 16:12:33 chaining -- bdev/chaining.sh@25 -- # local config 00:33:27.897 16:12:33 chaining -- bdev/chaining.sh@31 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/gen_nvme.sh --mode=remote --json-with-subsystems --trid=tcp:10.0.0.2:4420:nqn.2016-06.io.spdk:cnode0 00:33:27.897 16:12:33 chaining -- bdev/chaining.sh@32 -- # jq '.subsystems[0].config[.subsystems[0].config | length] |= 00:33:27.897 {"method": "bdev_set_options", "params": {"bdev_auto_examine": false}}' 00:33:27.897 16:12:33 chaining -- bdev/chaining.sh@31 -- # config='{ 00:33:27.897 "subsystems": [ 00:33:27.897 { 00:33:27.897 "subsystem": "bdev", 00:33:27.897 "config": [ 00:33:27.897 { 00:33:27.897 "method": "bdev_nvme_attach_controller", 00:33:27.897 "params": { 00:33:27.897 "trtype": "tcp", 00:33:27.897 "adrfam": "IPv4", 00:33:27.897 "name": "Nvme0", 00:33:27.897 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:33:27.897 "traddr": "10.0.0.2", 00:33:27.897 "trsvcid": "4420" 00:33:27.897 } 00:33:27.897 }, 00:33:27.897 { 00:33:27.897 "method": "bdev_set_options", 00:33:27.897 "params": { 00:33:27.897 "bdev_auto_examine": false 00:33:27.897 } 00:33:27.897 } 00:33:27.897 ] 00:33:27.897 } 00:33:27.897 ] 00:33:27.897 }' 00:33:27.897 16:12:33 chaining 
-- bdev/chaining.sh@33 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/spdk_dd -c /dev/fd/62 --of /tmp/tmp.Vh2PO13sou --ib Nvme0n1 --bs 4096 --count 16 00:33:27.897 16:12:33 chaining -- bdev/chaining.sh@33 -- # echo '{ 00:33:27.897 "subsystems": [ 00:33:27.897 { 00:33:27.897 "subsystem": "bdev", 00:33:27.897 "config": [ 00:33:27.897 { 00:33:27.897 "method": "bdev_nvme_attach_controller", 00:33:27.897 "params": { 00:33:27.897 "trtype": "tcp", 00:33:27.897 "adrfam": "IPv4", 00:33:27.897 "name": "Nvme0", 00:33:27.897 "subnqn": "nqn.2016-06.io.spdk:cnode0", 00:33:27.897 "traddr": "10.0.0.2", 00:33:27.897 "trsvcid": "4420" 00:33:27.897 } 00:33:27.897 }, 00:33:27.897 { 00:33:27.897 "method": "bdev_set_options", 00:33:27.897 "params": { 00:33:27.897 "bdev_auto_examine": false 00:33:27.897 } 00:33:27.897 } 00:33:27.897 ] 00:33:27.897 } 00:33:27.897 ] 00:33:27.897 }' 00:33:27.897 [2024-06-10 16:12:33.305467] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:33:27.897 [2024-06-10 16:12:33.305523] [ DPDK EAL parameters: spdk_dd --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2890972 ] 00:33:28.155 [2024-06-10 16:12:33.406981] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:28.155 [2024-06-10 16:12:33.494976] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:33:28.670  Copying: 64/64 [kB] (average 496 kBps) 00:33:28.670 00:33:28.670 16:12:34 chaining -- bdev/chaining.sh@119 -- # get_stat sequence_executed 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@39 -- # opcode= 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@41 -- # rpc_cmd accel_get_stats 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:33:28.671 16:12:34 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:28.671 16:12:34 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:28.671 16:12:34 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@119 -- # (( 47 == stats[sequence_executed] + 16 )) 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@120 -- # get_stat executed encrypt 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:33:28.671 
16:12:34 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:33:28.671 16:12:34 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:28.671 16:12:34 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:28.671 16:12:34 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@120 -- # (( 36 == stats[encrypt_executed] )) 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@121 -- # get_stat executed decrypt 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:33:28.671 16:12:34 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:28.671 16:12:34 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:28.671 16:12:34 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@121 -- # (( 46 == stats[decrypt_executed] + 32 )) 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@122 -- # get_stat executed copy 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@39 -- # opcode=copy 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_cmd 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@40 -- # [[ -z copy ]] 
00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "copy").executed' 00:33:28.671 16:12:34 chaining -- bdev/chaining.sh@43 -- # rpc_cmd accel_get_stats 00:33:28.671 16:12:34 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:28.671 16:12:34 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:28.929 16:12:34 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:28.929 16:12:34 chaining -- bdev/chaining.sh@122 -- # (( 4 == stats[copy_executed] )) 00:33:28.929 16:12:34 chaining -- bdev/chaining.sh@123 -- # cmp /tmp/tmp.orpzzKwSPk /tmp/tmp.Vh2PO13sou 00:33:28.929 16:12:34 chaining -- bdev/chaining.sh@125 -- # trap - SIGINT SIGTERM EXIT 00:33:28.929 16:12:34 chaining -- bdev/chaining.sh@126 -- # tgtcleanup 00:33:28.929 16:12:34 chaining -- bdev/chaining.sh@58 -- # rm -f /tmp/tmp.orpzzKwSPk /tmp/tmp.Vh2PO13sou 00:33:28.929 16:12:34 chaining -- bdev/chaining.sh@59 -- # nvmftestfini 00:33:28.929 16:12:34 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:33:28.929 16:12:34 chaining -- nvmf/common.sh@117 -- # sync 00:33:28.929 16:12:34 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:33:28.929 16:12:34 chaining -- nvmf/common.sh@120 -- # set +e 00:33:28.929 16:12:34 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:33:28.929 16:12:34 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:33:28.929 rmmod nvme_tcp 00:33:28.929 rmmod nvme_fabrics 00:33:28.929 rmmod nvme_keyring 00:33:28.929 16:12:34 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:33:28.929 16:12:34 chaining -- nvmf/common.sh@124 -- # set -e 00:33:28.929 16:12:34 chaining -- nvmf/common.sh@125 -- # return 0 00:33:28.929 16:12:34 chaining -- nvmf/common.sh@489 -- # '[' -n 2889913 ']' 00:33:28.929 16:12:34 chaining -- nvmf/common.sh@490 -- # killprocess 2889913 00:33:28.929 16:12:34 chaining -- common/autotest_common.sh@949 -- # '[' -z 2889913 ']' 00:33:28.929 16:12:34 chaining -- 
common/autotest_common.sh@953 -- # kill -0 2889913 00:33:28.929 16:12:34 chaining -- common/autotest_common.sh@954 -- # uname 00:33:28.929 16:12:34 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:33:28.929 16:12:34 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2889913 00:33:28.929 16:12:34 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:33:28.929 16:12:34 chaining -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:33:28.929 16:12:34 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2889913' 00:33:28.929 killing process with pid 2889913 00:33:28.929 16:12:34 chaining -- common/autotest_common.sh@968 -- # kill 2889913 00:33:28.929 16:12:34 chaining -- common/autotest_common.sh@973 -- # wait 2889913 00:33:29.188 16:12:34 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:33:29.188 16:12:34 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:33:29.188 16:12:34 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:33:29.188 16:12:34 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:33:29.188 16:12:34 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:33:29.188 16:12:34 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:33:29.188 16:12:34 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:33:29.188 16:12:34 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:33:31.091 16:12:36 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:33:31.091 16:12:36 chaining -- bdev/chaining.sh@129 -- # trap 'bperfcleanup; exit 1' SIGINT SIGTERM EXIT 00:33:31.091 16:12:36 chaining -- bdev/chaining.sh@132 -- # bperfpid=2891689 00:33:31.091 16:12:36 chaining -- bdev/chaining.sh@134 -- # waitforlisten 2891689 00:33:31.091 16:12:36 chaining -- bdev/chaining.sh@131 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 
5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:33:31.091 16:12:36 chaining -- common/autotest_common.sh@830 -- # '[' -z 2891689 ']' 00:33:31.091 16:12:36 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:31.091 16:12:36 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:33:31.091 16:12:36 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:31.091 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:31.091 16:12:36 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:33:31.091 16:12:36 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:31.350 [2024-06-10 16:12:36.651819] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:33:31.350 [2024-06-10 16:12:36.651877] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2891689 ] 00:33:31.350 [2024-06-10 16:12:36.751007] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:31.350 [2024-06-10 16:12:36.846523] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:33:32.285 16:12:37 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:33:32.285 16:12:37 chaining -- common/autotest_common.sh@863 -- # return 0 00:33:32.285 16:12:37 chaining -- bdev/chaining.sh@135 -- # rpc_cmd 00:33:32.285 16:12:37 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:32.285 16:12:37 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:32.285 malloc0 00:33:32.285 true 00:33:32.285 true 00:33:32.285 [2024-06-10 16:12:37.753059] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:33:32.285 crypto0 00:33:32.285 [2024-06-10 16:12:37.761085] vbdev_crypto_rpc.c: 
115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:33:32.285 crypto1 00:33:32.285 16:12:37 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:32.285 16:12:37 chaining -- bdev/chaining.sh@145 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:33:32.544 Running I/O for 5 seconds... 00:33:37.846 00:33:37.846 Latency(us) 00:33:37.846 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:37.846 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:33:37.846 Verification LBA range: start 0x0 length 0x2000 00:33:37.846 crypto1 : 5.01 10493.02 40.99 0.00 0.00 24316.30 3776.12 15603.81 00:33:37.846 =================================================================================================================== 00:33:37.846 Total : 10493.02 40.99 0.00 0.00 24316.30 3776.12 15603.81 00:33:37.846 0 00:33:37.846 16:12:42 chaining -- bdev/chaining.sh@146 -- # killprocess 2891689 00:33:37.846 16:12:42 chaining -- common/autotest_common.sh@949 -- # '[' -z 2891689 ']' 00:33:37.846 16:12:42 chaining -- common/autotest_common.sh@953 -- # kill -0 2891689 00:33:37.846 16:12:42 chaining -- common/autotest_common.sh@954 -- # uname 00:33:37.846 16:12:42 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:33:37.846 16:12:42 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2891689 00:33:37.846 16:12:42 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:33:37.846 16:12:42 chaining -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:33:37.846 16:12:42 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2891689' 00:33:37.846 killing process with pid 2891689 00:33:37.846 16:12:42 chaining -- common/autotest_common.sh@968 -- # kill 2891689 00:33:37.846 Received shutdown signal, test time was about 5.000000 seconds 00:33:37.846 00:33:37.846 Latency(us) 00:33:37.846 Device 
Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:37.846 =================================================================================================================== 00:33:37.846 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:33:37.846 16:12:42 chaining -- common/autotest_common.sh@973 -- # wait 2891689 00:33:37.846 16:12:43 chaining -- bdev/chaining.sh@152 -- # bperfpid=2892616 00:33:37.846 16:12:43 chaining -- bdev/chaining.sh@151 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:33:37.846 16:12:43 chaining -- bdev/chaining.sh@154 -- # waitforlisten 2892616 00:33:37.846 16:12:43 chaining -- common/autotest_common.sh@830 -- # '[' -z 2892616 ']' 00:33:37.846 16:12:43 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:37.846 16:12:43 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:33:37.846 16:12:43 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:37.846 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:37.846 16:12:43 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:33:37.846 16:12:43 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:37.846 [2024-06-10 16:12:43.227761] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:33:37.846 [2024-06-10 16:12:43.227819] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2892616 ] 00:33:37.846 [2024-06-10 16:12:43.328016] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:38.105 [2024-06-10 16:12:43.423772] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:33:38.672 16:12:44 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:33:38.672 16:12:44 chaining -- common/autotest_common.sh@863 -- # return 0 00:33:38.672 16:12:44 chaining -- bdev/chaining.sh@155 -- # rpc_cmd 00:33:38.672 16:12:44 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:38.672 16:12:44 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:38.930 malloc0 00:33:38.930 true 00:33:38.930 true 00:33:38.930 [2024-06-10 16:12:44.319033] vbdev_passthru.c: 607:vbdev_passthru_register: *NOTICE*: Match on malloc0 00:33:38.930 [2024-06-10 16:12:44.319079] vbdev_passthru.c: 635:vbdev_passthru_register: *NOTICE*: base bdev opened 00:33:38.930 [2024-06-10 16:12:44.319097] vbdev_passthru.c: 680:vbdev_passthru_register: *NOTICE*: io_device created at: 0x0x13c6290 00:33:38.930 [2024-06-10 16:12:44.319107] vbdev_passthru.c: 695:vbdev_passthru_register: *NOTICE*: bdev claimed 00:33:38.931 [2024-06-10 16:12:44.320223] vbdev_passthru.c: 708:vbdev_passthru_register: *NOTICE*: pt_bdev registered 00:33:38.931 [2024-06-10 16:12:44.320247] vbdev_passthru.c: 709:vbdev_passthru_register: *NOTICE*: created pt_bdev for: pt0 00:33:38.931 pt0 00:33:38.931 [2024-06-10 16:12:44.327064] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:33:38.931 crypto0 00:33:38.931 [2024-06-10 16:12:44.335083] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key1" 00:33:38.931 crypto1 00:33:38.931 16:12:44 chaining -- 
common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:38.931 16:12:44 chaining -- bdev/chaining.sh@166 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py perform_tests 00:33:39.189 Running I/O for 5 seconds... 00:33:44.467 00:33:44.467 Latency(us) 00:33:44.467 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:44.467 Job: crypto1 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:33:44.467 Verification LBA range: start 0x0 length 0x2000 00:33:44.467 crypto1 : 5.02 8314.03 32.48 0.00 0.00 30702.99 7084.13 18474.91 00:33:44.467 =================================================================================================================== 00:33:44.467 Total : 8314.03 32.48 0.00 0.00 30702.99 7084.13 18474.91 00:33:44.467 0 00:33:44.467 16:12:49 chaining -- bdev/chaining.sh@167 -- # killprocess 2892616 00:33:44.467 16:12:49 chaining -- common/autotest_common.sh@949 -- # '[' -z 2892616 ']' 00:33:44.467 16:12:49 chaining -- common/autotest_common.sh@953 -- # kill -0 2892616 00:33:44.467 16:12:49 chaining -- common/autotest_common.sh@954 -- # uname 00:33:44.467 16:12:49 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:33:44.467 16:12:49 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2892616 00:33:44.468 16:12:49 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:33:44.468 16:12:49 chaining -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:33:44.468 16:12:49 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2892616' 00:33:44.468 killing process with pid 2892616 00:33:44.468 16:12:49 chaining -- common/autotest_common.sh@968 -- # kill 2892616 00:33:44.468 Received shutdown signal, test time was about 5.000000 seconds 00:33:44.468 00:33:44.468 Latency(us) 00:33:44.468 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:44.468 
=================================================================================================================== 00:33:44.468 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:33:44.468 16:12:49 chaining -- common/autotest_common.sh@973 -- # wait 2892616 00:33:44.468 16:12:49 chaining -- bdev/chaining.sh@169 -- # trap - SIGINT SIGTERM EXIT 00:33:44.468 16:12:49 chaining -- bdev/chaining.sh@170 -- # killprocess 2892616 00:33:44.468 16:12:49 chaining -- common/autotest_common.sh@949 -- # '[' -z 2892616 ']' 00:33:44.468 16:12:49 chaining -- common/autotest_common.sh@953 -- # kill -0 2892616 00:33:44.468 /var/jenkins/workspace/crypto-phy-autotest/spdk/test/common/autotest_common.sh: line 953: kill: (2892616) - No such process 00:33:44.468 16:12:49 chaining -- common/autotest_common.sh@976 -- # echo 'Process with pid 2892616 is not found' 00:33:44.468 Process with pid 2892616 is not found 00:33:44.468 16:12:49 chaining -- bdev/chaining.sh@171 -- # wait 2892616 00:33:44.468 16:12:49 chaining -- bdev/chaining.sh@175 -- # nvmftestinit 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@441 -- # '[' -z tcp ']' 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@446 -- # trap nvmftestfini SIGINT SIGTERM EXIT 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@448 -- # prepare_net_devs 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@410 -- # local -g is_hw=no 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@412 -- # remove_spdk_ns 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:33:44.468 16:12:49 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:33:44.468 16:12:49 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@414 -- # [[ phy-fallback != virt ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@414 -- # gather_supported_nvmf_pci_devs 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@285 -- # xtrace_disable 00:33:44.468 16:12:49 chaining 
-- common/autotest_common.sh@10 -- # set +x 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@289 -- # local intel=0x8086 mellanox=0x15b3 pci net_dev 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@291 -- # pci_devs=() 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@291 -- # local -a pci_devs 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@292 -- # pci_net_devs=() 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@292 -- # local -a pci_net_devs 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@293 -- # pci_drivers=() 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@293 -- # local -A pci_drivers 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@295 -- # net_devs=() 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@295 -- # local -ga net_devs 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@296 -- # e810=() 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@296 -- # local -ga e810 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@297 -- # x722=() 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@297 -- # local -ga x722 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@298 -- # mlx=() 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@298 -- # local -ga mlx 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@301 -- # e810+=(${pci_bus_cache["$intel:0x1592"]}) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@302 -- # e810+=(${pci_bus_cache["$intel:0x159b"]}) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@304 -- # x722+=(${pci_bus_cache["$intel:0x37d2"]}) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@306 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2dc"]}) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@308 -- # mlx+=(${pci_bus_cache["$mellanox:0x1021"]}) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@310 -- # mlx+=(${pci_bus_cache["$mellanox:0xa2d6"]}) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@312 -- # mlx+=(${pci_bus_cache["$mellanox:0x101d"]}) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@314 -- # mlx+=(${pci_bus_cache["$mellanox:0x1017"]}) 00:33:44.468 16:12:49 
chaining -- nvmf/common.sh@315 -- # mlx+=(${pci_bus_cache["$mellanox:0x1019"]}) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@317 -- # mlx+=(${pci_bus_cache["$mellanox:0x1015"]}) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@318 -- # mlx+=(${pci_bus_cache["$mellanox:0x1013"]}) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@320 -- # pci_devs+=("${e810[@]}") 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@321 -- # [[ tcp == rdma ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@327 -- # [[ '' == mlx5 ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@329 -- # [[ '' == e810 ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@331 -- # [[ '' == x722 ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@335 -- # (( 2 == 0 )) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:af:00.0 (0x8086 - 0x159b)' 00:33:44.468 Found 0000:af:00.0 (0x8086 - 0x159b) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@340 -- # for pci in "${pci_devs[@]}" 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@341 -- # echo 'Found 0000:af:00.1 (0x8086 - 0x159b)' 00:33:44.468 Found 0000:af:00.1 (0x8086 - 0x159b) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@342 -- # [[ ice == unknown ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@346 -- # [[ ice == unbound ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@350 -- # [[ 0x159b == \0\x\1\0\1\7 ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@351 -- # [[ 0x159b == \0\x\1\0\1\9 ]] 00:33:44.468 16:12:49 chaining -- 
nvmf/common.sh@352 -- # [[ tcp == rdma ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@366 -- # (( 0 > 0 )) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@372 -- # [[ '' == e810 ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:af:00.0: cvl_0_0' 00:33:44.468 Found net devices under 0000:af:00.0: cvl_0_0 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@382 -- # for pci in "${pci_devs[@]}" 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@383 -- # pci_net_devs=("/sys/bus/pci/devices/$pci/net/"*) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@388 -- # [[ tcp == tcp ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@389 -- # for net_dev in "${!pci_net_devs[@]}" 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@390 -- # [[ up == up ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@394 -- # (( 1 == 0 )) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@399 -- # pci_net_devs=("${pci_net_devs[@]##*/}") 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@400 -- # echo 'Found net devices under 0000:af:00.1: cvl_0_1' 00:33:44.468 Found net devices under 0000:af:00.1: cvl_0_1 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@401 -- # net_devs+=("${pci_net_devs[@]}") 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@404 -- # (( 2 == 0 )) 
00:33:44.468 16:12:49 chaining -- nvmf/common.sh@414 -- # is_hw=yes 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@416 -- # [[ yes == yes ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@417 -- # [[ tcp == tcp ]] 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@418 -- # nvmf_tcp_init 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@229 -- # NVMF_INITIATOR_IP=10.0.0.1 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@230 -- # NVMF_FIRST_TARGET_IP=10.0.0.2 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@231 -- # TCP_INTERFACE_LIST=("${net_devs[@]}") 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@234 -- # (( 2 > 1 )) 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@236 -- # NVMF_TARGET_INTERFACE=cvl_0_0 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@237 -- # NVMF_INITIATOR_INTERFACE=cvl_0_1 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@240 -- # NVMF_SECOND_TARGET_IP= 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@242 -- # NVMF_TARGET_NAMESPACE=cvl_0_0_ns_spdk 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@243 -- # NVMF_TARGET_NS_CMD=(ip netns exec "$NVMF_TARGET_NAMESPACE") 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@244 -- # ip -4 addr flush cvl_0_0 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@245 -- # ip -4 addr flush cvl_0_1 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@248 -- # ip netns add cvl_0_0_ns_spdk 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@251 -- # ip link set cvl_0_0 netns cvl_0_0_ns_spdk 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@254 -- # ip addr add 10.0.0.1/24 dev cvl_0_1 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@255 -- # ip netns exec cvl_0_0_ns_spdk ip addr add 10.0.0.2/24 dev cvl_0_0 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@258 -- # ip link set cvl_0_1 up 00:33:44.468 16:12:49 chaining -- nvmf/common.sh@260 -- # ip netns exec cvl_0_0_ns_spdk ip link set cvl_0_0 up 00:33:44.726 16:12:49 chaining -- nvmf/common.sh@261 -- # ip netns exec cvl_0_0_ns_spdk ip link set lo up 00:33:44.726 16:12:49 
chaining -- nvmf/common.sh@264 -- # iptables -I INPUT 1 -i cvl_0_1 -p tcp --dport 4420 -j ACCEPT 00:33:44.726 16:12:49 chaining -- nvmf/common.sh@267 -- # ping -c 1 10.0.0.2 00:33:44.726 PING 10.0.0.2 (10.0.0.2) 56(84) bytes of data. 00:33:44.726 64 bytes from 10.0.0.2: icmp_seq=1 ttl=64 time=0.194 ms 00:33:44.726 00:33:44.726 --- 10.0.0.2 ping statistics --- 00:33:44.726 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:33:44.726 rtt min/avg/max/mdev = 0.194/0.194/0.194/0.000 ms 00:33:44.726 16:12:50 chaining -- nvmf/common.sh@268 -- # ip netns exec cvl_0_0_ns_spdk ping -c 1 10.0.0.1 00:33:44.726 PING 10.0.0.1 (10.0.0.1) 56(84) bytes of data. 00:33:44.726 64 bytes from 10.0.0.1: icmp_seq=1 ttl=64 time=0.138 ms 00:33:44.726 00:33:44.726 --- 10.0.0.1 ping statistics --- 00:33:44.726 1 packets transmitted, 1 received, 0% packet loss, time 0ms 00:33:44.726 rtt min/avg/max/mdev = 0.138/0.138/0.138/0.000 ms 00:33:44.726 16:12:50 chaining -- nvmf/common.sh@270 -- # NVMF_APP=("${NVMF_TARGET_NS_CMD[@]}" "${NVMF_APP[@]}") 00:33:44.726 16:12:50 chaining -- nvmf/common.sh@422 -- # return 0 00:33:44.726 16:12:50 chaining -- nvmf/common.sh@450 -- # '[' '' == iso ']' 00:33:44.726 16:12:50 chaining -- nvmf/common.sh@454 -- # NVMF_TRANSPORT_OPTS='-t tcp' 00:33:44.726 16:12:50 chaining -- nvmf/common.sh@455 -- # [[ tcp == \r\d\m\a ]] 00:33:44.726 16:12:50 chaining -- nvmf/common.sh@464 -- # [[ tcp == \t\c\p ]] 00:33:44.726 16:12:50 chaining -- nvmf/common.sh@465 -- # NVMF_TRANSPORT_OPTS='-t tcp -o' 00:33:44.726 16:12:50 chaining -- nvmf/common.sh@468 -- # '[' tcp == tcp ']' 00:33:44.726 16:12:50 chaining -- nvmf/common.sh@474 -- # modprobe nvme-tcp 00:33:44.726 16:12:50 chaining -- bdev/chaining.sh@176 -- # nvmfappstart -m 0x2 00:33:44.726 16:12:50 chaining -- nvmf/common.sh@479 -- # timing_enter start_nvmf_tgt 00:33:44.726 16:12:50 chaining -- common/autotest_common.sh@723 -- # xtrace_disable 00:33:44.726 16:12:50 chaining -- common/autotest_common.sh@10 -- # set +x 
00:33:44.726 16:12:50 chaining -- nvmf/common.sh@481 -- # nvmfpid=2893784 00:33:44.726 16:12:50 chaining -- nvmf/common.sh@482 -- # waitforlisten 2893784 00:33:44.726 16:12:50 chaining -- nvmf/common.sh@480 -- # ip netns exec cvl_0_0_ns_spdk ip netns exec cvl_0_0_ns_spdk /var/jenkins/workspace/crypto-phy-autotest/spdk/build/bin/nvmf_tgt -i 0 -e 0xFFFF -m 0x2 00:33:44.726 16:12:50 chaining -- common/autotest_common.sh@830 -- # '[' -z 2893784 ']' 00:33:44.726 16:12:50 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/spdk.sock 00:33:44.726 16:12:50 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:33:44.726 16:12:50 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock...' 00:33:44.726 Waiting for process to start up and listen on UNIX domain socket /var/tmp/spdk.sock... 00:33:44.726 16:12:50 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:33:44.726 16:12:50 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:44.726 [2024-06-10 16:12:50.124612] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:33:44.727 [2024-06-10 16:12:50.124671] [ DPDK EAL parameters: nvmf -c 0x2 --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk0 --proc-type=auto ] 00:33:44.727 [2024-06-10 16:12:50.222307] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:44.984 [2024-06-10 16:12:50.317204] app.c: 604:app_setup_trace: *NOTICE*: Tracepoint Group Mask 0xFFFF specified. 00:33:44.984 [2024-06-10 16:12:50.317245] app.c: 605:app_setup_trace: *NOTICE*: Use 'spdk_trace -s nvmf -i 0' to capture a snapshot of events at runtime. 
00:33:44.984 [2024-06-10 16:12:50.317255] app.c: 610:app_setup_trace: *NOTICE*: 'spdk_trace' without parameters will also work if this is the only 00:33:44.984 [2024-06-10 16:12:50.317264] app.c: 611:app_setup_trace: *NOTICE*: SPDK application currently running. 00:33:44.984 [2024-06-10 16:12:50.317276] app.c: 612:app_setup_trace: *NOTICE*: Or copy /dev/shm/nvmf_trace.0 for offline analysis/debug. 00:33:44.984 [2024-06-10 16:12:50.317298] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 1 00:33:45.916 16:12:51 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:33:45.916 16:12:51 chaining -- common/autotest_common.sh@863 -- # return 0 00:33:45.916 16:12:51 chaining -- nvmf/common.sh@483 -- # timing_exit start_nvmf_tgt 00:33:45.916 16:12:51 chaining -- common/autotest_common.sh@729 -- # xtrace_disable 00:33:45.916 16:12:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:45.916 16:12:51 chaining -- nvmf/common.sh@484 -- # trap 'process_shm --id $NVMF_APP_SHM_ID || :; nvmftestfini' SIGINT SIGTERM EXIT 00:33:45.916 16:12:51 chaining -- bdev/chaining.sh@178 -- # rpc_cmd 00:33:45.916 16:12:51 chaining -- common/autotest_common.sh@560 -- # xtrace_disable 00:33:45.916 16:12:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:45.916 malloc0 00:33:45.916 [2024-06-10 16:12:51.127830] tcp.c: 724:nvmf_tcp_create: *NOTICE*: *** TCP Transport Init *** 00:33:45.916 [2024-06-10 16:12:51.144049] tcp.c:1053:nvmf_tcp_listen: *NOTICE*: *** NVMe/TCP Target Listening on 10.0.0.2 port 4420 *** 00:33:45.916 16:12:51 chaining -- common/autotest_common.sh@588 -- # [[ 0 == 0 ]] 00:33:45.916 16:12:51 chaining -- bdev/chaining.sh@186 -- # trap 'bperfcleanup || :; nvmftestfini || :; exit 1' SIGINT SIGTERM EXIT 00:33:45.916 16:12:51 chaining -- bdev/chaining.sh@189 -- # bperfpid=2894025 00:33:45.916 16:12:51 chaining -- bdev/chaining.sh@191 -- # waitforlisten 2894025 /var/tmp/bperf.sock 00:33:45.916 16:12:51 chaining -- bdev/chaining.sh@187 -- # 
/var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 4096 -q 256 --wait-for-rpc -z 00:33:45.916 16:12:51 chaining -- common/autotest_common.sh@830 -- # '[' -z 2894025 ']' 00:33:45.916 16:12:51 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/bperf.sock 00:33:45.916 16:12:51 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:33:45.916 16:12:51 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:33:45.916 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:33:45.916 16:12:51 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:33:45.916 16:12:51 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:45.916 [2024-06-10 16:12:51.214465] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 00:33:45.916 [2024-06-10 16:12:51.214520] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2894025 ] 00:33:45.916 [2024-06-10 16:12:51.311116] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:45.916 [2024-06-10 16:12:51.405002] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:33:46.849 16:12:52 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:33:46.849 16:12:52 chaining -- common/autotest_common.sh@863 -- # return 0 00:33:46.849 16:12:52 chaining -- bdev/chaining.sh@192 -- # rpc_bperf 00:33:46.849 16:12:52 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:33:47.107 [2024-06-10 16:12:52.507881] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:33:47.107 nvme0n1 00:33:47.107 true 00:33:47.107 
crypto0 00:33:47.107 16:12:52 chaining -- bdev/chaining.sh@201 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:33:47.107 Running I/O for 5 seconds... 00:33:52.378 00:33:52.378 Latency(us) 00:33:52.378 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:52.378 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 256, IO size: 4096) 00:33:52.378 Verification LBA range: start 0x0 length 0x2000 00:33:52.378 crypto0 : 5.02 7790.87 30.43 0.00 0.00 32746.88 4244.24 27712.37 00:33:52.379 =================================================================================================================== 00:33:52.379 Total : 7790.87 30.43 0.00 0.00 32746.88 4244.24 27712.37 00:33:52.379 0 00:33:52.379 16:12:57 chaining -- bdev/chaining.sh@205 -- # get_stat_bperf sequence_executed 00:33:52.379 16:12:57 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:33:52.379 16:12:57 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:52.379 16:12:57 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:33:52.379 16:12:57 chaining -- bdev/chaining.sh@39 -- # opcode= 00:33:52.379 16:12:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:33:52.379 16:12:57 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:33:52.379 16:12:57 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:33:52.379 16:12:57 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:33:52.379 16:12:57 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:33:52.639 16:12:57 chaining -- bdev/chaining.sh@205 -- # sequence=78244 00:33:52.639 16:12:57 chaining -- bdev/chaining.sh@206 -- # get_stat_bperf executed encrypt 00:33:52.639 16:12:57 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:33:52.639 16:12:57 chaining -- 
bdev/chaining.sh@37 -- # local event opcode rpc 00:33:52.639 16:12:57 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:52.639 16:12:57 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:33:52.639 16:12:57 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:33:52.639 16:12:57 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:33:52.639 16:12:57 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:33:52.639 16:12:57 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:33:52.639 16:12:57 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:33:52.639 16:12:58 chaining -- bdev/chaining.sh@206 -- # encrypt=39122 00:33:52.639 16:12:58 chaining -- bdev/chaining.sh@207 -- # get_stat_bperf executed decrypt 00:33:52.639 16:12:58 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:33:52.639 16:12:58 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:52.639 16:12:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:52.639 16:12:58 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:33:52.639 16:12:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:33:52.639 16:12:58 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:33:52.639 16:12:58 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:33:52.639 16:12:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:33:52.639 16:12:58 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:33:52.897 16:12:58 chaining -- bdev/chaining.sh@207 -- # decrypt=39122 00:33:52.897 16:12:58 chaining -- bdev/chaining.sh@208 -- # get_stat_bperf executed crc32c 00:33:52.898 16:12:58 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:33:52.898 16:12:58 
chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:33:52.898 16:12:58 chaining -- bdev/chaining.sh@39 -- # event=executed 00:33:52.898 16:12:58 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:33:52.898 16:12:58 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:33:52.898 16:12:58 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:33:52.898 16:12:58 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:33:52.898 16:12:58 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:33:52.898 16:12:58 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:33:53.156 16:12:58 chaining -- bdev/chaining.sh@208 -- # crc32c=78244 00:33:53.156 16:12:58 chaining -- bdev/chaining.sh@210 -- # (( sequence > 0 )) 00:33:53.156 16:12:58 chaining -- bdev/chaining.sh@211 -- # (( encrypt + decrypt == sequence )) 00:33:53.156 16:12:58 chaining -- bdev/chaining.sh@212 -- # (( encrypt + decrypt == crc32c )) 00:33:53.156 16:12:58 chaining -- bdev/chaining.sh@214 -- # killprocess 2894025 00:33:53.156 16:12:58 chaining -- common/autotest_common.sh@949 -- # '[' -z 2894025 ']' 00:33:53.156 16:12:58 chaining -- common/autotest_common.sh@953 -- # kill -0 2894025 00:33:53.156 16:12:58 chaining -- common/autotest_common.sh@954 -- # uname 00:33:53.156 16:12:58 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:33:53.156 16:12:58 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2894025 00:33:53.415 16:12:58 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:33:53.415 16:12:58 chaining -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:33:53.415 16:12:58 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2894025' 00:33:53.415 killing process with pid 2894025 00:33:53.415 16:12:58 chaining -- common/autotest_common.sh@968 -- # kill 
2894025 00:33:53.415 Received shutdown signal, test time was about 5.000000 seconds 00:33:53.415 00:33:53.415 Latency(us) 00:33:53.415 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:33:53.415 =================================================================================================================== 00:33:53.415 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:33:53.415 16:12:58 chaining -- common/autotest_common.sh@973 -- # wait 2894025 00:33:53.415 16:12:58 chaining -- bdev/chaining.sh@219 -- # bperfpid=2895175 00:33:53.415 16:12:58 chaining -- bdev/chaining.sh@221 -- # waitforlisten 2895175 /var/tmp/bperf.sock 00:33:53.415 16:12:58 chaining -- common/autotest_common.sh@830 -- # '[' -z 2895175 ']' 00:33:53.415 16:12:58 chaining -- common/autotest_common.sh@834 -- # local rpc_addr=/var/tmp/bperf.sock 00:33:53.415 16:12:58 chaining -- common/autotest_common.sh@835 -- # local max_retries=100 00:33:53.415 16:12:58 chaining -- common/autotest_common.sh@837 -- # echo 'Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock...' 00:33:53.415 Waiting for process to start up and listen on UNIX domain socket /var/tmp/bperf.sock... 00:33:53.415 16:12:58 chaining -- bdev/chaining.sh@217 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/build/examples/bdevperf -r /var/tmp/bperf.sock -t 5 -w verify -o 65536 -q 32 --wait-for-rpc -z 00:33:53.415 16:12:58 chaining -- common/autotest_common.sh@839 -- # xtrace_disable 00:33:53.415 16:12:58 chaining -- common/autotest_common.sh@10 -- # set +x 00:33:53.673 [2024-06-10 16:12:58.954746] Starting SPDK v24.09-pre git sha1 8d1bffc3d / DPDK 24.03.0 initialization... 
00:33:53.674 [2024-06-10 16:12:58.954805] [ DPDK EAL parameters: bdevperf --no-shconf -c 0x1 --huge-unlink --no-telemetry --log-level=lib.eal:6 --log-level=lib.cryptodev:5 --log-level=user1:6 --base-virtaddr=0x200000000000 --match-allocations --file-prefix=spdk_pid2895175 ] 00:33:53.674 [2024-06-10 16:12:59.055341] app.c: 909:spdk_app_start: *NOTICE*: Total cores available: 1 00:33:53.674 [2024-06-10 16:12:59.148571] reactor.c: 929:reactor_run: *NOTICE*: Reactor started on core 0 00:33:54.609 16:12:59 chaining -- common/autotest_common.sh@859 -- # (( i == 0 )) 00:33:54.609 16:12:59 chaining -- common/autotest_common.sh@863 -- # return 0 00:33:54.609 16:12:59 chaining -- bdev/chaining.sh@222 -- # rpc_bperf 00:33:54.609 16:12:59 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock 00:33:54.867 [2024-06-10 16:13:00.315661] vbdev_crypto_rpc.c: 115:rpc_bdev_crypto_create: *NOTICE*: Found key "key0" 00:33:54.867 nvme0n1 00:33:54.867 true 00:33:54.867 crypto0 00:33:54.867 16:13:00 chaining -- bdev/chaining.sh@231 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/examples/bdev/bdevperf/bdevperf.py -s /var/tmp/bperf.sock perform_tests 00:33:55.126 Running I/O for 5 seconds... 
00:34:00.395 00:34:00.395 Latency(us) 00:34:00.395 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:00.395 Job: crypto0 (Core Mask 0x1, workload: verify, depth: 32, IO size: 65536) 00:34:00.395 Verification LBA range: start 0x0 length 0x200 00:34:00.395 crypto0 : 5.01 1600.70 100.04 0.00 0.00 19577.57 1209.30 20846.69 00:34:00.395 =================================================================================================================== 00:34:00.395 Total : 1600.70 100.04 0.00 0.00 19577.57 1209.30 20846.69 00:34:00.395 0 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@233 -- # get_stat_bperf sequence_executed 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@48 -- # get_stat sequence_executed '' rpc_bperf 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@39 -- # event=sequence_executed 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@39 -- # opcode= 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@40 -- # [[ -z '' ]] 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@41 -- # rpc_bperf accel_get_stats 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@41 -- # jq -r .sequence_executed 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@233 -- # sequence=16028 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@234 -- # get_stat_bperf executed encrypt 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@48 -- # get_stat executed encrypt rpc_bperf 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@39 -- # opcode=encrypt 00:34:00.395 16:13:05 chaining -- 
bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@40 -- # [[ -z encrypt ]] 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "encrypt").executed' 00:34:00.395 16:13:05 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:00.654 16:13:05 chaining -- bdev/chaining.sh@234 -- # encrypt=8014 00:34:00.655 16:13:05 chaining -- bdev/chaining.sh@235 -- # get_stat_bperf executed decrypt 00:34:00.655 16:13:05 chaining -- bdev/chaining.sh@48 -- # get_stat executed decrypt rpc_bperf 00:34:00.655 16:13:05 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:00.655 16:13:06 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:00.655 16:13:06 chaining -- bdev/chaining.sh@39 -- # opcode=decrypt 00:34:00.655 16:13:06 chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:00.655 16:13:06 chaining -- bdev/chaining.sh@40 -- # [[ -z decrypt ]] 00:34:00.655 16:13:06 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "decrypt").executed' 00:34:00.655 16:13:06 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:00.655 16:13:06 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:00.950 16:13:06 chaining -- bdev/chaining.sh@235 -- # decrypt=8014 00:34:00.950 16:13:06 chaining -- bdev/chaining.sh@236 -- # get_stat_bperf executed crc32c 00:34:00.950 16:13:06 chaining -- bdev/chaining.sh@48 -- # get_stat executed crc32c rpc_bperf 00:34:00.950 16:13:06 chaining -- bdev/chaining.sh@37 -- # local event opcode rpc 00:34:00.950 16:13:06 chaining -- bdev/chaining.sh@39 -- # event=executed 00:34:00.950 16:13:06 chaining -- bdev/chaining.sh@39 -- # opcode=crc32c 00:34:00.950 16:13:06 
chaining -- bdev/chaining.sh@39 -- # rpc=rpc_bperf 00:34:00.950 16:13:06 chaining -- bdev/chaining.sh@40 -- # [[ -z crc32c ]] 00:34:00.950 16:13:06 chaining -- bdev/chaining.sh@43 -- # rpc_bperf accel_get_stats 00:34:00.950 16:13:06 chaining -- bdev/chaining.sh@44 -- # jq -r '.operations[] | select(.opcode == "crc32c").executed' 00:34:00.950 16:13:06 chaining -- bdev/chaining.sh@22 -- # /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/rpc.py -s /var/tmp/bperf.sock accel_get_stats 00:34:01.209 16:13:06 chaining -- bdev/chaining.sh@236 -- # crc32c=16028 00:34:01.209 16:13:06 chaining -- bdev/chaining.sh@238 -- # (( sequence > 0 )) 00:34:01.209 16:13:06 chaining -- bdev/chaining.sh@239 -- # (( encrypt + decrypt == sequence )) 00:34:01.209 16:13:06 chaining -- bdev/chaining.sh@240 -- # (( encrypt + decrypt == crc32c )) 00:34:01.209 16:13:06 chaining -- bdev/chaining.sh@242 -- # killprocess 2895175 00:34:01.209 16:13:06 chaining -- common/autotest_common.sh@949 -- # '[' -z 2895175 ']' 00:34:01.209 16:13:06 chaining -- common/autotest_common.sh@953 -- # kill -0 2895175 00:34:01.209 16:13:06 chaining -- common/autotest_common.sh@954 -- # uname 00:34:01.209 16:13:06 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:34:01.209 16:13:06 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2895175 00:34:01.209 16:13:06 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_0 00:34:01.209 16:13:06 chaining -- common/autotest_common.sh@959 -- # '[' reactor_0 = sudo ']' 00:34:01.209 16:13:06 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2895175' 00:34:01.209 killing process with pid 2895175 00:34:01.209 16:13:06 chaining -- common/autotest_common.sh@968 -- # kill 2895175 00:34:01.209 Received shutdown signal, test time was about 5.000000 seconds 00:34:01.209 00:34:01.209 Latency(us) 00:34:01.209 Device Information : runtime(s) IOPS MiB/s Fail/s TO/s Average min max 00:34:01.209 
=================================================================================================================== 00:34:01.209 Total : 0.00 0.00 0.00 0.00 0.00 0.00 0.00 00:34:01.209 16:13:06 chaining -- common/autotest_common.sh@973 -- # wait 2895175 00:34:01.467 16:13:06 chaining -- bdev/chaining.sh@243 -- # nvmftestfini 00:34:01.467 16:13:06 chaining -- nvmf/common.sh@488 -- # nvmfcleanup 00:34:01.467 16:13:06 chaining -- nvmf/common.sh@117 -- # sync 00:34:01.467 16:13:06 chaining -- nvmf/common.sh@119 -- # '[' tcp == tcp ']' 00:34:01.467 16:13:06 chaining -- nvmf/common.sh@120 -- # set +e 00:34:01.467 16:13:06 chaining -- nvmf/common.sh@121 -- # for i in {1..20} 00:34:01.467 16:13:06 chaining -- nvmf/common.sh@122 -- # modprobe -v -r nvme-tcp 00:34:01.467 rmmod nvme_tcp 00:34:01.467 rmmod nvme_fabrics 00:34:01.467 rmmod nvme_keyring 00:34:01.467 16:13:06 chaining -- nvmf/common.sh@123 -- # modprobe -v -r nvme-fabrics 00:34:01.467 16:13:06 chaining -- nvmf/common.sh@124 -- # set -e 00:34:01.467 16:13:06 chaining -- nvmf/common.sh@125 -- # return 0 00:34:01.467 16:13:06 chaining -- nvmf/common.sh@489 -- # '[' -n 2893784 ']' 00:34:01.467 16:13:06 chaining -- nvmf/common.sh@490 -- # killprocess 2893784 00:34:01.467 16:13:06 chaining -- common/autotest_common.sh@949 -- # '[' -z 2893784 ']' 00:34:01.467 16:13:06 chaining -- common/autotest_common.sh@953 -- # kill -0 2893784 00:34:01.467 16:13:06 chaining -- common/autotest_common.sh@954 -- # uname 00:34:01.467 16:13:06 chaining -- common/autotest_common.sh@954 -- # '[' Linux = Linux ']' 00:34:01.467 16:13:06 chaining -- common/autotest_common.sh@955 -- # ps --no-headers -o comm= 2893784 00:34:01.467 16:13:06 chaining -- common/autotest_common.sh@955 -- # process_name=reactor_1 00:34:01.467 16:13:06 chaining -- common/autotest_common.sh@959 -- # '[' reactor_1 = sudo ']' 00:34:01.467 16:13:06 chaining -- common/autotest_common.sh@967 -- # echo 'killing process with pid 2893784' 00:34:01.467 killing process with pid 
2893784 00:34:01.467 16:13:06 chaining -- common/autotest_common.sh@968 -- # kill 2893784 00:34:01.467 16:13:06 chaining -- common/autotest_common.sh@973 -- # wait 2893784 00:34:01.726 16:13:07 chaining -- nvmf/common.sh@492 -- # '[' '' == iso ']' 00:34:01.726 16:13:07 chaining -- nvmf/common.sh@495 -- # [[ tcp == \t\c\p ]] 00:34:01.726 16:13:07 chaining -- nvmf/common.sh@496 -- # nvmf_tcp_fini 00:34:01.726 16:13:07 chaining -- nvmf/common.sh@274 -- # [[ cvl_0_0_ns_spdk == \n\v\m\f\_\t\g\t\_\n\s ]] 00:34:01.726 16:13:07 chaining -- nvmf/common.sh@278 -- # remove_spdk_ns 00:34:01.726 16:13:07 chaining -- nvmf/common.sh@628 -- # xtrace_disable_per_cmd _remove_spdk_ns 00:34:01.726 16:13:07 chaining -- common/autotest_common.sh@22 -- # eval '_remove_spdk_ns 13> /dev/null' 00:34:01.726 16:13:07 chaining -- common/autotest_common.sh@22 -- # _remove_spdk_ns 00:34:04.258 16:13:09 chaining -- nvmf/common.sh@279 -- # ip -4 addr flush cvl_0_1 00:34:04.258 16:13:09 chaining -- bdev/chaining.sh@245 -- # trap - SIGINT SIGTERM EXIT 00:34:04.258 00:34:04.258 real 0m48.319s 00:34:04.258 user 1m1.212s 00:34:04.258 sys 0m10.500s 00:34:04.258 16:13:09 chaining -- common/autotest_common.sh@1125 -- # xtrace_disable 00:34:04.258 16:13:09 chaining -- common/autotest_common.sh@10 -- # set +x 00:34:04.258 ************************************ 00:34:04.258 END TEST chaining 00:34:04.258 ************************************ 00:34:04.258 16:13:09 -- spdk/autotest.sh@363 -- # [[ 0 -eq 1 ]] 00:34:04.258 16:13:09 -- spdk/autotest.sh@367 -- # [[ 0 -eq 1 ]] 00:34:04.258 16:13:09 -- spdk/autotest.sh@371 -- # [[ 0 -eq 1 ]] 00:34:04.258 16:13:09 -- spdk/autotest.sh@375 -- # [[ 0 -eq 1 ]] 00:34:04.258 16:13:09 -- spdk/autotest.sh@380 -- # trap - SIGINT SIGTERM EXIT 00:34:04.258 16:13:09 -- spdk/autotest.sh@382 -- # timing_enter post_cleanup 00:34:04.258 16:13:09 -- common/autotest_common.sh@723 -- # xtrace_disable 00:34:04.258 16:13:09 -- common/autotest_common.sh@10 -- # set +x 00:34:04.258 16:13:09 -- 
spdk/autotest.sh@383 -- # autotest_cleanup 00:34:04.258 16:13:09 -- common/autotest_common.sh@1391 -- # local autotest_es=0 00:34:04.258 16:13:09 -- common/autotest_common.sh@1392 -- # xtrace_disable 00:34:04.258 16:13:09 -- common/autotest_common.sh@10 -- # set +x 00:34:08.455 INFO: APP EXITING 00:34:08.455 INFO: killing all VMs 00:34:08.455 INFO: killing vhost app 00:34:08.455 INFO: EXIT DONE 00:34:10.989 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:34:11.247 Waiting for block devices as requested 00:34:11.247 0000:5e:00.0 (8086 0a54): vfio-pci -> nvme 00:34:11.505 0000:00:04.7 (8086 2021): vfio-pci -> ioatdma 00:34:11.505 0000:00:04.6 (8086 2021): vfio-pci -> ioatdma 00:34:11.505 0000:00:04.5 (8086 2021): vfio-pci -> ioatdma 00:34:11.763 0000:00:04.4 (8086 2021): vfio-pci -> ioatdma 00:34:11.763 0000:00:04.3 (8086 2021): vfio-pci -> ioatdma 00:34:11.763 0000:00:04.2 (8086 2021): vfio-pci -> ioatdma 00:34:12.022 0000:00:04.1 (8086 2021): vfio-pci -> ioatdma 00:34:12.022 0000:00:04.0 (8086 2021): vfio-pci -> ioatdma 00:34:12.022 0000:80:04.7 (8086 2021): vfio-pci -> ioatdma 00:34:12.022 0000:80:04.6 (8086 2021): vfio-pci -> ioatdma 00:34:12.279 0000:80:04.5 (8086 2021): vfio-pci -> ioatdma 00:34:12.279 0000:80:04.4 (8086 2021): vfio-pci -> ioatdma 00:34:12.279 0000:80:04.3 (8086 2021): vfio-pci -> ioatdma 00:34:12.538 0000:80:04.2 (8086 2021): vfio-pci -> ioatdma 00:34:12.538 0000:80:04.1 (8086 2021): vfio-pci -> ioatdma 00:34:12.538 0000:80:04.0 (8086 2021): vfio-pci -> ioatdma 00:34:15.825 0000:5f:00.0 (1b96 2600): Skipping denied controller at 0000:5f:00.0 00:34:16.083 Cleaning 00:34:16.083 Removing: /var/run/dpdk/spdk0/config 00:34:16.083 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-0 00:34:16.083 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-1 00:34:16.083 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-2 00:34:16.083 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-0-3 00:34:16.083 Removing: 
/var/run/dpdk/spdk0/fbarray_memseg-2048k-1-0 00:34:16.083 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-1 00:34:16.083 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-2 00:34:16.083 Removing: /var/run/dpdk/spdk0/fbarray_memseg-2048k-1-3 00:34:16.083 Removing: /var/run/dpdk/spdk0/fbarray_memzone 00:34:16.083 Removing: /var/run/dpdk/spdk0/hugepage_info 00:34:16.083 Removing: /dev/shm/nvmf_trace.0 00:34:16.083 Removing: /dev/shm/spdk_tgt_trace.pid2583201 00:34:16.083 Removing: /var/run/dpdk/spdk0 00:34:16.083 Removing: /var/run/dpdk/spdk_pid2579619 00:34:16.083 Removing: /var/run/dpdk/spdk_pid2582143 00:34:16.083 Removing: /var/run/dpdk/spdk_pid2583201 00:34:16.083 Removing: /var/run/dpdk/spdk_pid2583825 00:34:16.083 Removing: /var/run/dpdk/spdk_pid2584760 00:34:16.083 Removing: /var/run/dpdk/spdk_pid2584991 00:34:16.083 Removing: /var/run/dpdk/spdk_pid2585945 00:34:16.083 Removing: /var/run/dpdk/spdk_pid2586175 00:34:16.083 Removing: /var/run/dpdk/spdk_pid2586510 00:34:16.083 Removing: /var/run/dpdk/spdk_pid2589399 00:34:16.083 Removing: /var/run/dpdk/spdk_pid2591457 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2591731 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2592418 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2592923 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2593202 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2593455 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2593700 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2593972 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2594653 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2597738 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2597989 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2598348 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2598684 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2598711 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2598947 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2599239 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2599485 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2599735 
00:34:16.342 Removing: /var/run/dpdk/spdk_pid2599983 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2600230 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2600479 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2600725 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2600974 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2601222 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2601531 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2601854 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2602170 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2602416 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2602668 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2602917 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2603165 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2603413 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2603667 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2603911 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2604169 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2604580 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2604869 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2605226 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2605586 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2605888 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2606304 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2606565 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2607024 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2607100 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2607615 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2608003 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2608443 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2608686 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2613395 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2615384 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2617554 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2618533 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2619962 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2620340 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2620363 00:34:16.342 Removing: 
/var/run/dpdk/spdk_pid2620596 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2625760 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2626452 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2627602 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2627942 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2634502 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2636540 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2637667 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2642871 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2644913 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2646122 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2651027 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2654074 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2655303 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2667793 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2670588 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2671827 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2683811 00:34:16.342 Removing: /var/run/dpdk/spdk_pid2686380 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2687617 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2700088 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2704036 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2705314 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2718506 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2721724 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2722958 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2736946 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2740002 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2741414 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2754953 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2760204 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2761533 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2762765 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2766629 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2772969 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2776154 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2781831 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2785932 
00:34:16.600 Removing: /var/run/dpdk/spdk_pid2793009 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2796429 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2803925 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2806949 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2814436 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2817278 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2825458 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2828271 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2833618 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2834001 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2834420 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2834875 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2835414 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2836269 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2837070 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2837522 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2839369 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2841421 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2843419 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2845031 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2846987 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2849048 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2851107 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2852698 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2853680 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2854376 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2856838 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2859143 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2861359 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2862675 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2864190 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2864795 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2864970 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2865041 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2865341 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2865531 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2866916 00:34:16.600 Removing: 
/var/run/dpdk/spdk_pid2868796 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2870723 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2871648 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2872609 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2873026 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2873055 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2873076 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2874063 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2874745 00:34:16.600 Removing: /var/run/dpdk/spdk_pid2875258 00:34:16.601 Removing: /var/run/dpdk/spdk_pid2877730 00:34:16.601 Removing: /var/run/dpdk/spdk_pid2880122 00:34:16.601 Removing: /var/run/dpdk/spdk_pid2882283 00:34:16.601 Removing: /var/run/dpdk/spdk_pid2883772 00:34:16.601 Removing: /var/run/dpdk/spdk_pid2885554 00:34:16.601 Removing: /var/run/dpdk/spdk_pid2886244 00:34:16.601 Removing: /var/run/dpdk/spdk_pid2886366 00:34:16.601 Removing: /var/run/dpdk/spdk_pid2890170 00:34:16.601 Removing: /var/run/dpdk/spdk_pid2890431 00:34:16.859 Removing: /var/run/dpdk/spdk_pid2890675 00:34:16.859 Removing: /var/run/dpdk/spdk_pid2890830 00:34:16.859 Removing: /var/run/dpdk/spdk_pid2890972 00:34:16.859 Removing: /var/run/dpdk/spdk_pid2891689 00:34:16.859 Removing: /var/run/dpdk/spdk_pid2892616 00:34:16.859 Removing: /var/run/dpdk/spdk_pid2894025 00:34:16.859 Removing: /var/run/dpdk/spdk_pid2895175 00:34:16.859 Clean 00:34:16.859 16:13:22 -- common/autotest_common.sh@1450 -- # return 0 00:34:16.859 16:13:22 -- spdk/autotest.sh@384 -- # timing_exit post_cleanup 00:34:16.859 16:13:22 -- common/autotest_common.sh@729 -- # xtrace_disable 00:34:16.859 16:13:22 -- common/autotest_common.sh@10 -- # set +x 00:34:16.859 16:13:22 -- spdk/autotest.sh@386 -- # timing_exit autotest 00:34:16.859 16:13:22 -- common/autotest_common.sh@729 -- # xtrace_disable 00:34:16.859 16:13:22 -- common/autotest_common.sh@10 -- # set +x 00:34:16.859 16:13:22 -- spdk/autotest.sh@387 -- # chmod a+r 
/var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:34:16.859 16:13:22 -- spdk/autotest.sh@389 -- # [[ -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log ]] 00:34:16.859 16:13:22 -- spdk/autotest.sh@389 -- # rm -f /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/udev.log 00:34:16.859 16:13:22 -- spdk/autotest.sh@391 -- # hash lcov 00:34:16.859 16:13:22 -- spdk/autotest.sh@391 -- # [[ CC_TYPE=gcc == *\c\l\a\n\g* ]] 00:34:16.859 16:13:22 -- spdk/autotest.sh@393 -- # hostname 00:34:16.859 16:13:22 -- spdk/autotest.sh@393 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -c -d /var/jenkins/workspace/crypto-phy-autotest/spdk -t spdk-wfp-03 -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info 00:34:17.117 geninfo: WARNING: invalid characters removed from testname! 00:34:49.193 16:13:50 -- spdk/autotest.sh@394 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_base.info -a /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_test.info -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:34:49.194 16:13:54 -- spdk/autotest.sh@395 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/dpdk/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:34:51.764 16:13:57 -- spdk/autotest.sh@396 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc 
genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '/usr/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:34:55.049 16:14:00 -- spdk/autotest.sh@397 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/examples/vmd/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:34:57.580 16:14:02 -- spdk/autotest.sh@398 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_lspci/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:35:00.866 16:14:05 -- spdk/autotest.sh@399 -- # lcov --rc lcov_branch_coverage=1 --rc lcov_function_coverage=1 --rc genhtml_branch_coverage=1 --rc genhtml_function_coverage=1 --rc genhtml_legend=1 --rc geninfo_all_blocks=1 --no-external -q -r /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info '*/app/spdk_top/*' -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/cov_total.info 00:35:03.399 16:14:08 -- spdk/autotest.sh@400 -- # rm -f cov_base.info cov_test.info OLD_STDOUT OLD_STDERR 00:35:03.399 16:14:08 -- common/autobuild_common.sh@15 -- $ source /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/common.sh 00:35:03.399 16:14:08 -- scripts/common.sh@508 -- $ [[ -e /bin/wpdk_common.sh ]] 00:35:03.399 16:14:08 -- scripts/common.sh@516 -- $ [[ -e /etc/opt/spdk-pkgdep/paths/export.sh ]] 00:35:03.399 16:14:08 -- scripts/common.sh@517 -- $ 
source /etc/opt/spdk-pkgdep/paths/export.sh 00:35:03.399 16:14:08 -- paths/export.sh@2 -- $ PATH=/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:03.399 16:14:08 -- paths/export.sh@3 -- $ PATH=/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:03.399 16:14:08 -- paths/export.sh@4 -- $ PATH=/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:03.399 16:14:08 -- paths/export.sh@5 -- $ export PATH 00:35:03.399 16:14:08 -- paths/export.sh@6 -- $ echo /opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/opt/protoc/21.7/bin:/opt/go/1.21.1/bin:/opt/golangci/1.54.2/bin:/home/sys_sgci/.local/bin:/home/sys_sgci/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/bin:/usr/local/sbin:/var/spdk/dependencies/pip/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/sbin:/bin:/var/lib/snapd/snap/bin 00:35:03.400 16:14:08 -- common/autobuild_common.sh@436 -- $ out=/var/jenkins/workspace/crypto-phy-autotest/spdk/../output 00:35:03.400 16:14:08 -- common/autobuild_common.sh@437 -- $ date +%s 00:35:03.400 
16:14:08 -- common/autobuild_common.sh@437 -- $ mktemp -dt spdk_1718028848.XXXXXX 00:35:03.400 16:14:08 -- common/autobuild_common.sh@437 -- $ SPDK_WORKSPACE=/tmp/spdk_1718028848.Ytq8PH 00:35:03.400 16:14:08 -- common/autobuild_common.sh@439 -- $ [[ -n '' ]] 00:35:03.400 16:14:08 -- common/autobuild_common.sh@443 -- $ '[' -n '' ']' 00:35:03.400 16:14:08 -- common/autobuild_common.sh@446 -- $ scanbuild_exclude='--exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/' 00:35:03.400 16:14:08 -- common/autobuild_common.sh@450 -- $ scanbuild_exclude+=' --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp' 00:35:03.400 16:14:08 -- common/autobuild_common.sh@452 -- $ scanbuild='scan-build -o /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/scan-build-tmp --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/dpdk/ --exclude /var/jenkins/workspace/crypto-phy-autotest/spdk/xnvme --exclude /tmp --status-bugs' 00:35:03.400 16:14:08 -- common/autobuild_common.sh@453 -- $ get_config_params 00:35:03.400 16:14:08 -- common/autotest_common.sh@396 -- $ xtrace_disable 00:35:03.400 16:14:08 -- common/autotest_common.sh@10 -- $ set +x 00:35:03.400 16:14:08 -- common/autobuild_common.sh@453 -- $ config_params='--enable-debug --enable-werror --with-rdma --with-idxd --with-fio=/usr/src/fio --with-iscsi-initiator --disable-unit-tests --with-vbdev-compress --with-dpdk-compressdev --with-crypto --enable-ubsan --enable-coverage --with-ublk' 00:35:03.400 16:14:08 -- common/autobuild_common.sh@455 -- $ start_monitor_resources 00:35:03.400 16:14:08 -- pm/common@17 -- $ local monitor 00:35:03.400 16:14:08 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:35:03.400 16:14:08 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:35:03.400 16:14:08 -- pm/common@19 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:35:03.400 16:14:08 -- pm/common@21 -- $ date +%s 00:35:03.400 16:14:08 -- pm/common@19 -- $ for monitor in 
"${MONITOR_RESOURCES[@]}" 00:35:03.400 16:14:08 -- pm/common@21 -- $ date +%s 00:35:03.400 16:14:08 -- pm/common@25 -- $ sleep 1 00:35:03.400 16:14:08 -- pm/common@21 -- $ date +%s 00:35:03.400 16:14:08 -- pm/common@21 -- $ date +%s 00:35:03.400 16:14:08 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-load -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1718028848 00:35:03.400 16:14:08 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-vmstat -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1718028848 00:35:03.400 16:14:08 -- pm/common@21 -- $ /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-cpu-temp -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1718028848 00:35:03.400 16:14:08 -- pm/common@21 -- $ sudo -E /var/jenkins/workspace/crypto-phy-autotest/spdk/scripts/perf/pm/collect-bmc-pm -d /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power -l -p monitor.autopackage.sh.1718028848 00:35:03.400 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1718028848_collect-vmstat.pm.log 00:35:03.400 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1718028848_collect-cpu-load.pm.log 00:35:03.400 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1718028848_collect-cpu-temp.pm.log 00:35:03.400 Redirecting to /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/monitor.autopackage.sh.1718028848_collect-bmc-pm.bmc.pm.log 00:35:04.338 16:14:09 -- common/autobuild_common.sh@456 -- $ trap stop_monitor_resources EXIT 00:35:04.338 16:14:09 -- spdk/autopackage.sh@10 -- $ MAKEFLAGS=-j96 00:35:04.338 16:14:09 -- spdk/autopackage.sh@11 -- $ cd 
/var/jenkins/workspace/crypto-phy-autotest/spdk 00:35:04.338 16:14:09 -- spdk/autopackage.sh@13 -- $ [[ 0 -eq 1 ]] 00:35:04.338 16:14:09 -- spdk/autopackage.sh@18 -- $ [[ 0 -eq 0 ]] 00:35:04.338 16:14:09 -- spdk/autopackage.sh@19 -- $ timing_finish 00:35:04.338 16:14:09 -- common/autotest_common.sh@735 -- $ flamegraph=/usr/local/FlameGraph/flamegraph.pl 00:35:04.338 16:14:09 -- common/autotest_common.sh@736 -- $ '[' -x /usr/local/FlameGraph/flamegraph.pl ']' 00:35:04.338 16:14:09 -- common/autotest_common.sh@738 -- $ /usr/local/FlameGraph/flamegraph.pl --title 'Build Timing' --nametype Step: --countname seconds /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/timing.txt 00:35:04.338 16:14:09 -- spdk/autopackage.sh@20 -- $ exit 0 00:35:04.338 16:14:09 -- spdk/autopackage.sh@1 -- $ stop_monitor_resources 00:35:04.338 16:14:09 -- pm/common@29 -- $ signal_monitor_resources TERM 00:35:04.338 16:14:09 -- pm/common@40 -- $ local monitor pid pids signal=TERM 00:35:04.338 16:14:09 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:35:04.338 16:14:09 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-load.pid ]] 00:35:04.338 16:14:09 -- pm/common@44 -- $ pid=2907048 00:35:04.338 16:14:09 -- pm/common@50 -- $ kill -TERM 2907048 00:35:04.338 16:14:09 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:35:04.338 16:14:09 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-vmstat.pid ]] 00:35:04.338 16:14:09 -- pm/common@44 -- $ pid=2907050 00:35:04.338 16:14:09 -- pm/common@50 -- $ kill -TERM 2907050 00:35:04.338 16:14:09 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:35:04.338 16:14:09 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-cpu-temp.pid ]] 00:35:04.338 16:14:09 -- pm/common@44 -- $ pid=2907051 00:35:04.338 16:14:09 -- pm/common@50 -- $ kill -TERM 2907051 00:35:04.338 
16:14:09 -- pm/common@42 -- $ for monitor in "${MONITOR_RESOURCES[@]}" 00:35:04.338 16:14:09 -- pm/common@43 -- $ [[ -e /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/collect-bmc-pm.pid ]] 00:35:04.338 16:14:09 -- pm/common@44 -- $ pid=2907076 00:35:04.338 16:14:09 -- pm/common@50 -- $ sudo -E kill -TERM 2907076 00:35:04.338 + [[ -n 2458086 ]] 00:35:04.338 + sudo kill 2458086 00:35:04.607 [Pipeline] } 00:35:04.624 [Pipeline] // stage 00:35:04.629 [Pipeline] } 00:35:04.646 [Pipeline] // timeout 00:35:04.650 [Pipeline] } 00:35:04.667 [Pipeline] // catchError 00:35:04.672 [Pipeline] } 00:35:04.687 [Pipeline] // wrap 00:35:04.693 [Pipeline] } 00:35:04.708 [Pipeline] // catchError 00:35:04.716 [Pipeline] stage 00:35:04.718 [Pipeline] { (Epilogue) 00:35:04.731 [Pipeline] catchError 00:35:04.733 [Pipeline] { 00:35:04.747 [Pipeline] echo 00:35:04.749 Cleanup processes 00:35:04.754 [Pipeline] sh 00:35:05.037 + sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:35:05.037 2907160 /usr/bin/ipmitool sdr dump /var/jenkins/workspace/crypto-phy-autotest/spdk/../output/power/sdr.cache 00:35:05.037 2907444 sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:35:05.053 [Pipeline] sh 00:35:05.336 ++ sudo pgrep -af /var/jenkins/workspace/crypto-phy-autotest/spdk 00:35:05.337 ++ grep -v 'sudo pgrep' 00:35:05.337 ++ awk '{print $1}' 00:35:05.337 + sudo kill -9 2907160 00:35:05.349 [Pipeline] sh 00:35:05.632 + jbp/jenkins/jjb-config/jobs/scripts/compress_artifacts.sh 00:35:20.562 [Pipeline] sh 00:35:20.844 + jbp/jenkins/jjb-config/jobs/scripts/check_artifacts_size.sh 00:35:20.844 Artifacts sizes are good 00:35:20.857 [Pipeline] archiveArtifacts 00:35:20.863 Archiving artifacts 00:35:21.042 [Pipeline] sh 00:35:21.326 + sudo chown -R sys_sgci /var/jenkins/workspace/crypto-phy-autotest 00:35:21.340 [Pipeline] cleanWs 00:35:21.349 [WS-CLEANUP] Deleting project workspace... 00:35:21.349 [WS-CLEANUP] Deferred wipeout is used... 
00:35:21.355 [WS-CLEANUP] done 00:35:21.357 [Pipeline] } 00:35:21.372 [Pipeline] // catchError 00:35:21.382 [Pipeline] sh 00:35:21.662 + logger -p user.info -t JENKINS-CI 00:35:21.670 [Pipeline] } 00:35:21.685 [Pipeline] // stage 00:35:21.690 [Pipeline] } 00:35:21.705 [Pipeline] // node 00:35:21.710 [Pipeline] End of Pipeline 00:35:21.731 Finished: SUCCESS